
University Digital Conservancy

University Digital Conservancy

Communities & Collections
Browse
About
AboutHow to depositPolicies
Contact

Browse by Subject

  1. Home
  2. Browse by Subject

Browsing by Subject "Discrete Optimization"

    Discrete Optimization Models and Techniques for Problems in Data-Driven Prescriptive Analytics
    (2022-05) Kim, Jongeun
In this thesis, we consider situations in which optimization problems include objective and/or constraint functions whose explicit forms are not directly available. We focus on two complementary settings: when those unknown functions are described by pre-trained tree ensembles, and when we train an explicit functional form from sample data points to substitute for the unknown functions.

We first study optimization problems in which some objective and/or constraint functions are described by pre-trained tree ensembles. This problem is known as tree ensemble optimization. We establish a connection between tree ensemble optimization and multilinear optimization, and we develop new polyhedral results for each problem through the lens of the other. Computational experiments show that our formulations provide tighter relaxations and improve solution times compared both to formulations based on modeling piecewise-linear functions and to existing tree ensemble formulations.

We next study piecewise polyhedral relaxations of multilinear optimization. The developed results can be applied to model optimization problems in which some objective and/or constraint functions are generalized decision trees. A generalized decision tree consists of a decision tree together with a nonlinear model for each leaf; given an input value, the prediction is computed by the nonlinear model associated with the leaf into which the input falls. We provide the first locally ideal formulation for the relaxation problem, and we further improve formulations by introducing linking constraints that relate model components commonly treated independently in the literature. Implementing our results inside an open-source mixed-integer nonlinear programming (MINLP) solver reduces, by a factor of approximately 3, the number of instances that the algorithm cannot solve within an hour under its default settings.
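The generalized decision tree described above — a standard decision tree whose leaves each carry their own, possibly nonlinear, model — can be sketched as follows. This is a minimal illustration of the prediction mechanism only; the class and model names are hypothetical and are not taken from the thesis, which studies such trees through MINLP formulations rather than direct evaluation.

```python
# Sketch of a generalized decision tree: internal nodes route the input by
# threshold tests, and each leaf evaluates its own nonlinear model.

class Leaf:
    def __init__(self, model):
        self.model = model  # callable: feature vector -> prediction

    def predict(self, x):
        return self.model(x)


class Split:
    def __init__(self, feature, threshold, left, right):
        self.feature = feature      # index of the feature being tested
        self.threshold = threshold  # go left if x[feature] <= threshold
        self.left = left
        self.right = right

    def predict(self, x):
        branch = self.left if x[self.feature] <= self.threshold else self.right
        return branch.predict(x)


# A tree on one feature: quadratic model on the left leaf, affine on the right.
tree = Split(
    feature=0, threshold=1.0,
    left=Leaf(lambda x: x[0] ** 2),          # nonlinear leaf model
    right=Leaf(lambda x: 2.0 * x[0] - 1.0),  # affine leaf model
)

print(tree.predict([0.5]))  # routed to the left leaf:  0.25
print(tree.predict([3.0]))  # routed to the right leaf: 5.0
```

The routing logic alone is a plain decision tree; replacing the constant leaf values with leaf models is what makes the resulting optimization problem nonlinear and motivates the piecewise polyhedral relaxations discussed above.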
Finally, we study symbolic regression, a form of regression that learns functional expressions from observational data. An expression tree is typically used to represent a solution function, which is determined by assigning operators and operands to the tree's nodes. The methodology we develop is based on MINLP and seeks global optimality subject to a restriction on the size of the solution expression tree. We develop tighter MINLP formulations and a heuristic that iteratively builds an expression tree by solving a restricted MINLP. In computational experiments aimed at discovering physics formulas from sample data points, our methods outperform the best known approaches from the literature.
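The expression-tree representation used in symbolic regression can be sketched as below: internal nodes hold operators, leaves hold variables or constants, and evaluating the tree recovers the function. This is an illustrative evaluator only, with hypothetical names; in the MINLP approach described above, the assignment of operators and operands to nodes is itself decided by the solver rather than fixed by hand.

```python
# Sketch of an expression tree for symbolic regression: internal nodes carry
# operators, leaves carry variables or numeric constants.
import math
import operator

BINARY_OPS = {'+': operator.add, '-': operator.sub,
              '*': operator.mul, '/': operator.truediv}
UNARY_OPS = {'sqrt': math.sqrt, 'exp': math.exp}


class Node:
    def __init__(self, label, children=()):
        self.label = label        # operator name, variable name, or constant
        self.children = children

    def evaluate(self, env):
        if self.label in BINARY_OPS:
            lhs, rhs = (c.evaluate(env) for c in self.children)
            return BINARY_OPS[self.label](lhs, rhs)
        if self.label in UNARY_OPS:
            return UNARY_OPS[self.label](self.children[0].evaluate(env))
        if isinstance(self.label, str):
            return env[self.label]  # leaf: variable lookup
        return self.label           # leaf: numeric constant


# Encode sqrt(x*x + y*y), e.g. a distance formula a symbolic-regression
# method might recover from sample data points.
expr = Node('sqrt', [Node('+', [Node('*', [Node('x'), Node('x')]),
                                Node('*', [Node('y'), Node('y')])])])

print(expr.evaluate({'x': 3.0, 'y': 4.0}))  # 5.0
```

Restricting the size of the tree, as the thesis does, bounds the number of nodes available for such assignments and thus the complexity of the learned expression.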
