Decision-makers often strive to minimize the cost or maximize the performance of a system that depends on many decision variables. If the cost can be quantified as a function of the decision variables, then computational methods can be used to obtain or approximate the optimal decision. For the complicated cost functions that arise in practice, it may be impossible to verify that a proposed solution is optimal, and one must settle for an approximate solution. Typical examples of such problems include choosing well sites and pumping rates for groundwater pollution remediation, aligning medical images taken at different times, and determining the configuration of a collection of atoms that minimizes the potential energy. This award supports research into methods for solving such optimization problems and into characterizing their inherent difficulty as the number of decision variables increases. These methods will be applicable to a broad range of problems in engineering, science, and industry.

The optimization problems described above are called global optimization problems. It is well known that global optimization is intractable in high dimensions in the worst-case complexity setting. The investigator will determine whether continuous optimization is tractable in an asymptotic or average-case setting by establishing both upper and lower complexity bounds. Upper complexity bounds will be obtained by devising new optimization algorithms and proving their convergence rates. The project will pursue two approaches to algorithm design. One approach is to subdivide the domain into polytopes and to choose new function-evaluation points within each polytope that maximize a criterion based on the size of the polytope and the observed function values at its vertices.
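As a concrete illustration (not the project's actual algorithm), the subdivision idea can be sketched in one dimension, where intervals play the role of polytopes. The scoring criterion used here, interval length minus the smaller endpoint value, is a hypothetical stand-in for the criterion the project will develop:

```python
def subdivide_minimize(f, a, b, budget=50, size_weight=1.0):
    """Minimize f on [a, b] by adaptive interval subdivision.

    Each interval is scored by a criterion combining its size and the
    function values at its endpoints; the highest-scoring interval is
    split at its midpoint.  This is a simplified 1-D analogue of the
    polytope-subdivision approach, with a made-up scoring rule.
    """
    fa, fb = f(a), f(b)
    intervals = [(a, b, fa, fb)]          # (left, right, f(left), f(right))
    best_x, best_f = (a, fa) if fa <= fb else (b, fb)
    evals = 2
    while evals < budget:
        # Score: prefer large intervals whose endpoints have low values.
        def score(iv):
            lo, hi, flo, fhi = iv
            return size_weight * (hi - lo) - min(flo, fhi)
        chosen = max(intervals, key=score)
        intervals.remove(chosen)
        lo, hi, flo, fhi = chosen
        mid = 0.5 * (lo + hi)
        fmid = f(mid)
        evals += 1
        if fmid < best_f:
            best_x, best_f = mid, fmid
        # Replace the chosen interval by its two halves.
        intervals.append((lo, mid, flo, fmid))
        intervals.append((mid, hi, fmid, fhi))
    return best_x, best_f
```

The size term keeps the search global (large unexplored intervals eventually get split), while the value term biases refinement toward regions where low function values have been observed.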
The other approach is to use randomized point-selection schemes that aim to achieve results comparable to the first approach, on average, but without the computational cost of maintaining the polyhedral subdivision. The lower complexity bounds will establish the smallest error achievable by any algorithm that uses a given average number of function evaluations. A key question that this research will attempt to answer is whether the lower complexity bounds grow exponentially with the dimension.
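The simplest instance of a randomized scheme is uniform random search, sketched below. It is a hedged illustration of the trade-off described above, not the project's method: no geometric bookkeeping is kept, so each iteration costs one function evaluation and a comparison:

```python
import random

def random_search_minimize(f, bounds, budget=200, seed=0):
    """Minimize f over a box by uniform random sampling.

    `bounds` is a list of (lo, hi) pairs, one per decision variable.
    This sketch maintains no subdivision data structure; it simply
    tracks the best point seen so far.
    """
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(budget):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f
```

Comparing the average error of schemes like this against the lower complexity bounds is precisely the kind of question the average-case analysis addresses.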
Effective start/end date: 8/15/16 → 7/31/19
- National Science Foundation