Complete search in continuous global optimization and constraint satisfaction

Arnold Neumaier, Acta Numerica, pp. 271–369. Published 1 May 2004.

This survey covers the state of the art of techniques for solving general-purpose constrained global optimization problems and continuous constraint satisfaction problems, with emphasis on complete techniques that provably find all solutions (if there are finitely many). The core of the material is presented in sufficient detail that the survey may serve as a text for teaching constrained global optimization. After giving motivations for and important examples of applications of global…


This thesis proposes a new complete search technique that can find all solutions within a predetermined tolerance for numerical constraint satisfaction problems, and proposes a novel generic scheme for combining multiple inclusion techniques in numerical constraint propagation.

Constraint aggregation for rigorous global optimization

This paper shows that even when the verification of an approximate feasible point fails, the information extracted from the results of the local optimization can still be used in many cases to reduce the search space.

Review of optimization techniques

A basic overview of optimization techniques is provided. The standard form of the general non-linear, constrained optimization problem is presented, and various techniques for solving the resulting…

The cluster problem in constrained global optimization

This article extends previous analyses of the cluster problem in unconstrained global optimization to the constrained setting based on a recently-developed notion of convergence order for convex relaxation-based lower bounding schemes, showing that clustering can occur both on nearly-optimal and nearly-feasible regions in the vicinity of a global minimizer.

Global optimization in reduced space

It is shown that tighter relaxations can lead to a significant reduction in the number of boxes visited, and it is further shown that branch-and-bound algorithms still possess their convergence properties.

Heuristic Global Optimization

Empirical results show that the CCGO algorithm has an edge in runtime while offering competitive solution quality. Practical applications that need a solution as quickly as possible can use CCGO to get one, and optimization solvers that want to capitalize on an early incumbent solution can use CCGO, or its approximate-search feature, within their workflow.

Global optimization of monotonic programs: applications in polynomial and stochastic programming

A generic branch-and-bound algorithm for monotonic optimization problems is presented that exploits the monotonicity properties inherent in the problem and requires the solution of only linear programming subproblems; convergence proofs for the algorithm are provided.
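
The bounding idea that monotonicity buys can be illustrated with a toy sketch (not the paper's LP-based algorithm): if the objective is increasing in every variable, its value at a box's lower corner is a valid lower bound over the whole box. All names and the example problem below are illustrative assumptions.

```python
def minimize_monotone(f, g, lo, hi, tol=1e-3):
    """Minimize f (increasing in each variable) over the box [lo, hi]
    subject to g(x) >= 0, where g is also increasing in each variable."""
    best_val, best_pt = float("inf"), None
    stack = [(list(lo), list(hi))]
    while stack:
        l, u = stack.pop()
        if g(u) < 0:
            continue                      # whole box infeasible
        if f(l) >= best_val - tol:
            continue                      # monotone lower bound: prune
        if g(l) >= 0:
            # l is feasible and, by monotonicity, optimal within this box.
            if f(l) < best_val:
                best_val, best_pt = f(l), list(l)
            continue
        i = max(range(len(l)), key=lambda k: u[k] - l[k])  # widest coordinate
        if u[i] - l[i] < tol:
            # Narrow box: u is feasible (g(u) >= 0 held above), record it.
            if f(u) < best_val:
                best_val, best_pt = f(u), list(u)
            continue
        m = 0.5 * (l[i] + u[i])
        stack.append((l[:i] + [m] + l[i + 1:], u))          # upper half
        stack.append((l, u[:i] + [m] + u[i + 1:]))          # lower half
    return best_val, best_pt
```

For example, minimizing x + y subject to x*y >= 1 on [0, 2]^2 converges to the optimum 2 at (1, 1); only corner evaluations of f and g are needed, never a relaxation.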

Parallel global optimization with deterministic approaches

The objectives of the research conducted in this thesis are to construct methods which aim to reduce the effort needed to solve global optimization problems, and to investigate suitable parallel models for those methods to speed up the computation.

Local search heuristics for discrete structural optimization with expensive black-box evaluations

Two heuristics are proposed that combine local search methods with a sequential optimization method based on approximations of the implicit constraints, for structural optimization problems featuring discrete variables as well as nonlinear implicit constraints that can only be evaluated through time-expensive computations.

Interval Tests and Contractors Based on Optimality Conditions for Bound-Constrained Global Optimization

A new contraction method is introduced that is designed to handle the boundary of the initial box where a minimizer may not be a stationary point and it is shown that it subsumes the classical monotonicity test based on interval arithmetic.
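
The classical monotonicity test that the new contractor subsumes can be sketched roughly as follows: if an interval enclosure of the gradient over a box excludes zero, the objective is strictly monotone there, so the box contains no interior stationary point. The `Interval` class and function names are assumptions for this toy sketch; real implementations use outward rounding.

```python
class Interval:
    """Toy interval type (no outward rounding)."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __sub__(self, other):
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def contains_zero(self):
        return self.lo <= 0.0 <= self.hi

def grad_enclosure(x):
    # Enclosure of f'(x) = 2*x - 4 for the example f(x) = (x - 2)**2.
    return Interval(2 * x.lo, 2 * x.hi) - Interval(4.0, 4.0)

def monotonicity_test(box):
    # If 0 is outside the gradient enclosure, the box holds no interior
    # stationary point and can be discarded, provided it lies in the
    # interior of the search region -- exactly the boundary caveat the
    # paper's contraction method is designed to handle.
    return not grad_enclosure(box).contains_zero()
```

On the box [3, 5] the enclosure of f' is [2, 6], so the box is discarded; on [1, 3] the enclosure [-2, 2] contains zero and the box must be kept.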

Stochastic Global Optimization: Problem Classes and Solution Techniques

A classification of essentially unconstrained global optimization problems into unimodal, easy, moderately difficult, and difficult problems is proposed to remedy the lack of a representative set of test problems for comparing global optimization methods.

Developments in Global Optimization

This paper presents an algorithm for Improving the Bounding Procedure in Solving Process Network Synthesis by a B&B Method and a method for Minimizing Functions with Lipschitz Derivatives, both of which are described in more detail in the preface.

Numerical Constraint Satisfaction Problems with Non-isolated Solutions

A technique is proposed which combines the extreme vertex representation of orthogonal polyhedra [1–3], as defined in computational geometry, with adapted splitting strategies [4] to construct the approximations as unions of interval boxes, which allows for compacting the explicit representation of the complete solution set and improves efficiency.
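
A minimal branch-and-prune sketch of approximating a non-isolated solution set by a union of interval boxes (the paper's extreme-vertex representation is not implemented here; the names and the example set, the unit disk, are illustrative):

```python
def sq_range(a, b):
    """Range of t*t for t in [a, b] (assumes a <= b)."""
    lo = 0.0 if a <= 0.0 <= b else min(a * a, b * b)
    return lo, max(a * a, b * b)

def pave(lo, hi, eps=0.25):
    """Cover {(x, y): x**2 + y**2 <= 1} inside the box [lo, hi] by boxes."""
    inner, boundary = [], []
    stack = [(lo, hi)]
    while stack:
        box = stack.pop()
        (x0, y0), (x1, y1) = box
        sx_lo, sx_hi = sq_range(x0, x1)
        sy_lo, sy_hi = sq_range(y0, y1)
        if sx_lo + sy_lo > 1.0:
            continue                      # box provably outside: discard
        if sx_hi + sy_hi <= 1.0:
            inner.append(box)             # box provably inside the set
            continue
        if max(x1 - x0, y1 - y0) <= eps:
            boundary.append(box)          # undecided but small enough
            continue
        if x1 - x0 >= y1 - y0:            # bisect the widest edge
            m = 0.5 * (x0 + x1)
            stack += [((x0, y0), (m, y1)), ((m, y0), (x1, y1))]
        else:
            m = 0.5 * (y0 + y1)
            stack += [((x0, y0), (x1, m)), ((x0, m), (x1, y1))]
    return inner, boundary
```

The inner boxes form a rigorous inner approximation and inner plus boundary boxes a rigorous outer approximation of the solution set; shrinking `eps` squeezes the two together.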

Comparative Assessment of Algorithms and Software for Global Optimization

This paper reviews several prominent test collections, discusses comparison issues, and presents illustrative numerical results, in the hope of inspiring comparative studies based on the ideas presented here.

A Review of Techniques in the Verified Solution of Constrained Global Optimization Problems

Elements and techniques of state-of-the-art automatically verified constrained global optimization algorithms are reviewed, including a description of ways of rigorously verifying feasibility for…

A branch-and-reduce approach to global optimization

Valid inequalities and range contraction techniques that can be used to reduce the size of the search space of global optimization problems are presented and incorporated within the branch-and-bound framework, resulting in a branch-and-reduce global optimization algorithm.
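
One simple family of range-contraction techniques, feasibility-based bounds tightening for a single linear constraint, can be sketched as follows (an illustrative sketch, not the paper's specific method; names are assumptions):

```python
def tighten_bounds(a, b, lo, hi):
    """Shrink the box [lo, hi] using the constraint sum(a[i]*x[i]) <= b."""
    lo, hi = list(lo), list(hi)
    n = len(a)
    for i, ai in enumerate(a):
        if ai == 0:
            continue
        # Smallest possible contribution of the remaining variables.
        rest = sum(min(a[j] * lo[j], a[j] * hi[j]) for j in range(n) if j != i)
        if ai > 0:
            hi[i] = min(hi[i], (b - rest) / ai)   # x_i <= (b - rest) / a_i
        else:
            lo[i] = max(lo[i], (b - rest) / ai)   # inequality flips for a_i < 0
    return lo, hi
```

For x + y <= 1 on [0, 2] x [0, 2], this contracts both upper bounds from 2 to 1 without excluding any feasible point; branch-and-reduce methods interleave such contractions with branching.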

A Finite Algorithm for Global Minimization of Separable Concave Programs

A new algorithm is proposed that finds the exact global minimum of a separable concave program in a finite number of iterations; the guarantee of finiteness extends to all branch-and-bound algorithms for concave programming that (1) partition exhaustively using rectangular subdivisions and (2) branch on the incumbent solution when possible.

Global optimization using special ordered sets

Two methods are described for finding a global minimum of a function of a scalar variable on a finite interval, assuming that one can calculate function values and first derivatives, as well as bounds on the second derivative within any subinterval.
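
A hedged sketch of this kind of method (not one of the paper's two methods): with function values, first derivatives, and a bound M on |f''|, a Taylor expansion at the midpoint of a subinterval yields a concave quadratic minorant whose minimum over the subinterval lies at an endpoint, giving a rigorous lower bound for branch-and-bound. Names and tolerances are illustrative.

```python
def global_min_1d(f, df, M, a, b, tol=1e-6):
    """Branch-and-bound global minimization on [a, b] using f, f',
    and a bound M >= |f''| over the whole interval."""
    best = min(f(a), f(b))
    stack = [(a, b)]
    while stack:
        l, u = stack.pop()
        c = 0.5 * (l + u)
        best = min(best, f(c))            # sample point: upper bound
        # Concave minorant q(x) <= f(x), since f'' >= -M everywhere:
        q = lambda x: f(c) + df(c) * (x - c) - 0.5 * M * (x - c) ** 2
        lb = min(q(l), q(u))              # concave: minimum at an endpoint
        if lb >= best - tol or u - l < tol:
            continue                      # prune or stop refining
        stack += [(l, c), (c, u)]
    return best
```

On the double-well f(x) = (x^2 - 1)^2 over [-2, 2] (where |f''| = |12x^2 - 4| <= 44), the routine homes in on the global minimum value 0 attained at x = ±1.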