A comparison of mixed-variables Bayesian optimization approaches

@article{CuestaRamirez2021ACO,
  title={A comparison of mixed-variables Bayesian optimization approaches},
  author={Jhouben Cuesta-Ramirez and Rodolphe Le Riche and Olivier Roustant and Guillaume Perrin and C{\'e}dric Durantin and Alain Gli{\`e}re},
  journal={Advanced Modeling and Simulation in Engineering Sciences},
  year={2021},
  volume={9},
  pages={1--29}
}
Most real optimization problems are defined over a mixed search space where the variables are both discrete and continuous. In engineering applications, the objective function is typically calculated with a numerically costly black-box simulation. General mixed and costly optimization problems are therefore of great practical interest, yet their resolution remains largely an open scientific question. In this article, costly mixed problems are approached through Gaussian processes…

A General Mathematical Framework for Constrained Mixed-variable Blackbox Optimization Problems with Meta and Categorical Variables

A mathematical framework for modelling constrained mixed-variable optimization problems in a blackbox optimization context is presented. The framework introduces new notation that enables solution strategies and facilitates the solution of such problems.

BaCO: A Fast and Portable Bayesian Compiler Optimization Framework

The Bayesian Compiler Optimization framework (BaCO) is introduced: a general-purpose autotuner for modern compilers targeting CPUs, GPUs, and FPGAs that outperforms current state-of-the-art autotuners in these domains.

Stochastic efficient global optimization with high noise variance and mixed design variables

The stochastic efficient global optimization (SEGO) method is extended, and two additional stopping criteria are proposed for the Monte Carlo integration required to approximate the objective function.

A mixed-categorical correlation kernel for Gaussian process

This paper presents a kernel-based approach that extends continuous exponential kernels to handle mixed-categorical variables and leads to a new GP surrogate that generalizes both the continuous relaxation and the Gower distance based GP models.
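To make the idea concrete, here is a minimal sketch of one common way to build such a mixed kernel: a squared-exponential term over the continuous dimensions combined multiplicatively with a Gower-style indicator term over the categorical ones. This is an illustrative assumption about the general construction, not the specific kernel proposed in the paper.

```python
import math

def mixed_kernel(x1, x2, cont_idx, cat_idx, theta_cont=1.0, theta_cat=1.0):
    """Toy mixed-input GP kernel (illustrative sketch, not the paper's kernel).

    Continuous dimensions use a squared-exponential distance; categorical
    dimensions contribute a 0/1 mismatch count (Gower-style). Both distances
    enter a single exponential, so the result lies in (0, 1].
    """
    cont = sum((x1[i] - x2[i]) ** 2 for i in cont_idx)      # squared Euclidean part
    cat = sum(0.0 if x1[i] == x2[i] else 1.0 for i in cat_idx)  # mismatch count
    return math.exp(-cont / theta_cont - cat / theta_cat)

# Identical points give correlation 1; any continuous or categorical
# difference decreases the correlation.
k_equal = mixed_kernel((0.0, "a"), (0.0, "a"), cont_idx=[0], cat_idx=[1])
k_cont = mixed_kernel((0.0, "a"), (1.0, "a"), cont_idx=[0], cat_idx=[1])
k_both = mixed_kernel((0.0, "a"), (1.0, "b"), cont_idx=[0], cat_idx=[1])
```

The design choice here (a single product kernel over both variable types) is what lets one GP surrogate cover the whole mixed space, rather than fitting a separate model per categorical combination.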

Multiobjective Tree-Structured Parzen Estimator

Numerical results demonstrate that MOTPE approximates the Pareto fronts of a variety of benchmark problems, and of a convolutional neural network design problem, better than existing methods.

References


Multiplier and gradient methods

The main purpose of this paper is to suggest a method for finding the minimum of a function f(x) subject to the constraint g(x) = 0. It consists of replacing f by F = f + λg + (1/2)cg², then computing the appropriate value of the Lagrange multiplier.
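The augmented objective F = f + λg + (1/2)cg² above can be sketched as a simple two-loop scheme: minimize F in x for fixed λ, then update the multiplier. The toy problem, step sizes, and iteration counts below are illustrative assumptions, not taken from the paper.

```python
def augmented_lagrangian(f_grad, g, g_grad, x, lam=0.0, c=10.0,
                         outer_iters=50, inner_iters=200, lr=0.01):
    """Sketch of the multiplier method for min f(x) s.t. g(x) = 0.

    Uses F(x) = f(x) + lam*g(x) + (c/2)*g(x)^2 as the augmented objective.
    """
    for _ in range(outer_iters):
        # Inner loop: approximately minimize F by gradient descent,
        # using grad F = grad f + (lam + c*g(x)) * grad g.
        for _ in range(inner_iters):
            x -= lr * (f_grad(x) + (lam + c * g(x)) * g_grad(x))
        # Multiplier update: lam <- lam + c*g(x).
        lam += c * g(x)
    return x, lam

# Toy problem: min x^2 subject to x - 1 = 0, whose solution is
# x* = 1 with multiplier lam* = -2.
x_star, lam_star = augmented_lagrangian(
    f_grad=lambda x: 2.0 * x,
    g=lambda x: x - 1.0,
    g_grad=lambda x: 1.0,
    x=0.0,
)
```

Note that, unlike a pure quadratic penalty, the multiplier update lets the constraint be satisfied without driving c to infinity.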

Lagrange Multipliers and Optimality

Lagrange multipliers are now being seen as arising from a general rule for the subdifferentiation of a nonsmooth objective function which allows black-and-white constraints to be replaced by penalty expressions.

Numerical Optimization (Springer Series in Operations Research and Financial Engineering)

Numerical Optimization is a graduate text on continuous optimization that treats algorithms and their performance extensively, combining mathematical theory with practical insight.

Mixed Integer Evolution Strategies for Parameter Optimization

MIES can deal with parameter vectors consisting not only of continuous variables but also of nominal discrete and integer variables, and it is shown that, with proper constraint-handling techniques, MIES can also be applied to classical mixed-integer nonlinear programming problems.

Sequential Model-Based Optimization for General Algorithm Configuration

This paper extends the explicit regression models paradigm for the first time to general algorithm configuration problems, allowing many categorical parameters and optimization over sets of instances, and the resulting method yields state-of-the-art performance.

Estimation Distribution Algorithm for mixed continuous-discrete optimization problems

Some disadvantages of the probabilistic models currently used in EDAs are identified, and a more general and efficient model based on decision trees is proposed for continuous optimization problems.

Mathematical Programming: Theory and Algorithms

Topics include the solution of large-scale programming problems, generalized linear programming and decomposition techniques, dynamic programming, and optimization in infinite dimensions with applications.

Model-based methods for continuous and discrete global optimization

Group Kernels for Gaussian Process Metamodels with Categorical Inputs

This paper exploits the group/level hierarchy and provides a parameterization of valid block matrices T, based on a nested Bayesian linear model, giving a flexible parametric family of valid covariance matrices with constant covariances between pairs of blocks.

Revisiting Bayesian optimization in the light of the COCO benchmark

It is found that a small initial budget, a quadratic trend, and high-quality optimization of the acquisition criterion bring consistent progress, and that the best EGO variants are competitive with or improve over state-of-the-art algorithms in dimensions less than or equal to 5 for multimodal functions.