Corpus ID: 227210601

Combinatorial Bayesian Optimization with Random Mapping Functions to Convex Polytope

Jungtaek Kim, Minsu Cho, Seungjin Choi. Conference on Uncertainty in Artificial Intelligence.
Bayesian optimization is a popular method for solving the problem of global optimization of an expensive-to-evaluate black-box function. It relies on a probabilistic surrogate model of the objective function, upon which an acquisition function is built to determine where next to evaluate the objective function. In general, Bayesian optimization with Gaussian process regression operates on a continuous space. When input variables are categorical or discrete, extra care is needed. A common… 
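The loop described in the abstract — fit a probabilistic surrogate, maximize an acquisition function, evaluate the objective, repeat — can be sketched in a few lines. The code below is a minimal illustration (not the paper's method): a NumPy Gaussian-process regressor with an RBF kernel and expected improvement maximized over a 1-D grid. All function names, lengthscales, and constants are illustrative choices.

```python
import numpy as np
from math import erf

def rbf_kernel(A, B, lengthscale=0.2):
    # Squared-exponential kernel between 1-D input arrays.
    d = A[:, None] - B[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior(X, y, Xs, noise=1e-5):
    # Standard GP regression posterior mean/std at test points Xs.
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(X, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.clip(1.0 - np.sum(v * v, axis=0), 1e-12, None)
    return mu, np.sqrt(var)

def expected_improvement(mu, sigma, best):
    # Closed-form EI for minimization: E[max(best - f, 0)].
    z = (best - mu) / sigma
    Phi = np.array([0.5 * (1.0 + erf(zi / np.sqrt(2.0))) for zi in z])
    phi = np.exp(-0.5 * z ** 2) / np.sqrt(2.0 * np.pi)
    return (best - mu) * Phi + sigma * phi

def bayes_opt(f, bounds=(0.0, 1.0), n_init=3, n_iter=10, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(*bounds, size=n_init)
    y = np.array([f(x) for x in X])
    grid = np.linspace(*bounds, 200)
    for _ in range(n_iter):
        mu, sigma = gp_posterior(X, y, grid)
        ei = expected_improvement(mu, sigma, y.min())
        x_next = grid[np.argmax(ei)]       # evaluate where EI is largest
        X = np.append(X, x_next)
        y = np.append(y, f(x_next))
    return X[np.argmin(y)], y.min()

x_best, y_best = bayes_opt(lambda x: (x - 0.3) ** 2)
```

Note that the loop operates entirely on a continuous domain; this is exactly the gap the paper addresses when inputs are categorical or discrete.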


Recent Advances in Bayesian Optimization

This paper attempts to provide a comprehensive and updated survey of recent advances in Bayesian optimization that are mainly based on Gaussian processes and identify challenging open problems.

Mercer Features for Efficient Combinatorial Bayesian Optimization

The key idea behind MerCBO is to provide explicit feature maps for diffusion kernels over discrete objects by exploiting the structure of their combinatorial graph representation.

Combinatorial Bayesian Optimization using the Graph Cartesian Product

Combinatorial Bayesian Optimization consistently outperforms the latest state of the art while maintaining computational and statistical efficiency, and is validated on a wide array of realistic benchmarks, including weighted maximum satisfiability problems and neural architecture search.
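The central trick behind the graph Cartesian product construction is that the product graph's Laplacian is a Kronecker sum, so the diffusion kernel over the joint combinatorial space factorizes into per-variable kernels. A small NumPy sketch of that identity (graph sizes and the β parameter are illustrative, not the paper's settings):

```python
import numpy as np

def diffusion_kernel(L, beta=0.5):
    # Heat/diffusion kernel exp(-beta * L) via eigendecomposition
    # of the (symmetric) graph Laplacian L.
    w, V = np.linalg.eigh(L)
    return V @ np.diag(np.exp(-beta * w)) @ V.T

def complete_graph_laplacian(n):
    # Laplacian of K_n, a natural graph for a categorical variable.
    return n * np.eye(n) - np.ones((n, n))

L1 = complete_graph_laplacian(3)   # categorical variable with 3 values
L2 = complete_graph_laplacian(4)   # categorical variable with 4 values

# Laplacian of the Cartesian product graph is the Kronecker sum
# L1 (+) L2 = L1 (x) I + I (x) L2, so the diffusion kernel factorizes:
K_prod = np.kron(diffusion_kernel(L1), diffusion_kernel(L2))
K_direct = diffusion_kernel(np.kron(L1, np.eye(4)) + np.kron(np.eye(3), L2))
# K_prod and K_direct agree up to numerical error.
```

The factorization is what keeps the kernel computation scalable: each per-variable graph is tiny even when the joint space is exponentially large.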

Combinatorial Black-Box Optimization with Expert Advice

A computationally efficient model-learning algorithm based on multilinear polynomials and exponential weight updates that reduces computation time by up to several orders of magnitude compared to state-of-the-art algorithms in the literature.

Bayesian Optimization of Combinatorial Structures

This article proposes an adaptive, scalable model that identifies useful combinatorial structure even when data is scarce, and pioneers the use of semidefinite programming to achieve efficiency and scalability.

A Tutorial on Bayesian Optimization

This tutorial describes how Bayesian optimization works, including Gaussian process regression and three common acquisition functions: expected improvement, entropy search, and knowledge gradient, and provides a generalization of expected improvement to noisy evaluations beyond the noise-free setting where it is more commonly applied.
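For reference, expected improvement — the first of the acquisition functions named above — has a closed form under a Gaussian posterior (minimization convention; $f^*$ is the best value observed so far, $\mu$ and $\sigma$ the posterior mean and standard deviation):

```latex
\mathrm{EI}(x) = \bigl(f^* - \mu(x)\bigr)\,\Phi(z) + \sigma(x)\,\phi(z),
\qquad z = \frac{f^* - \mu(x)}{\sigma(x)},
```

where $\Phi$ and $\phi$ are the standard normal CDF and PDF. The generalization to noisy evaluations mentioned in the summary replaces $f^*$ with a quantity derived from the posterior rather than the raw observations.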

Bayesian optimization with approximate set kernels

A Bayesian optimization method that builds surrogate functions with a set kernel, exploiting the symmetry of the feasible region defined by a set input, in order to minimize a black-box function that takes a set as a single input.

Efficient Global Optimization of Expensive Black-Box Functions

This paper introduces the reader to a response surface methodology that is especially good at modeling the nonlinear, multimodal functions that often occur in engineering and shows how these approximating functions can be used to construct an efficient global optimization algorithm with a credible stopping rule.

Online Decision-Making in General Combinatorial Spaces

This study gives a general algorithm for low-dimensional online mirror descent (LDOMD), offering a unification and generalization of previous work, and emphasizes the role of the convex polytope arising from the vector representation of the decision space: while Boolean representations lead to 0-1 polytopes, more general vector representations lead to more general polytopes.

Bayesian Optimization over Sets

A Bayesian optimization method is developed with a set kernel that accumulates similarity over set elements to enforce permutation invariance and permit sets of variable size, together with an approximation that is an unbiased estimator of the true set kernel.
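A set kernel of the kind described — permutation-invariant and defined for sets of different cardinalities — can be obtained by averaging a base kernel over all element pairs. The sketch below shows that standard double-sum construction; the paper's exact kernel and its unbiased estimator may differ in details.

```python
def set_kernel(S1, S2, base):
    # Average the base kernel over all pairs of elements from the two sets.
    # Invariant to element order and defined for sets of different sizes.
    return sum(base(a, b) for a in S1 for b in S2) / (len(S1) * len(S2))

# Example with a linear base kernel on scalars:
k = set_kernel([1.0, 2.0], [3.0], base=lambda a, b: a * b)  # (3 + 6) / 2 = 4.5
```

Because the double sum ranges over all pairs, reordering either set leaves the value unchanged, which is the permutation invariance the summary refers to.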

Bayesian Optimization in a Billion Dimensions via Random Embeddings

Empirical results confirm that REMBO can effectively solve problems with billions of dimensions, provided the intrinsic dimensionality is low, and show that REMBO achieves state-of-the-art performance in optimizing the 47 discrete parameters of a popular mixed integer linear programming solver.
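REMBO's core idea — draw a random matrix A in R^{D×d}, search in the low-dimensional z-space, and evaluate the objective at the projection of Az back into the box — can be sketched as follows. Plain random search stands in here for the inner Bayesian-optimization loop, and all names, bounds, and constants are illustrative.

```python
import numpy as np

def rembo_minimize(f, D, d=2, n_iter=200, box=3.0, seed=0):
    # Search a random d-dimensional embedding of a D-dimensional box.
    rng = np.random.default_rng(seed)
    A = rng.normal(size=(D, d))              # random embedding matrix
    best_z, best_val = None, np.inf
    for _ in range(n_iter):
        z = rng.uniform(-box, box, size=d)   # random search stands in for BO
        x = np.clip(A @ z, -1.0, 1.0)        # project back into [-1, 1]^D
        val = f(x)
        if val < best_val:
            best_z, best_val = z, val
    return best_z, best_val

# Objective with low intrinsic dimensionality: only 2 of 100 coordinates matter,
# which is the regime in which a random embedding preserves the optimum.
z_best, v_best = rembo_minimize(lambda x: x[0] ** 2 + x[1] ** 2, D=100)
```

The point of the construction is that the search happens in d dimensions regardless of D, so the cost of the inner optimizer does not grow with the ambient dimensionality.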