Corpus ID: 235376868

Bayesian Optimization over Hybrid Spaces

@inproceedings{deshwal2021bayesian,
  title={Bayesian Optimization over Hybrid Spaces},
  author={Aryan Deshwal and Syrine Belakaria and Janardhan Rao Doppa},
  booktitle={International Conference on Machine Learning},
  year={2021}
}
We consider the problem of optimizing hybrid structures (mixture of discrete and continuous input variables) via expensive black-box function evaluations. This problem arises in many real-world applications. For example, in materials design optimization via lab experiments, discrete and continuous variables correspond to the presence/absence of primitive elements and their relative concentrations respectively. The key challenge is to accurately model the complex interactions between discrete… 
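To make the materials-design example concrete, here is a minimal sketch (hypothetical names, not the paper's code) of one hybrid design point: binary presence/absence variables for the primitive elements, paired with continuous concentrations that are normalized over the elements actually present.

```python
import random

def sample_hybrid_point(n_elements=4, seed=0):
    """Draw one random hybrid design: which primitive elements are
    present (discrete/binary) and their relative concentrations
    (continuous, normalized to sum to 1 over the present elements)."""
    rng = random.Random(seed)
    # Discrete part: presence/absence of each primitive element.
    present = [rng.random() < 0.5 for _ in range(n_elements)]
    if not any(present):                      # ensure at least one element
        present[rng.randrange(n_elements)] = True
    # Continuous part: concentrations, zero for absent elements.
    raw = [rng.random() if p else 0.0 for p in present]
    total = sum(raw)
    conc = [r / total for r in raw]
    return present, conc

present, conc = sample_hybrid_point()
```

The surrogate model's job is then to capture interactions between the two parts, e.g. how a concentration's effect depends on which other elements are present.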


Combining Latent Space and Structured Kernels for Bayesian Optimization over Combinatorial Spaces

The key idea is to define a novel structure-coupled kernel that explicitly integrates the structural information from decoded structures with the learned latent space representation for better surrogate modeling.

Bayesian Optimization over Discrete and Mixed Spaces via Probabilistic Reparameterization

Probabilistic reparameterization is complementary to (and benefits) recent work, naturally generalizes to settings with multiple objectives and black-box constraints, and demonstrates state-of-the-art optimization performance on a wide range of real-world applications.

Bayesian Optimization over High-Dimensional Combinatorial Spaces via Dictionary-based Embeddings

This work proposes a novel surrogate modeling approach for Bayesian Optimization (BO) that efficiently handles a large number of binary and categorical parameters, developing a principled approach based on binary wavelets to construct dictionaries for binary spaces and a randomized construction method that generalizes to categorical spaces.

Hybrid Models for Mixed Variables in Bayesian Optimization

A unified hybrid model, combining Monte-Carlo tree search and Gaussian processes, that encompasses and generalizes multiple state-of-the-art mixed BO surrogates, along with a new dynamic model-selection criterion over novel candidate families of covariance kernels, including non-stationary kernels and associated families.

Bayesian optimization for mixed-variable, multi-objective problems

This work presents MixMOBO, the first mixed variable, multi-objective Bayesian optimization framework for such problems, and applies it to the real-world design of an architected material, showing that the optimal design has a normalized strain energy density 10.4 times greater than existing structures.

Adaptive Experimental Design for Optimizing Combinatorial Structures

This work describes the key challenges in solving combinatorial-space problems within the framework of Bayesian optimization (BO) and the progress made over the last five years in addressing them.

ES-ENAS: Blackbox Optimization over Hybrid Spaces via Combinatorial and Continuous Evolution

ES-ENAS is proposed, a simple and modular joint optimization procedure combining the class of sample-efficient smoothed-gradient techniques with combinatorial optimizers in a highly scalable and intuitive way, inspired by the one-shot or supernet paradigm introduced in Efficient Neural Architecture Search.

ES-ENAS: Efficient Evolutionary Optimization for Large Hybrid Search Spaces

ES-ENAS is proposed, a simple and modular joint optimization procedure combining the class of sample-efficient smoothed gradient techniques, commonly known as Evolutionary Strategies (ES), with combinatorial optimizers in a highly scalable and intuitive way, inspired by the one-shot or supernet paradigm introduced in Efficient Neural Architecture Search.

On the role of Model Uncertainties in Bayesian Optimization

This work provides an extensive study of the relationship between BO performance (regret) and uncertainty calibration for popular surrogate models, comparing them across both synthetic and real-world experiments, and confirms that Gaussian Processes are a strong surrogate model that tends to outperform other popular models.

Optimization on Manifolds via Graph Gaussian Processes

This paper integrates manifold learning techniques within a Gaussian process upper confidence bound (GGP-UCB) algorithm to optimize an objective function on a manifold, establishing regret bounds which ensure that when the objective 𝑓 is a sample from a squared exponential or Matérn GP on the manifold ℳ, the output of the GGP-UCB algorithm converges to the desired maximizer.
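The UCB selection rule underlying such algorithms is simple to state; a generic sketch (hypothetical names, and not the paper's graph-based posterior construction) of choosing the next candidate from posterior means and standard deviations:

```python
import math

def ucb_select(candidates, mu, sigma, beta=2.0):
    """Pick the candidate maximizing the upper confidence bound
    mu(x) + sqrt(beta) * sigma(x), trading off mean and uncertainty."""
    scores = [m + math.sqrt(beta) * s for m, s in zip(mu, sigma)]
    return candidates[max(range(len(scores)), key=scores.__getitem__)]

# x1 has the best posterior mean, but x2's higher uncertainty wins.
choice = ucb_select(["x0", "x1", "x2"],
                    mu=[0.0, 1.0, 0.9],
                    sigma=[0.1, 0.1, 0.5])
```

The regret bounds cited above hinge on how the schedule for beta and the posterior are defined on the manifold; this sketch only shows the acquisition step itself.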



Bayesian Optimization of Composite Functions

This work proposes a novel approach that exploits the composite structure of the objective function to substantially improve sampling efficiency and provides a novel stochastic gradient estimator that allows its efficient maximization.

Scalable Combinatorial Bayesian Optimization with Tractable Statistical Models

The PSR approach relies on reformulating the acquisition function optimization (AFO) problem as a submodular relaxation with some unknown parameters, which can be solved efficiently using minimum graph-cut algorithms, and on constructing an optimization problem to estimate those unknown parameters so that the relaxation closely approximates the true objective.

A Tutorial on Bayesian Optimization

This tutorial describes how Bayesian optimization works, including Gaussian process regression and three common acquisition functions (expected improvement, entropy search, and knowledge gradient), and provides a generalization of expected improvement to noisy evaluations, beyond the noise-free setting where it is more commonly applied.
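Of the acquisition functions listed above, expected improvement has a well-known closed form under a Gaussian posterior; a minimal sketch of the standard noise-free formula (not tied to any particular library):

```python
import math

def expected_improvement(mu, sigma, best, xi=0.0):
    """Closed-form EI for maximization: E[max(f - best - xi, 0)]
    when f has posterior N(mu, sigma^2) at the candidate point."""
    if sigma <= 0.0:
        return max(mu - best - xi, 0.0)
    z = (mu - best - xi) / sigma
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))    # standard normal CDF
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    return (mu - best - xi) * cdf + sigma * pdf

# A posterior mean above the incumbent yields much higher EI than one below.
hi = expected_improvement(mu=1.0, sigma=0.5, best=0.0)
lo = expected_improvement(mu=-1.0, sigma=0.5, best=0.0)
```

Note that EI is always non-negative: even a point with a poor mean retains some value through its uncertainty, which is exactly the exploration term `sigma * pdf`.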

Batched Large-scale Bayesian Optimization in High-dimensional Spaces

This paper proposes ensemble Bayesian optimization (EBO) to address three current challenges in BO simultaneously: large-scale observations; high-dimensional input spaces; and selection of batch queries that balance quality and diversity.

Optimizing Discrete Spaces via Expensive Evaluations: A Learning to Search Framework

The main contribution is to introduce and evaluate a new learning-to-search framework for this problem, called L2S-DISCO, which employs search procedures guided by control knowledge at each step to select the next structure and improves the control knowledge as new function evaluations are observed.

Information-Theoretic Multi-Objective Bayesian Optimization with Continuous Approximations

A novel approach, referred to as Information-Theoretic Multi-Objective Bayesian Optimization with Continuous Approximations (iMOCA), to solve the problem of finding designs that trade off return-time and angular distance, using continuous-fidelity simulators for design evaluations.

Bayesian Calibration and Uncertainty Analysis for Computationally Expensive Models Using Optimization and Radial Basis Function Approximation

This work presents a Bayesian approach to model calibration when evaluation of the model is computationally expensive, approximating the logarithm of the posterior density using radial basis functions and using the resulting cheap-to-evaluate surface in MCMC.

Bayesian Optimisation over Multiple Continuous and Categorical Inputs

This work proposes a new approach, Continuous and Categorical Bayesian Optimisation (CoCaBO), which combines the strengths of multi-armed bandits and Bayesian optimisation to select values for both categorical and continuous inputs, and model this mixed-type space using a Gaussian Process kernel.
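The CoCaBO decomposition (a bandit over the categorical part, a separate optimizer over the continuous part) can be illustrated with a toy sketch; here an epsilon-greedy bandit stands in for the multi-armed bandit and random proposals stand in for the continuous BO step, so all names and choices below are illustrative, not the paper's method.

```python
import random

def cocabo_style_loop(f, categories, n_iters=200, eps=0.2, seed=1):
    """Toy version of the bandit + continuous decomposition:
    each iteration, a bandit picks the categorical value, then a
    continuous value is proposed and the black box is evaluated."""
    rng = random.Random(seed)
    counts = {c: 0 for c in categories}
    means = {c: 0.0 for c in categories}
    best = (None, None, float("-inf"))
    for _ in range(n_iters):
        # Bandit step over the categorical input (epsilon-greedy).
        if rng.random() < eps or not all(counts.values()):
            cat = rng.choice(categories)
        else:
            cat = max(categories, key=lambda c: means[c])
        x = rng.random()                  # continuous proposal in [0, 1]
        y = f(cat, x)
        counts[cat] += 1
        means[cat] += (y - means[cat]) / counts[cat]   # running mean reward
        if y > best[2]:
            best = (cat, x, y)
    return best

# Toy objective: category "b" is better, optimum near x = 0.7.
f = lambda cat, x: (1.0 if cat == "b" else 0.0) - (x - 0.7) ** 2
best_cat, best_x, best_y = cocabo_style_loop(f, ["a", "b", "c"])
```

CoCaBO itself replaces the running-mean rewards with a proper bandit algorithm and the random continuous proposals with BO under a mixed-type Gaussian Process kernel, which is what lets information be shared across categories.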

Multi-Fidelity Multi-Objective Bayesian Optimization: An Output Space Entropy Search Approach

Experiments show that MF-OSEMO, with both approximations, significantly improves over the state-of-the-art single-fidelity algorithms for multi-objective optimization.