Corpus ID: 14136770

Bayesian Optimization in High Dimensions via Random Embeddings

@inproceedings{Wang2013BayesianOI,
  title={Bayesian Optimization in High Dimensions via Random Embeddings},
  author={Ziyu Wang and Masrour Zoghi and Frank Hutter and David Matheson and Nando de Freitas},
  booktitle={IJCAI},
  year={2013}
}
Bayesian optimization techniques have been successfully applied to robotics, planning, sensor placement, recommendation, advertising, intelligent user interfaces and automatic algorithm configuration. This paper attacks the restriction of Bayesian optimization to problems of moderate dimension by means of a random embedding of the search space. The resulting Random EMbedding Bayesian Optimization (REMBO) algorithm is very simple and applies to domains with both categorical and continuous variables. The experiments demonstrate that REMBO can effectively solve high-dimensional problems, including automatic parameter configuration of a…
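
To make the key method concrete, the sketch below runs Bayesian optimization in a small d-dimensional box and maps each candidate y into the original D-dimensional space through a random Gaussian matrix A, clipping x = Ay to the feasible box. The plain RBF-kernel GP, the UCB acquisition maximized over random candidates, the √d box sizing, and the toy objective are illustrative stand-ins under stated assumptions, not the paper's exact choices.

```python
# A minimal sketch of the random-embedding idea behind REMBO, assuming a
# box-constrained maximization problem. The surrogate and acquisition are
# simple stand-ins for the standard BO machinery used in the paper.
import numpy as np

def rembo(f, D, d, bounds=(-1.0, 1.0), n_iters=50, n_init=5, seed=0):
    """Optimize f: R^D -> R by running BO in a random d-dimensional embedding."""
    rng = np.random.default_rng(seed)
    A = rng.normal(size=(D, d))            # random embedding matrix
    lo, hi = bounds
    y_lo, y_hi = -np.sqrt(d), np.sqrt(d)   # embedded search box (a common sizing heuristic)

    def to_high_dim(y):
        return np.clip(A @ y, lo, hi)      # map x = A y, clipped back into the box

    # Initial design: random points in the embedded box.
    Y = rng.uniform(y_lo, y_hi, size=(n_init, d))
    vals = np.array([f(to_high_dim(y)) for y in Y])

    def gp_posterior(Ytr, vtr, Yte, ell=1.0, noise=1e-6):
        # Plain zero-mean GP regression with an RBF kernel.
        def k(a, b):
            sq = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
            return np.exp(-0.5 * sq / ell**2)
        K = k(Ytr, Ytr) + noise * np.eye(len(Ytr))
        Ks = k(Yte, Ytr)
        mu = Ks @ np.linalg.solve(K, vtr)
        var = 1.0 - np.einsum('ij,ji->i', Ks, np.linalg.solve(K, Ks.T))
        return mu, np.sqrt(np.maximum(var, 1e-12))

    for _ in range(n_iters):
        # Maximize a UCB acquisition over random candidates in the small box.
        cand = rng.uniform(y_lo, y_hi, size=(1000, d))
        mu, sd = gp_posterior(Y, vals, cand)
        y_next = cand[np.argmax(mu + 2.0 * sd)]
        Y = np.vstack([Y, y_next])
        vals = np.append(vals, f(to_high_dim(y_next)))

    best = np.argmax(vals)
    return to_high_dim(Y[best]), vals[best]

# Toy use: a 1000-dimensional function with only 2 effective dimensions.
f = lambda x: -(x[3] - 0.3) ** 2 - (x[42] - 0.6) ** 2
x_best, v_best = rembo(f, D=1000, d=4)
```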

Citations

High-Dimensional Bayesian Optimization via Additive Models with Overlapping Groups
TLDR
This paper significantly generalizes the approach of Kandasamy et al. (2015), in which the high-dimensional function decomposes as a sum of lower-dimensional functions on subsets of the underlying variables, by representing the dependencies via a graph and deducing an efficient message passing algorithm for optimizing the acquisition function.
Bayesian Optimization in a Billion Dimensions via Random Embeddings
TLDR
Empirical results confirm that REMBO can effectively solve problems with billions of dimensions, provided the intrinsic dimensionality is low, and show that REMBO achieves state-of-the-art performance in optimizing the 47 discrete parameters of a popular mixed integer linear programming solver.
High Dimensional Bayesian Optimization via Supervised Dimension Reduction
TLDR
This paper directly introduces a supervised dimension reduction method, Sliced Inverse Regression (SIR), to high dimensional Bayesian optimization, which could effectively learn the intrinsic sub-structure of the objective function during the optimization.
Scaling Bayesian Optimization up to Higher Dimensions: A Review and Comparison of Recent Algorithms
  • Benoît Choffin, N. Ueda
  • Computer Science
    2018 IEEE 28th International Workshop on Machine Learning for Signal Processing (MLSP)
  • 2018
TLDR
This paper experimentally compares three selected high-dimensional Bayesian optimization algorithms to random search on diverse high-dimensional functions and suggests that no algorithm consistently outperforms the others across all types of difficulties encountered.
High Dimensional Bayesian Optimization using Dropout
TLDR
This work proposes a new method for high-dimensional Bayesian optimization that uses a dropout strategy to optimize only a subset of variables at each iteration, derives theoretical bounds for the regret, and shows how these bounds inform the design of the algorithm.
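
As a rough illustration of the dropout idea above, the sketch below optimizes only d randomly chosen coordinates per iteration and fills in the remaining coordinates from the best point seen so far (one of several possible fill-in strategies); plain random search stands in for the inner Bayesian-optimization step, and the function name and toy objective are assumptions for illustration.

```python
# A minimal sketch of dropout-style high-dimensional optimization on [0, 1]^D.
import numpy as np

def dropout_optimize(f, D, d, n_iters=30, n_cand=200, seed=0):
    """Maximize f by optimizing only d random coordinates per iteration."""
    rng = np.random.default_rng(seed)
    x_best = rng.uniform(size=D)
    v_best = f(x_best)
    for _ in range(n_iters):
        active = rng.choice(D, size=d, replace=False)  # coordinates optimized this round
        # Fill-in strategy: left-out coordinates copy the best point so far.
        for c in rng.uniform(size=(n_cand, d)):        # random search stands in for the inner BO step
            x_try = x_best.copy()
            x_try[active] = c
            v = f(x_try)
            if v > v_best:
                x_best, v_best = x_try, v
    return x_best, v_best

# Toy use: only 2 of 100 coordinates matter.
f = lambda x: -(x[0] - 0.3) ** 2 - (x[1] - 0.6) ** 2
x_star, v_star = dropout_optimize(f, D=100, d=5)
```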
Bayesian Optimization with Unknown Constraints
TLDR
This paper studies Bayesian optimization for constrained problems in the general case that noise may be present in the constraint functions, and the objective and constraints may be evaluated independently.
Constrained Bayesian Optimization and Applications
TLDR
This thesis considers Bayesian optimization in the presence of black-box constraints, and presents Predictive Entropy Search with Constraints (PESC), a highly effective and sufficiently flexible method to address all problems in the general class of decoupled problems without any ad hoc modifications.
High-Dimensional Bayesian Optimization with Manifold Gaussian Processes
TLDR
This work proposes a high-dimensional BO method that learns a nonlinear low-dimensional manifold of the input space with a multi-layer neural network embedded in the covariance function of a Gaussian process; the method outperforms recent baselines from the high-dimensional BO literature on a set of benchmark functions in 60 dimensions.
High-Dimensional Bayesian Optimization via Tree-Structured Additive Models
TLDR
This paper considers generalized additive models in which low-dimensional functions with overlapping subsets of variables are composed to model a high-dimensional target function and proposes a hybrid graph learning algorithm based on Gibbs sampling and mutation to facilitate both structure learning and optimization of the acquisition function.
Derivative-Free Optimization via Classification
TLDR
This paper proposes the randomized coordinate shrinking classification algorithm to learn the model, forming the RACOS algorithm for optimization in continuous and discrete domains, and proves that optimization problems with local Lipschitz continuity can be solved in polynomial time by proper configurations of this framework.

References

Showing 1-10 of 39 references
Gaussian Processes for Global Optimization
We introduce a novel Bayesian approach to global optimization using Gaussian processes. We frame the optimization of both noisy and noiseless functions as sequential decision problems, and introduce …
A Bayesian exploration-exploitation approach for optimal online sensing and planning with a visually guided mobile robot
TLDR
A Bayesian optimization method that dynamically trades off exploration and exploitation for optimal sensing with a mobile robot and is applicable to other closely-related domains, including active vision, sequential experimental design, dynamic sensing and calibration with mobile sensors.
A Tutorial on Bayesian Optimization of Expensive Cost Functions, with Application to Active User Modeling and Hierarchical Reinforcement Learning
TLDR
A tutorial on Bayesian optimization, a method of finding the maximum of expensive cost functions using the Bayesian technique of setting a prior over the objective function and combining it with evidence to get a posterior function.
Practical Bayesian Optimization
TLDR
This work examines the Bayesian response-surface approach to global optimization, which maintains a posterior model of the function being optimized by combining a prior over functions with accumulating function evaluations.
Gaussian Process Optimization in the Bandit Setting: No Regret and Experimental Design
TLDR
This work analyzes GP-UCB, an intuitive upper-confidence based algorithm, and bound its cumulative regret in terms of maximal information gain, establishing a novel connection between GP optimization and experimental design and obtaining explicit sublinear regret bounds for many commonly used covariance functions.
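
For reference, the GP-UCB selection rule analyzed in that paper picks the candidate maximizing the posterior mean plus a scaled posterior standard deviation. The sketch below uses the finite-candidate-set schedule for beta_t from Srinivas et al.; the function name and the inputs mu and sigma (assumed to come from a GP posterior) are illustrative assumptions.

```python
import numpy as np

def gp_ucb_choice(mu, sigma, t, n_candidates, delta=0.1):
    """Return the index of the GP-UCB maximizer among n_candidates points.

    mu, sigma: GP posterior mean and standard deviation at the candidates.
    Finite-set schedule: beta_t = 2 log(|D| t^2 pi^2 / (6 delta)).
    """
    beta_t = 2.0 * np.log(n_candidates * t**2 * np.pi**2 / (6.0 * delta))
    return int(np.argmax(mu + np.sqrt(beta_t) * sigma))
```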
Joint Optimization and Variable Selection of High-dimensional Gaussian Processes
TLDR
This paper, modeling the unknown function as a sample from a high-dimensional Gaussian process (GP) distribution, shows that it is possible to perform joint variable selection and GP optimization, and provides strong performance guarantees for the algorithm.
Hybrid Batch Bayesian Optimization
TLDR
This work systematically analyzes Bayesian optimization using Gaussian processes as the posterior estimator and provides a hybrid algorithm that, based on the current state, dynamically switches between a sequential policy and a batch policy with variable batch sizes.
Automated configuration of algorithms for solving hard computational problems
TLDR
This thesis studies the automation of an important part of algorithm design: the configuration of discrete algorithm components and their continuous parameters to construct an algorithm with desirable empirical performance characteristics, and introduces data-driven approaches for making these choices adaptively.
Sequential Model-Based Optimization for General Algorithm Configuration
TLDR
This paper extends the explicit regression models paradigm for the first time to general algorithm configuration problems, allowing many categorical parameters and optimization for sets of instances, and yields state-of-the-art performance.
Practical Bayesian Optimization of Machine Learning Algorithms
TLDR
This work describes new algorithms that take into account the variable cost of learning algorithm experiments and that can leverage the presence of multiple cores for parallel experimentation and shows that these proposed algorithms improve on previous automatic procedures and can reach or surpass human expert-level optimization for many algorithms.