Corpus ID: 235417087

A Nonmyopic Approach to Cost-Constrained Bayesian Optimization

@inproceedings{Lee2021ANA,
  title={A Nonmyopic Approach to Cost-Constrained Bayesian Optimization},
  author={Eric Hans Lee and David Eriksson and Valerio Perrone and Matthias W. Seeger},
  booktitle={UAI},
  year={2021}
}
Bayesian optimization (BO) is a popular method for optimizing expensive-to-evaluate black-box functions. BO budgets are typically given in iterations, which implicitly assumes each evaluation has the same cost. In fact, in many BO applications, evaluation costs vary significantly across different regions of the search space. In hyperparameter optimization, the time spent on neural network training increases with layer size; in clinical trials, the monetary costs of drug compounds vary; and in…
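To make the varying-cost setting concrete, below is a minimal sketch (not the paper's nonmyopic method) of a common greedy cost-aware baseline: expected improvement normalized by a cost model, run against a total cost budget rather than an iteration budget. The toy objective, the linear cost function, and the candidate grid are illustrative assumptions, not from the paper.

```python
# Minimal sketch (not the paper's nonmyopic method): greedy expected
# improvement per unit cost, spent against a total cost budget rather
# than an iteration budget. Objective, cost model, and grid are toy
# assumptions chosen for illustration.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor

def expected_improvement(mu, sigma, best):
    # Standard EI for minimization under a Gaussian posterior.
    sigma = np.maximum(sigma, 1e-9)
    z = (best - mu) / sigma
    return (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

objective = lambda x: np.sin(3.0 * x) + 0.5 * x   # toy black-box function
cost = lambda x: 1.0 + 4.0 * x                    # evaluation cost grows across the space
grid = np.linspace(0.0, 2.0, 200).reshape(-1, 1)  # candidate points

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 2.0, size=(3, 1))            # initial design
y = objective(X).ravel()
budget = 30.0                                     # total cost budget, not iterations
spent = cost(X).sum()

gp = GaussianProcessRegressor(alpha=1e-6, normalize_y=True)
while spent < budget:
    gp.fit(X, y)
    mu, sigma = gp.predict(grid, return_std=True)
    # Dividing EI by cost steers the search toward cheap, informative points.
    acq = expected_improvement(mu, sigma, y.min()) / cost(grid).ravel()
    x_next = grid[np.argmax(acq)].reshape(1, 1)
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next).ravel())
    spent += cost(x_next).item()

print(f"best value {y.min():.3f} after spending {spent:.2f} of {budget} cost units")
```

This greedy per-unit-cost heuristic is myopic: it only compares the next evaluation's value against its cost. The nonmyopic approach the paper proposes instead plans multiple evaluations ahead, trading off cheap early evaluations against expensive later ones under the remaining budget.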

References

Showing 1-10 of 44 references
Cost-aware Bayesian Optimization
TLDR
Cost Apportioned BO (CArBO) attempts to minimize an objective function at as little cost as possible, and it is shown that, given the same cost budget, CArBO finds significantly better hyperparameter configurations than competing methods.
Multi-Information Source Optimization
TLDR
This work presents a novel algorithm that provides a rigorous mathematical treatment of the uncertainties arising from model discrepancies and noisy observations, and conducts an experimental evaluation that demonstrates that the method consistently outperforms other state-of-the-art techniques.
Efficient Nonmyopic Bayesian Optimization via One-Shot Multi-Step Trees
TLDR
This paper provides the first efficient implementation of general multi-step lookahead Bayesian optimization, formulated as a sequence of nested optimization problems within a multi-step scenario tree, and equivalently optimizes all decision variables in the full tree jointly, in a "one-shot" fashion.
Lookahead Bayesian Optimization with Inequality Constraints
TLDR
This work proposes a lookahead approach that selects the next evaluation in order to maximize the long-term feasible reduction of the objective function.
A Tutorial on Bayesian Optimization
TLDR
This tutorial describes how Bayesian optimization works, including Gaussian process regression and three common acquisition functions: expected improvement, entropy search, and knowledge gradient, and provides a generalization of expected improvement to noisy evaluations beyond the noise-free setting where it is more commonly applied.
Cost-aware Multi-objective Bayesian optimisation
TLDR
A cost-aware multi-objective Bayesian optimisation method with non-uniform evaluation costs over the objective functions is formulated by defining cost-aware constraints over the search space, together with a notion of convergence that incorporates these cost-aware constraints while optimising the objective functions.
Multi-fidelity Bayesian Optimisation with Continuous Approximations
TLDR
This work develops a Bayesian optimisation method, BOCA, that achieves better regret than strategies which ignore the approximations and outperforms several other baselines in synthetic and real experiments.
Bayesian Optimization with Resource Constraints and Production
TLDR
A novel BO problem formulation is defined that models the resources and activities needed to prepare and run experiments, and a planning approach is presented, based on finite-horizon tree search, for scheduling the potentially concurrent experimental activities with the aim of best optimizing the function within a limited time horizon.
Bayesian Optimization with a Finite Budget: An Approximate Dynamic Programming Approach
TLDR
This work considers the problem of optimizing an expensive objective function when a finite budget of total evaluations is prescribed, shows how to approximate the solution of this dynamic programming problem using rollout, and proposes rollout heuristics specifically designed for the Bayesian optimization setting.
Multi-Task Bayesian Optimization
TLDR
This paper proposes an adaptation of a recently developed acquisition function, entropy search, to the cost-sensitive, multi-task setting and demonstrates the utility of this new acquisition function by leveraging a small dataset to explore hyper-parameter settings for a large dataset.