• Computer Science, Mathematics
  • Published in NIPS 2012

Practical Bayesian Optimization of Machine Learning Algorithms

@inproceedings{Snoek2012PracticalBO,
  title={Practical Bayesian Optimization of Machine Learning Algorithms},
  author={Jasper Snoek and Hugo Larochelle and Ryan P. Adams},
  booktitle={NIPS},
  year={2012}
}
The use of machine learning algorithms frequently involves careful tuning of learning parameters and model hyperparameters. Unfortunately, this tuning is often a "black art" requiring expert experience, rules of thumb, or sometimes brute-force search. There is therefore great appeal for automatic approaches that can optimize the performance of any given learning algorithm to the problem at hand. In this work, we consider this problem through the framework of Bayesian optimization, in which a…
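To make the abstract's framework concrete, here is a minimal sketch of Bayesian optimization in plain NumPy: a Gaussian-process surrogate fit to the points evaluated so far, plus an expected-improvement acquisition that picks the next point to try. The squared-exponential kernel, its lengthscale, the toy objective, and the grid of candidates are all illustrative assumptions, not the paper's setup (the paper itself argues for more careful choices, such as a Matérn 5/2 kernel and integrating out the GP hyperparameters).

```python
import numpy as np
from math import erf, sqrt, pi

def rbf(a, b, lengthscale=0.3, variance=1.0):
    """Squared-exponential kernel between 1-D input vectors a and b."""
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior(X, y, Xs, noise=1e-6):
    """GP posterior mean and std at test points Xs given observations (X, y)."""
    K = rbf(X, X) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    Ks = rbf(X, Xs)
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.clip(1.0 - np.sum(v ** 2, axis=0), 1e-12, None)  # prior variance = 1
    return mu, np.sqrt(var)

def expected_improvement(mu, sigma, best):
    """EI for minimization: E[max(best - f, 0)] under the GP posterior."""
    z = (best - mu) / sigma
    Phi = 0.5 * (1.0 + np.array([erf(t / sqrt(2)) for t in z]))  # Gaussian CDF
    phi = np.exp(-0.5 * z ** 2) / sqrt(2 * pi)                   # Gaussian PDF
    return (best - mu) * Phi + sigma * phi

def objective(x):
    """Toy stand-in for an expensive validation-loss curve (hypothetical)."""
    return np.sin(3 * x) + 0.5 * (x - 0.7) ** 2

rng = np.random.default_rng(0)
grid = np.linspace(0.0, 2.0, 200)     # candidate hyperparameter values
X = rng.uniform(0.0, 2.0, 3)          # small random initial design
y = objective(X)
for _ in range(10):                   # sequential optimization loop
    mu, sigma = gp_posterior(X, y, grid)
    x_next = grid[np.argmax(expected_improvement(mu, sigma, y.min()))]
    X, y = np.append(X, x_next), np.append(y, objective(x_next))
print("best x:", X[np.argmin(y)], "best value:", y.min())
```

Each iteration trades off exploitation (low posterior mean) against exploration (high posterior uncertainty), which is why Bayesian optimization can tune hyperparameters in far fewer evaluations than grid or random search.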

Citations

Publications citing this paper.
SHOWING 1-10 OF 2,142 CITATIONS

Gradient boosting in automatic machine learning: feature selection and hyperparameter optimization

VIEW 38 EXCERPTS
CITES BACKGROUND & METHODS
HIGHLY INFLUENCED

Efficient Bayesian Optimization for Target Vector Estimation

VIEW 18 EXCERPTS
CITES METHODS & BACKGROUND
HIGHLY INFLUENCED

Adapting Swarm Applications: A Systematic and Quantitative Approach

VIEW 10 EXCERPTS
CITES METHODS & BACKGROUND
HIGHLY INFLUENCED

Ease.ml in Action: Towards Multi-tenant Declarative Learning Services

VIEW 4 EXCERPTS
CITES BACKGROUND
HIGHLY INFLUENCED

Open Loop Hyperparameter Optimization and Determinantal Point Processes

VIEW 10 EXCERPTS
CITES METHODS & RESULTS
HIGHLY INFLUENCED

Thesis proposal: Modeling Diversity in the Machine Learning Pipeline

VIEW 12 EXCERPTS
CITES METHODS & RESULTS
HIGHLY INFLUENCED

Tuning for Tissue Image Segmentation Workflows for Accuracy and Performance

VIEW 12 EXCERPTS
CITES BACKGROUND & METHODS
HIGHLY INFLUENCED

Active Search with Complex Actions and Rewards

VIEW 8 EXCERPTS
CITES BACKGROUND
HIGHLY INFLUENCED


CITATION STATISTICS

  • 265 Highly Influenced Citations

  • Averaged 506 Citations per year from 2017 through 2019

  • 13% Increase in citations per year in 2019 over 2018

References

Publications referenced by this paper.
SHOWING 1-10 OF 16 REFERENCES

Gaussian Process Optimization with Mutual Information

VIEW 5 EXCERPTS
HIGHLY INFLUENTIAL

Online Learning for Latent Dirichlet Allocation

VIEW 10 EXCERPTS
HIGHLY INFLUENTIAL

Learning structural SVMs with latent variables

VIEW 8 EXCERPTS
HIGHLY INFLUENTIAL

James Bergstra, Rémi Bardenet, Yoshua Bengio, and Balázs Kégl. Algorithms for hyper-parameter optimization. In NIPS, 2011.

Eric Brochu, Vlad M. Cora, and Nando de Freitas. A tutorial on Bayesian optimization of expensive cost functions, with application to active user modeling and hierarchical reinforcement learning. 2010.

Andrew Saxe, Pang Wei Koh, Zhenghao Chen, Maneesh Bhand, Bipin Suresh, and Andrew Ng. On random weights and unsupervised feature learning. In NIPS, 2010.

Carl E. Rasmussen and Christopher Williams. Gaussian Processes for Machine Learning. MIT Press, 2006.