Algorithms for Hyper-Parameter Optimization

@inproceedings{Bergstra2011AlgorithmsFH,
  title={Algorithms for Hyper-Parameter Optimization},
  author={James Bergstra and R{\'e}mi Bardenet and Yoshua Bengio and Bal{\'a}zs K{\'e}gl},
  booktitle={NIPS},
  year={2011}
}
Abstract

Several recent advances to the state of the art in image classification benchmarks have come from better configurations of existing techniques rather than novel approaches to feature learning. Traditionally, hyper-parameter optimization has been the job of humans because they can be very efficient in regimes where only a few trials are possible. Presently, computer clusters and GPU processors make it possible to run more trials and we show that algorithmic approaches can find better results. We…
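Among the algorithmic approaches the paper proposes is the Tree-structured Parzen Estimator (TPE), which the first author's hyperopt library implements. As a rough sketch of what such a sequential search looks like with that library (the objective function and search space below are invented placeholders for illustration, not taken from the paper):

from hyperopt import fmin, tpe, hp, STATUS_OK

def objective(params):
    # Placeholder for a real training run: fit a model with these
    # hyper-parameters and return its validation loss.
    lr = params["lr"]
    return {"loss": (lr - 0.01) ** 2, "status": STATUS_OK}

space = {
    # Log-uniform prior over a learning rate, a typical hyper-parameter.
    "lr": hp.loguniform("lr", -7, 0),
}

best = fmin(
    fn=objective,      # function to minimize
    space=space,       # prior over the configuration space
    algo=tpe.suggest,  # TPE proposal algorithm from this paper
    max_evals=50,      # budget of sequential trials
)
print(best)

Here fmin repeatedly proposes configurations from the prior using TPE, evaluates each one, and returns the best configuration found within the trial budget.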
Highly Influential: this paper has highly influenced 66 other papers.
Highly Cited: this paper has 934 citations.

Citations

Semantic Scholar estimates that this publication has 934 citations based on the available data; 484 citing publications were extracted.

[Chart: Citations per Year, 2012-2018]

