Corpus ID: 201070132

Towards Assessing the Impact of Bayesian Optimization's Own Hyperparameters

@article{Lindauer2019TowardsAT,
  title={Towards Assessing the Impact of Bayesian Optimization's Own Hyperparameters},
  author={M. Lindauer and M. Feurer and Katharina Eggensperger and Andr{\'e} Biedenkapp and F. Hutter},
  journal={ArXiv},
  year={2019},
  volume={abs/1908.06674}
}
Abstract: Bayesian Optimization (BO) is a common approach for hyperparameter optimization (HPO) in automated machine learning. Although it is well accepted that HPO is crucial to obtaining well-performing machine learning models, tuning BO's own hyperparameters is often neglected. In this paper, we empirically study the impact of optimizing BO's own hyperparameters and the transferability of the found settings using a wide range of benchmarks, including artificial functions, HPO and HPO combined with neural…
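To make concrete what "BO's own hyperparameters" means, the following is a minimal, self-contained sketch of a Bayesian Optimization loop on a 1-D toy objective. All names here (`objective`, `rbf`, `gp_posterior`, `expected_improvement`) are illustrative assumptions, not the paper's code; the kernel length scale `ell` and the EI exploration offset `xi` are examples of the kind of BO-internal hyperparameters the paper studies.

```python
import numpy as np
from math import erf, sqrt, pi

_erf = np.vectorize(erf)

def norm_cdf(z):
    return 0.5 * (1.0 + _erf(z / sqrt(2.0)))

def norm_pdf(z):
    return np.exp(-0.5 * z ** 2) / sqrt(2.0 * pi)

def objective(x):
    # Toy black-box function to minimize (stand-in for an HPO benchmark).
    return np.sin(3 * x) + 0.5 * x ** 2

def rbf(a, b, ell=0.3):
    # Squared-exponential kernel; `ell` is itself a BO hyperparameter.
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

def gp_posterior(x_train, y_train, x_query, ell=0.3, noise=1e-6):
    # Gaussian-process posterior mean and standard deviation at x_query.
    K = rbf(x_train, x_train, ell) + noise * np.eye(len(x_train))
    Ks = rbf(x_train, x_query, ell)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.clip(1.0 - np.sum(v ** 2, axis=0), 1e-12, None)
    return mu, np.sqrt(var)

def expected_improvement(mu, sigma, y_best, xi=0.01):
    # `xi` trades off exploration vs. exploitation -- another BO hyperparameter.
    z = (y_best - mu - xi) / sigma
    return (y_best - mu - xi) * norm_cdf(z) + sigma * norm_pdf(z)

rng = np.random.default_rng(0)
x_train = rng.uniform(-2, 2, 3)       # initial design
y_train = objective(x_train)
grid = np.linspace(-2, 2, 201)        # candidate points

for _ in range(10):
    mu, sigma = gp_posterior(x_train, y_train, grid)
    ei = expected_improvement(mu, sigma, y_train.min())
    x_next = grid[np.argmax(ei)]      # query the most promising point
    x_train = np.append(x_train, x_next)
    y_train = np.append(y_train, objective(x_next))

print(f"best x = {x_train[np.argmin(y_train)]:.3f}, f = {y_train.min():.3f}")
```

Changing `ell` or `xi` changes which points the loop queries, which is precisely the kind of sensitivity the paper investigates empirically.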
