Corpus ID: 221534503

Hyperparameter Optimization via Sequential Uniform Designs

@article{Yang2020HyperparameterOV,
  title={Hyperparameter Optimization via Sequential Uniform Designs},
  author={Zebin Yang and Aijun Zhang},
  journal={ArXiv},
  year={2020},
  volume={abs/2009.03586}
}
Hyperparameter tuning or optimization plays a central role in the automated machine learning (AutoML) pipeline. It is a challenging task as the response surfaces of hyperparameters are generally unknown, and the evaluation of each experiment is expensive. In this paper, we reformulate hyperparameter optimization as a kind of computer experiment and propose a novel sequential uniform design (SeqUD) for hyperparameter optimization. It is advantageous as a) it adaptively explores the…
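As a rough illustration of the sequential space-filling idea described in the abstract, the Python sketch below repeatedly evaluates a space-filling batch over the current search box and then shrinks the box around the incumbent. It is a minimal sketch under stated assumptions: a scrambled Sobol sequence (scipy.stats.qmc) stands in for a true uniform design table, the quadratic objective stands in for an expensive cross-validation score, and the batch size and zoom schedule are illustrative, not the paper's exact SeqUD procedure.

import numpy as np
from scipy.stats import qmc

def objective(x):
    # Hypothetical placeholder for an expensive model-evaluation step.
    return float(np.sum((x - 0.3) ** 2))

lower, upper = np.zeros(2), np.ones(2)  # current search box in [0, 1]^2
best_x, best_y = None, np.inf

for stage in range(4):
    # Space-filling batch over the current box (Sobol as a proxy for a
    # uniform design table); 8 points is a power of two, as Sobol prefers.
    sampler = qmc.Sobol(d=2, seed=stage)
    batch = lower + sampler.random(8) * (upper - lower)
    for x in batch:
        y = objective(x)
        if y < best_y:
            best_x, best_y = x, y
    # Shrink the box around the incumbent and refine in the next stage.
    half = (upper - lower) / 4
    lower = np.clip(best_x - half, 0.0, 1.0)
    upper = np.clip(best_x + half, 0.0, 1.0)

print("best point:", best_x, "best value:", best_y)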

References

Showing 1–10 of 55 references
Hyperopt: a Python library for model selection and hyperparameter optimization
An introductory tutorial on the usage of the Hyperopt library, including the description of search spaces, minimization (in serial and parallel), and the analysis of the results collected in the course of minimization (a minimal usage sketch follows).
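For reference, a minimal Hyperopt usage sketch; the quadratic objective and the one-dimensional search space are placeholders for a real model-selection loss, while fmin, tpe.suggest, and hp.uniform are the library's actual entry points.

from hyperopt import fmin, tpe, hp

def objective(x):
    # Placeholder for a real model-selection loss.
    return (x - 0.3) ** 2

best = fmin(
    fn=objective,
    space=hp.uniform("x", -1.0, 1.0),  # search space: x ~ Uniform(-1, 1)
    algo=tpe.suggest,                  # Tree-structured Parzen Estimator
    max_evals=50,
)
print(best)  # a dict such as {'x': 0.29...}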
Collaborative hyperparameter tuning
Proposes a generic method to incorporate knowledge from previous experiments when simultaneously tuning a learning algorithm on new problems; in two experiments it outperforms standard tuning techniques and single-problem surrogate-based optimization.
Multi-Task Bayesian Optimization
This paper proposes an adaptation of a recently developed acquisition function, entropy search, to the cost-sensitive, multi-task setting and demonstrates the utility of this new acquisition function by leveraging a small dataset to explore hyper-parameter settings for a large dataset.
Practical Bayesian Optimization of Machine Learning Algorithms
This work describes new algorithms that account for the variable cost of learning-algorithm experiments and that can leverage multiple cores for parallel experimentation; the proposed algorithms improve on previous automatic procedures and can reach or surpass human expert-level optimization for many algorithms.
Hyperband: A Novel Bandit-Based Approach to Hyperparameter Optimization
Introduces Hyperband, a novel algorithm that casts hyperparameter optimization as a pure-exploration, non-stochastic, infinite-armed bandit problem in which a predefined resource, such as iterations, data samples, or features, is allocated to randomly sampled configurations (a successive-halving sketch follows).
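A minimal successive-halving sketch, the resource-allocation core underlying Hyperband; the doubling budget schedule and the noisy toy loss are illustrative assumptions, not the paper's exact bracketing scheme.

import numpy as np

rng = np.random.default_rng(1)
configs = rng.uniform(0.0, 1.0, size=16)  # 16 randomly sampled configurations

def loss(cfg, budget):
    # Toy proxy: more budget yields a less noisy estimate of |cfg - 0.5|.
    return abs(cfg - 0.5) + rng.normal(0.0, 1.0 / budget)

budget = 1
while len(configs) > 1:
    scores = np.array([loss(c, budget) for c in configs])
    keep = np.argsort(scores)[: len(configs) // 2]
    configs = configs[keep]  # keep the better-scoring half
    budget *= 2              # double the per-configuration budget

print("selected configuration:", configs[0])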
No-regret Bayesian Optimization with Unknown Hyperparameters
This paper presents the first BO algorithm that is provably no-regret and converges to the optimum without knowledge of the hyperparameters, and proposes several practical algorithms that achieve the empirical sample efficiency of BO with online hyperparameter estimation, but retain theoretical convergence guarantees.
Speeding Up Automatic Hyperparameter Optimization of Deep Neural Networks by Extrapolation of Learning Curves
This paper mimics the early termination of bad runs using a probabilistic model that extrapolates the performance from the first part of a learning curve, enabling state-of-the-art hyperparameter optimization methods for DNNs to find DNN settings that yield better performance than those chosen by human experts.
Tunability: Importance of Hyperparameters of Machine Learning Algorithms
Tunability is defined as the amount of performance gain that can be achieved by setting the considered hyperparameter to the best possible value instead of the default value.
Model selection for support vector machines via uniform design
Proposes a nested uniform design (UD) methodology for efficient, robust, and automatic model selection for support vector machines (SVMs); the approach can be treated as a deterministic analog of random search.
Hyperopt-Sklearn: Automatic Hyperparameter Configuration for Scikit-Learn
Uses Hyperopt to define a search space that encompasses many standard components and common patterns of composing them, improving on the best-known scores in this model space for both MNIST and Convex Shapes.