Bayesian Optimization Combined with Successive Halving for Neural Network Architecture Optimization

@inproceedings{Wistuba2017BayesianOC,
  title={Bayesian Optimization Combined with Successive Halving for Neural Network Architecture Optimization},
  author={Martin Wistuba},
  booktitle={AutoML@PKDD/ECML},
  year={2017}
}
The choice of hyperparameters and the selection of algorithms are a crucial part of machine learning. Bayesian optimization methods and successive halving have been applied successfully to optimize hyperparameters automatically. We therefore propose to combine both methods by estimating the initial population of incremental evaluation, our variation of successive halving, by means of Bayesian optimization. We apply the proposed methodology to the challenging problem of optimizing neural network…
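
The abstract describes seeding successive halving (the paper's "incremental evaluation") with a population proposed by Bayesian optimization instead of one drawn uniformly at random. The following is a minimal Python sketch of that idea under toy assumptions, not the paper's implementation: validation_loss stands in for actually training a network, and the one-dimensional search space, GP surrogate, expected-improvement acquisition, and budget schedule are all illustrative choices.

    # Sketch only: BO-seeded successive halving on a toy objective.
    import numpy as np
    from scipy.stats import norm
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import Matern

    rng = np.random.default_rng(0)

    def validation_loss(x, budget):
        # Hypothetical stand-in for training a network with hyperparameter x
        # for `budget` epochs and returning its validation loss.
        noise = rng.normal(scale=0.05 / np.sqrt(budget))
        return (x - 0.3) ** 2 + 1.0 / budget + noise

    def expected_improvement(gp, X_cand, y_best):
        # EI for minimization under the GP surrogate's predictive distribution.
        mu, sigma = gp.predict(X_cand, return_std=True)
        sigma = np.maximum(sigma, 1e-9)
        z = (y_best - mu) / sigma
        return (y_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

    def bo_initial_population(n_init, pop_size, low_budget):
        # Bayesian-optimization step: cheaply evaluate a few random configurations,
        # fit a GP surrogate, and pick the initial population by expected improvement.
        X = rng.uniform(0.0, 1.0, size=(n_init, 1))
        y = np.array([validation_loss(x[0], low_budget) for x in X])
        gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
        gp.fit(X, y)
        candidates = rng.uniform(0.0, 1.0, size=(1000, 1))
        ei = expected_improvement(gp, candidates, y.min())
        return candidates[np.argsort(ei)[::-1][:pop_size]]

    def successive_halving(population, min_budget=1, eta=2, rounds=4):
        # Standard successive halving: evaluate every survivor on the current
        # budget, keep the best 1/eta fraction, multiply the budget by eta.
        configs = list(population[:, 0])
        budget = min_budget
        for _ in range(rounds):
            scores = [validation_loss(c, budget) for c in configs]
            keep = max(1, len(configs) // eta)
            configs = [configs[i] for i in np.argsort(scores)[:keep]]
            budget *= eta
            if len(configs) == 1:
                break
        return configs[0]

    if __name__ == "__main__":
        population = bo_initial_population(n_init=8, pop_size=16, low_budget=1)
        print("best hyperparameter found:", successive_halving(population))

The design point the sketch highlights is the same one the abstract makes: the only change relative to plain successive halving is where the starting population comes from, so the surrogate's knowledge guides which candidates receive budget at all, while the halving schedule itself is untouched.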