Corpus ID: 211532790

PHS: A Toolbox for Parallel Hyperparameter Search

@article{Habelitz2020PHSAT,
  title={PHS: A Toolbox for Parallel Hyperparameter Search},
  author={P. Habelitz and Janis Keuper},
  journal={ArXiv},
  year={2020},
  volume={abs/2002.11429}
}
We introduce an open-source Python framework named PHS (Parallel Hyperparameter Search) that enables hyperparameter optimization of any arbitrary Python function across numerous compute instances. This is achieved with minimal modifications inside the target function. Possible applications arise in expensive-to-evaluate numerical computations that depend strongly on hyperparameters, such as machine learning. Bayesian optimization is chosen as a sample-efficient method to propose the next query set of…
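To illustrate the workflow sketched in the abstract, the following minimal Python sketch evaluates an arbitrary target function in parallel over proposed hyperparameter sets. It is not the PHS API: the names (target, propose), the use of concurrent.futures, and the random proposals are illustrative assumptions; in PHS the proposal step can instead be driven by Bayesian optimization over the results gathered so far.

import random
from concurrent.futures import ProcessPoolExecutor


def target(lr, batch_size):
    # Stand-in for an expensive-to-evaluate computation (e.g. training a model).
    # Only minimal changes are needed: the function accepts its hyperparameters
    # as arguments and returns a single scalar to be minimized.
    return (lr - 0.01) ** 2 + 1e-4 * (batch_size - 64) ** 2


def propose():
    # Random proposals for illustration; a sample-efficient scheme such as
    # Bayesian optimization would suggest the next queries from past results.
    return {"lr": random.uniform(1e-4, 1e-1), "batch_size": random.randint(16, 256)}


if __name__ == "__main__":
    candidates = [propose() for _ in range(16)]
    lrs = [c["lr"] for c in candidates]
    batch_sizes = [c["batch_size"] for c in candidates]
    # Evaluate all candidates in parallel worker processes ("compute instances").
    with ProcessPoolExecutor(max_workers=4) as pool:
        losses = list(pool.map(target, lrs, batch_sizes))
    best_loss, best_params = min(zip(losses, candidates), key=lambda t: t[0])
    print("best loss:", best_loss, "with parameters:", best_params)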
