Corpus ID: 52936435

CHOPT: Automated Hyperparameter Optimization Framework for Cloud-Based Machine Learning Platforms

@article{Kim2018CHOPTA,
  title={CHOPT: Automated Hyperparameter Optimization Framework for Cloud-Based Machine Learning Platforms},
  author={Jingwoong Kim and Minkyu Kim and Heungseok Park and Ernar Kusdavletov and Dongjun Lee and A. Kim and Ji-Hoon Kim and Jung-Woo Ha and Nako Sung},
  journal={ArXiv},
  year={2018},
  volume={abs/1810.03527}
}
Many hyperparameter optimization (HyperOpt) methods assume restricted computing resources and mainly focus on enhancing performance. Here we propose a novel cloud-based HyperOpt (CHOPT) framework which can efficiently utilize shared computing resources while supporting various HyperOpt algorithms. We incorporate convenient web-based user interfaces, visualization, and analysis tools, enabling users to easily control optimization procedures and build up valuable insights with an iterative…