A Framework for Selecting Deep Learning Hyper-parameters


Recent research has shown that deep learning architectures deliver significant improvements over traditional shallow algorithms when mining high-dimensional datasets. However, because the choice of algorithm, hyper-parameter settings, number of hidden layers, and number of nodes per layer must all be combined, identifying an optimal configuration can be a lengthy process. Our work provides a framework for building deep learning architectures via a stepwise approach, together with an evaluation methodology that quickly identifies poorly performing architectural configurations. Using a high-dimensional dataset, we illustrate how different architectures perform and how the configuration of one algorithm can provide input for fine-tuning more complex models.
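The stepwise idea described in the abstract (cheaply screen many configurations, then spend the full training budget only on survivors) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the `evaluate` scoring function, the grid values, and the `keep_fraction` pruning rule are all hypothetical stand-ins for actually training a model under a given budget.

```python
import itertools

def evaluate(config, budget):
    # Hypothetical proxy score: a stand-in for training a model on `config`
    # for `budget` epochs and returning its validation accuracy.
    lr, layers = config
    return (1.0 / (1.0 + abs(lr - 0.01) * 10 + abs(layers - 3))) * budget / (budget + 1)

def stepwise_search(grid, cheap_budget=1, full_budget=10, keep_fraction=0.25):
    """Two-stage search: reject poor configurations with a cheap evaluation,
    then fully evaluate only the surviving fraction."""
    configs = list(itertools.product(*grid.values()))
    # Stage 1: quick, low-budget screening of every configuration.
    ranked = sorted(configs, key=lambda c: evaluate(c, cheap_budget), reverse=True)
    survivors = ranked[:max(1, int(len(ranked) * keep_fraction))]
    # Stage 2: full-budget evaluation of the survivors only.
    return max(survivors, key=lambda c: evaluate(c, full_budget))

grid = {"learning_rate": [0.001, 0.01, 0.1], "hidden_layers": [1, 3, 5]}
best = stepwise_search(grid)
```

With this toy scoring function, the search prunes 9 candidate configurations down to 2 after the cheap stage, so the expensive evaluation runs on only a quarter of the grid; the same two-stage pattern applies when the proxy is a short training run of the real model.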

DOI: 10.1007/978-3-319-20424-6_12


Cite this paper

@inproceedings{ODonoghue2015AFF,
  title     = {A Framework for Selecting Deep Learning Hyper-parameters},
  author    = {Jim O'Donoghue and Mark Roantree},
  booktitle = {BICOD},
  year      = {2015}
}