• Mathematics, Computer Science
  • Published in ICML 2019

NAS-Bench-101: Towards Reproducible Neural Architecture Search

@article{Ying2019NASBench101TR,
  title={NAS-Bench-101: Towards Reproducible Neural Architecture Search},
  author={Chris Ying and Aaron Klein and Esteban Real and Eric L. Christiansen and Kevin Murphy and Frank Hutter},
  journal={ArXiv},
  year={2019},
  volume={abs/1902.09635}
}
Recent advances in neural architecture search (NAS) demand tremendous computational resources, which makes it difficult to reproduce experiments and imposes a barrier-to-entry to researchers without access to large-scale computation. We aim to ameliorate these problems by introducing NAS-Bench-101, the first public architecture dataset for NAS research. To build NAS-Bench-101, we carefully constructed a compact, yet expressive, search space, exploiting graph isomorphisms to identify 423k unique…
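
The abstract describes encoding each architecture as a small graph-structured cell and collapsing isomorphic graphs before training. Below is a minimal sketch of how such a cell can be encoded and its precomputed metrics looked up, assuming the companion nasbench Python library (github.com/google-research/nasbench); the dataset file name, the concrete adjacency matrix, and the op labels are illustrative assumptions taken from that repository and may differ in practice.

from nasbench import api

# Load the precomputed table of trained models (the TFRecord file name is
# an assumption; see the repository for the file you actually downloaded).
nasbench = api.NASBench('nasbench_only108.tfrecord')

# A cell is a DAG on at most 7 nodes with at most 9 edges: an
# upper-triangular adjacency matrix plus one op label per node,
# with fixed 'input' and 'output' endpoints.
matrix = [[0, 1, 1, 0, 0, 0, 1],
          [0, 0, 0, 1, 0, 0, 0],
          [0, 0, 0, 0, 1, 0, 0],
          [0, 0, 0, 0, 0, 1, 0],
          [0, 0, 0, 0, 0, 1, 0],
          [0, 0, 0, 0, 0, 0, 1],
          [0, 0, 0, 0, 0, 0, 0]]
ops = ['input', 'conv3x3-bn-relu', 'conv1x1-bn-relu', 'maxpool3x3',
       'conv3x3-bn-relu', 'conv3x3-bn-relu', 'output']

spec = api.ModelSpec(matrix=matrix, ops=ops)

# Isomorphic cells hash to the same canonical entry, so only the ~423k
# unique architectures are stored; query() returns the metrics of one
# sampled training run (accuracies, training time, parameter count).
if nasbench.is_valid(spec):
    metrics = nasbench.query(spec)
    print(metrics['validation_accuracy'], metrics['training_time'])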

Citations

Publications citing this paper.
SHOWING 8 OF 37 CITATIONS

Best of Both Worlds: AutoML Codesign of a CNN and its Hardware Accelerator

10 EXCERPTS · CITES METHODS & BACKGROUND · HIGHLY INFLUENCED

NAS-Bench-1Shot1: Benchmarking and Dissecting One-shot Neural Architecture Search

10 EXCERPTS · CITES METHODS, BACKGROUND & RESULTS

NAS-Bench-201: Extending the Scope of Reproducible Neural Architecture Search

6 EXCERPTS · CITES BACKGROUND, RESULTS & METHODS · HIGHLY INFLUENCED

To Share or Not To Share: A Comprehensive Appraisal of Weight-Sharing

6 EXCERPTS · CITES METHODS · HIGHLY INFLUENCED

BANANAS: Bayesian Optimization with Neural Architectures for Neural Architecture Search

10 EXCERPTS · CITES BACKGROUND & METHODS · HIGHLY INFLUENCED

Multi-objective Neural Architecture Search via Predictive Network Performance Optimization

5 EXCERPTS · CITES METHODS · HIGHLY INFLUENCED

Dynamic Distribution Pruning for Efficient Network Architecture Search

3 EXCERPTS · CITES METHODS & BACKGROUND · HIGHLY INFLUENCED

References

Publications referenced by this paper.
SHOWING 5 OF 41 REFERENCES

Learning Transferable Architectures for Scalable Image Recognition

14 EXCERPTS · HIGHLY INFLUENTIAL

Progressive Neural Architecture Search

2 EXCERPTS · HIGHLY INFLUENTIAL

Identity Mappings in Deep Residual Networks

10 EXCERPTS · HIGHLY INFLUENTIAL

Deep Residual Learning for Image Recognition

10 EXCERPTS · HIGHLY INFLUENTIAL

Learning Multiple Layers of Features from Tiny Images

7 EXCERPTS · HIGHLY INFLUENTIAL