Few-shot Neural Architecture Search
@article{Zhao2020FewshotNA,
  title={Few-shot Neural Architecture Search},
  author={Yiyang Zhao and Linnan Wang and Yuandong Tian and Rodrigo Fonseca and Tian Guo},
  journal={ArXiv},
  year={2020},
  volume={abs/2006.06863}
}
To improve the search efficiency of Neural Architecture Search (NAS), one-shot NAS trains a single supernet to approximate the performance of candidate architectures during the search via weight-sharing. While this greatly reduces the computation cost, the approximation error makes the supernet's performance predictions less accurate than training each candidate architecture from scratch, which leads to search inefficiency. In this work, we propose few-shot NAS that explores the…
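To make the weight-sharing mechanism concrete, here is a minimal PyTorch sketch; it is not the paper's implementation, and all names (`SuperLayer`, `SuperNet`, the toy operation set, the split-by-first-op partition) are illustrative assumptions. It shows how one supernet evaluates any sampled architecture with shared weights, and how few-shot NAS, as described in the abstract, would split the search space across several smaller sub-supernets.

```python
# Illustrative sketch of weight-sharing NAS and the few-shot split.
# Not the authors' code; all names and the partition scheme are assumptions.
import torch
import torch.nn as nn

CANDIDATE_OPS = {
    "conv3": lambda c: nn.Conv2d(c, c, 3, padding=1),
    "conv5": lambda c: nn.Conv2d(c, c, 5, padding=2),
    "identity": lambda c: nn.Identity(),
}

class SuperLayer(nn.Module):
    """One layer holding every candidate op; ops are shared across architectures."""
    def __init__(self, channels, op_names):
        super().__init__()
        self.ops = nn.ModuleDict(
            {name: CANDIDATE_OPS[name](channels) for name in op_names})

    def forward(self, x, choice):
        # A sampled architecture picks one op per layer; that op's weights
        # were trained jointly with every other architecture using it.
        return self.ops[choice](x)

class SuperNet(nn.Module):
    def __init__(self, channels=16, depth=3, op_names=tuple(CANDIDATE_OPS)):
        super().__init__()
        self.layers = nn.ModuleList(
            SuperLayer(channels, op_names) for _ in range(depth))

    def forward(self, x, arch):
        for layer, choice in zip(self.layers, arch):
            x = layer(x, choice)
        return x

# One-shot NAS: a single supernet approximates every architecture in the space.
supernet = SuperNet()
x = torch.randn(2, 16, 8, 8)
print(supernet(x, arch=("conv3", "identity", "conv5")).shape)

# Few-shot NAS: partition the search space (here, by the first layer's op,
# a hypothetical split) and train one sub-supernet per partition, so each
# sub-supernet covers fewer architectures and approximates them better.
sub_supernets = {first_op: SuperNet() for first_op in CANDIDATE_OPS}
arch = ("conv5", "conv3", "identity")
pred = sub_supernets[arch[0]](x, arch)  # evaluate with the matching sub-supernet
```

Splitting on the first layer's choice is only one possible partition used here for brevity; the point is that each sub-supernet shares weights over a smaller region of the search space, trading extra training cost for lower approximation error.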