Corpus ID: 219636245

Few-shot Neural Architecture Search

  • Yiyang Zhao, Linnan Wang, Yuandong Tian, Rodrigo Fonseca, Tian Guo
  • Published 2020
  • Computer Science
  • ArXiv
  • To improve the search efficiency of Neural Architecture Search (NAS), one-shot NAS trains a single super-net to approximate the performance of candidate architectures during search via weight-sharing. While this greatly reduces the computation cost, approximation error makes the performance predicted by a single super-net less accurate than training each candidate architecture from scratch, leading to search inefficiency. In this work, we propose few-shot NAS that explores the…
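The weight-sharing idea the abstract describes can be illustrated with a minimal, dependency-free sketch (not the paper's code, and the op names, scalar "weights", and proxy score below are illustrative assumptions): a super-net holds one shared weight per (layer, op) pair, and every sampled sub-architecture is scored by reusing those shared weights instead of being trained from scratch.

```python
import random

# Illustrative sketch of one-shot weight-sharing NAS (hypothetical toy setup,
# not the authors' implementation).
OPS = ["conv3x3", "conv5x5", "skip"]  # candidate operations per layer
NUM_LAYERS = 3

rng = random.Random(0)

# Shared weight table: one scalar per (layer, op), standing in for the real
# tensors a trained super-net would hold. All sub-architectures reuse it.
shared_weights = {(layer, op): rng.random()
                  for layer in range(NUM_LAYERS) for op in OPS}

def sample_architecture():
    """Uniformly sample one op per layer (a single path through the super-net)."""
    return tuple(rng.choice(OPS) for _ in range(NUM_LAYERS))

def proxy_score(arch):
    """Score an architecture using only the shared weights -- no retraining.
    This is the cheap but approximate performance estimate the abstract
    refers to; its error is what motivates few-shot NAS."""
    return sum(shared_weights[(layer, op)] for layer, op in enumerate(arch))

# Search: sample candidates and rank them by the shared-weight proxy.
candidates = [sample_architecture() for _ in range(8)]
best = max(candidates, key=proxy_score)
print(best, round(proxy_score(best), 3))
```

Few-shot NAS, as the abstract begins to describe, sits between this single shared table and fully independent training by splitting the super-net into several sub-super-nets, trading a little extra compute for a more accurate proxy.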