Corpus ID: 209515998

Scalable NAS with Factorizable Architectural Parameters

  • Lanfei Wang, Lingxi Xie, T. Zhang, Jun Guo, Q. Tian
  • Published 2019
  • Computer Science, Mathematics
  • arXiv
  • Neural Architecture Search (NAS) is an emerging topic in machine learning and computer vision. The fundamental idea of NAS is to use an automatic mechanism, rather than manual design, to explore powerful network architectures. One key factor in NAS is scaling up the search space, e.g., increasing the number of operators, so that more possibilities are covered; however, existing search algorithms often get lost among a large number of operators. To avoid heavy computation and competition…
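The "factorizable architectural parameters" in the title can be illustrated with a minimal NumPy sketch. This is an assumption about the general technique, not the authors' exact formulation: a large operator set of size m × n is parameterized by only m + n logits, with each composite operator's weight given by the outer product of two smaller softmax distributions. All function and variable names below are illustrative.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D logit vector."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def factorized_operator_weights(alpha, beta):
    """Weight over m * n composite operators from m + n logits:
    operator (i, j) gets weight softmax(alpha)[i] * softmax(beta)[j]."""
    return np.outer(softmax(alpha), softmax(beta))

# Hypothetical example: 4 x 6 = 24 candidate operators, only 10 logits.
alpha = np.random.randn(4)  # logits for the first factor
beta = np.random.randn(6)   # logits for the second factor
w = factorized_operator_weights(alpha, beta)
print(w.shape)              # (4, 6)
print(round(w.sum(), 6))    # 1.0 -- a valid distribution over all 24 operators
```

The point of such a factorization is that the number of architectural parameters grows additively (m + n) while the covered operator set grows multiplicatively (m × n), which is one plausible way to scale up the search space without overwhelming the search algorithm.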
    2 Citations


    Weight-Sharing Neural Architecture Search: A Battle to Shrink the Optimization Gap
    Angle-based Search Space Shrinking for Neural Architecture Search
