Scalable NAS with Factorizable Architectural Parameters
@article{Wang2019ScalableNW,
  title   = {Scalable NAS with Factorizable Architectural Parameters},
  author  = {Lanfei Wang and Lingxi Xie and T. Zhang and Jun Guo and Q. Tian},
  journal = {ArXiv},
  year    = {2019},
  volume  = {abs/1912.13256}
}
Neural Architecture Search (NAS) is an emerging topic in machine learning and computer vision. The fundamental idea of NAS is to use an automatic mechanism, in place of manual design, to explore powerful network architectures. A key factor in NAS is scaling up the search space, e.g., increasing the number of operators, so that more possibilities are covered; however, existing search algorithms often get lost among a large number of operators. To avoid heavy computation and competition among similar operators…
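The abstract points at the paper's central device: covering a large operator space without keeping one architectural parameter per operator. The sketch below is a hypothetical illustration of that idea, not the authors' released code: it factorizes the mixing weights over a K1 × K2 grid of candidate operators into two small softmax-normalized vectors (named `alpha` and `beta` here for illustration), so K1 + K2 parameters stand in for K1 · K2 per-operator weights.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FactorizedMixedOp(nn.Module):
    """Mixed operator over a K1 x K2 grid of candidates whose mixing
    weight for candidate (i, j) is softmax(alpha)[i] * softmax(beta)[j].

    A minimal sketch of factorized architectural parameters; the class
    and attribute names are assumptions, not from the paper."""

    def __init__(self, ops_grid):
        # ops_grid: nested list, K1 rows x K2 columns of nn.Module candidates.
        super().__init__()
        self.ops = nn.ModuleList([op for row in ops_grid for op in row])
        # Two small factors replace K1*K2 flat architectural parameters.
        self.alpha = nn.Parameter(torch.zeros(len(ops_grid)))      # size K1
        self.beta = nn.Parameter(torch.zeros(len(ops_grid[0])))    # size K2

    def forward(self, x):
        # The outer product of two softmax-normalized vectors is itself a
        # probability distribution over all K1*K2 operators (entries sum to 1).
        w = torch.outer(F.softmax(self.alpha, dim=0),
                        F.softmax(self.beta, dim=0)).flatten()
        # Weighted sum of every candidate's output, DARTS-style.
        return sum(wi * op(x) for wi, op in zip(w, self.ops))

# Example: a 2 x 3 grid (normal vs. depthwise conv, kernel sizes 1/3/5).
ops = [[nn.Conv2d(16, 16, k, padding=k // 2) for k in (1, 3, 5)],
       [nn.Conv2d(16, 16, k, padding=k // 2, groups=16) for k in (1, 3, 5)]]
mixed = FactorizedMixedOp(ops)
y = mixed(torch.randn(2, 16, 8, 8))  # alpha/beta train jointly with the weights
```

Compared with a flat DARTS-style softmax over all K1 · K2 operators, the factorized form keeps the number of architectural parameters linear in the factor sizes, which is one plausible reading of how the search space can be scaled up without the search drowning in operators.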