Evolving Search Space for Neural Architecture Search

@inproceedings{Ci2021EvolvingSS,
  title={Evolving Search Space for Neural Architecture Search},
  author={Yuanzheng Ci and Chen Lin and Ming Sun and Boyu Chen and Hongwen Zhang and Wanli Ouyang},
  booktitle={2021 IEEE/CVF International Conference on Computer Vision (ICCV)},
  year={2021},
  pages={6639-6649}
}
Automation of neural architecture design has been a coveted alternative to human experts. Various search methods have been proposed aiming to find the optimal architecture in a given search space. One would expect the search results to improve as the search space grows larger, since a larger space potentially contains more performant candidates. Surprisingly, we observe that enlarging the search space is unbeneficial or even detrimental to existing NAS methods such as DARTS, ProxylessNAS, and SPOS. This…
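As a rough intuition for why a larger space can be harder to search, the sketch below (not from the paper; the layer count and per-layer operation counts are illustrative assumptions) counts the chain-structured candidates a weight-sharing supernet would have to cover as the per-layer operation set is enlarged:

# Toy sketch: exponential growth of a chain-structured NAS search space.
# All numbers here are hypothetical examples, not values from the paper.

def search_space_size(num_ops_per_layer: int, num_layers: int) -> int:
    # Each layer independently picks one of `num_ops_per_layer` candidate ops.
    return num_ops_per_layer ** num_layers

if __name__ == "__main__":
    for num_ops in (4, 8, 16):
        size = search_space_size(num_ops, num_layers=20)
        print(f"{num_ops:>2} ops/layer, 20 layers -> {size:.3e} candidate architectures")

Enlarging the per-layer operation set from 4 to 16 multiplies the number of sub-networks by a factor of 4^20 (roughly 10^12), which gives a rough sense of how much larger the space a single supernet or search procedure must cover becomes.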

Heed the noise in performance evaluations in neural architecture search

TLDR
This work proposes to reduce noise in architecture evaluations by evaluating architectures based on average performance over multiple network training runs with different random seeds and cross-validation, and shows that reducing noise in architecture evaluations enables all considered search algorithms to find better architectures.

FBNetV5: Neural Architecture Search for Multiple Tasks in One Run

TLDR
FBNetV5 is proposed, a NAS framework that can search for neural architectures for a variety of vision tasks with much reduced computational cost and human effort, and it outperformed the previous state-of-the-art in all three tasks.

Searching the Search Space of Vision Transformer

TLDR
The central idea is to gradually evolve different search dimensions guided by their E-T Error computed using a weight-sharing supernet; the work also provides design guidelines for general vision transformers with extensive analysis of the space-searching process, which could promote the understanding of vision transformers.

GLiT: Neural Architecture Search for Global and Local Image Transformer

TLDR
This work introduces the first Neural Architecture Search (NAS) method to find a better transformer architecture for image recognition and introduces a locality module that explicitly models the local correlations in images at lower computational cost.

A Review on Plastic Artificial Neural Networks: Exploring the Intersection between Neural Architecture Search and Continual Learning

TLDR
This study is the first extensive review on the intersection between AutoML and CL, outlining research directions for the different methods that can facilitate full automation and lifelong plasticity in ANNs.

Evaluating Efficient Performance Estimators of Neural Architectures

TLDR
An extensive and organized assessment of OSEs and ZSEs on the NAS benchmarks NAS-Bench-101/201/301 and NDS ResNet/ResNeXt-A, along with suggestions for the future application and development of efficient architecture performance estimators.

A Survey on Evolutionary Construction of Deep Neural Networks

TLDR
An insight is provided into the automated DNN construction process by formulating it as a multi-level, multi-objective, large-scale optimization problem with constraints, where the non-convex, non-differentiable, and black-box nature of the problem makes evolutionary algorithms (EAs) stand out as a promising solver.

Structural Learning in Artificial Neural Networks: A Neural Operator Perspective

TLDR
This review provides a survey of structural learning methods in deep ANNs, including a new neural operator framework grounded in a cellular neuroscience perspective, aimed at motivating research on this challenging topic.

Field-wise Embedding Size Search via Structural Hard Auxiliary Mask Pruning for Click-Through Rate Prediction

TLDR
This paper proposes a novel strategy that searches for the optimal mixed-dimension embedding scheme by structurally pruning a super-net via Hard Auxiliary Mask using a simple and efficient gradient-based method.

Reducing neural architecture search spaces with training-free statistics and computational graph clustering

TLDR
Clustering-Based REDuction (C-BRED) is presented, a new technique to reduce the size of a NAS search space by clustering the computational graphs associated with its architectures and selecting the most promising cluster using proxy statistics correlated with network accuracy.

References

Showing 1-10 of 63 references

Evaluating the Search Phase of Neural Architecture Search

TLDR
This paper finds that on average, the state-of-the-art NAS algorithms perform similarly to the random policy; the widely-used weight sharing strategy degrades the ranking of the NAS candidates to the point of not reflecting their true performance, thus reducing the effectiveness of the search process.

Angle-based Search Space Shrinking for Neural Architecture Search

TLDR
Comprehensive evidence is provided showing that, in a weight-sharing supernet, the proposed angle-based metric is more stable and accurate than accuracy-based and magnitude-based metrics at predicting the capability of child models.

A Survey on Neural Architecture Search

TLDR
This survey provides a formalism which unifies and categorizes the landscape of existing methods along with a detailed analysis that compares and contrasts the different approaches.

EcoNAS: Finding Proxies for Economical Neural Architecture Search

TLDR
A reliable proxy is presented and a hierarchical proxy strategy is formulated that spends more computation on candidate networks that are potentially more accurate, while discarding unpromising ones at an early stage with a fast proxy, leading to an economical evolution-based NAS (EcoNAS) that achieves an impressive 400× search-time reduction.

Efficient Neural Architecture Search via Parameter Sharing

TLDR
Efficient Neural Architecture Search is a fast and inexpensive approach for automatic model design that establishes a new state of the art among all methods without post-training processing and delivers strong empirical performance using far fewer GPU-hours.

Regularized Evolution for Image Classifier Architecture Search

TLDR
This work evolves an image classifier, AmoebaNet-A, that surpasses hand-designed architectures for the first time, and gives evidence that evolution can obtain results faster with the same hardware, especially in the earlier stages of the search.

FP-NAS: Fast Probabilistic Neural Architecture Search

TLDR
This work proposes a sampling method adaptive to the distribution entropy, drawing more samples to encourage exploration at the beginning and reducing the number of samples as learning proceeds to search quickly in the multivariate space; the method is called Fast Probabilistic NAS.

Understanding and Simplifying One-Shot Architecture Search

TLDR
With careful experimental analysis, it is shown that it is possible to efficiently identify promising architectures from a complex search space without either hypernetworks or reinforcement learning controllers.

How Does Supernet Help in Neural Architecture Search?

TLDR
A comprehensive analysis of five search spaces, including NAS-Bench-101, NAS-Bench-201, DARTS-CIFAR10, DARTS-PTB, and ProxylessNAS, finds that a well-trained supernet is not necessarily a good architecture-ranking model and that it is easier to find better architectures from an effectively pruned search space with supernet training.

NSGA-NET: A Multi-Objective Genetic Algorithm for Neural Architecture Search

TLDR
Experimental results suggest that combining the objectives of minimizing both an error metric and computational complexity, as measured by FLOPs, allows NSGA-Net to find competitive neural architectures near the Pareto front of both objectives on two different tasks: object classification and object alignment.
...