Corpus ID: 52016139

Neural Architecture Search: A Survey

Thomas Elsken, Jan Hendrik Metzen, Frank Hutter
Deep Learning has enabled remarkable progress in recent years on a variety of tasks, such as image recognition, speech recognition, and machine translation. One crucial aspect of this progress is novel neural architectures. Currently employed architectures have mostly been developed manually by human experts, which is a time-consuming and error-prone process. Because of this, there is growing interest in automated neural architecture search methods. We provide an overview of existing work… 


Neural Architecture Search

An overview of existing work in this field of research is provided and neural architecture search methods are categorized according to three dimensions: search space, search strategy, and performance estimation strategy.

Chapter 3 Neural Architecture Search

Convolutional Neural Network Pruning: A Survey

This work surveys the pruning of deep convolutional neural networks; deploying a typical model on a mobile terminal remains challenging because of the massive number of parameters and floating-point operations it contains.

A technical view on neural architecture search

This paper surveys NAS from a technical viewpoint, covering the problem definition, search approaches, progress toward practical applications, and possible future directions, and sketches the NAS landscape to help beginners start their own research on the topic.

SuperNet in Neural Architecture Search: A Taxonomic Survey

This survey provides an overview of supernet-based neural architecture search, categorizing supernet optimization methods as solutions to common challenges in the literature: data-side optimization, alleviating poor rank correlation, and transferable NAS for a range of deployment scenarios.

Neural Architecture Search for Dense Prediction Tasks in Computer Vision

An overview of NAS for dense prediction tasks in computer vision that elaborates on the novel challenges these tasks pose and surveys ways to address them, easing future research and the application of existing methods.

Improving Neural Architecture Search with Reinforcement Learning

  • 2019
This work investigates neural architecture search with reinforcement learning, a method in which a recurrent network, the controller, learns to sample better-performing convolutional architectures, and studies ways to improve the controller's search strategy.

Analysis of Efficient Neural Architecture Search via Parameter Sharing

Various experiments find that architectures do not improve with ENAS controller training; the authors conclude that training the ENAS controller is unnecessary and discuss limitations of the ENAS performance estimation strategy.

A Comprehensive Survey of Neural Architecture Search: Challenges and Solutions

This survey provides a new perspective on NAS, starting with an overview of the characteristics of the earliest NAS algorithms, summarizing the problems in those early algorithms, and then presenting the solutions developed in subsequent research.


Evolving Deep Neural Networks

Simple And Efficient Architecture Search for Convolutional Neural Networks

Surprisingly, this simple method, which automatically searches for well-performing CNN architectures using a hill-climbing procedure whose operators apply network morphisms followed by short optimization runs with cosine annealing, yields competitive results.
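The hill-climbing loop this snippet describes can be sketched with a toy surrogate in place of actual training. The `score` function, the preferred depth and width, and the neighbour operators below are illustrative assumptions, not the paper's implementation; the neighbours mimic network-morphism operators (deepen, widen) that grow the current network.

```python
# Toy hill-climbing NAS sketch: an "architecture" is a list of layer
# widths; `score` is an assumed surrogate for short-run validation
# accuracy, not real training.

def score(arch):
    depth_term = -abs(len(arch) - 3)                   # prefer 3 layers
    width_term = -sum(abs(w - 32) for w in arch) / 32  # prefer width 32
    return depth_term + width_term

def neighbours(arch):
    out = [arch + [arch[-1]]]        # deepen: duplicate the last layer
    for i in range(len(arch)):
        wider = list(arch)
        wider[i] *= 2                # widen layer i
        out.append(wider)
    return out

arch = [8]                           # start from a tiny network
while True:
    cand = max(neighbours(arch), key=score)
    if score(cand) <= score(arch):
        break                        # local optimum reached
    arch = cand                      # hill-climbing step
# arch is now [32, 32, 32], the surrogate's optimum
```

In the real method each candidate inherits its parent's weights through the morphism, so the "short optimization runs" start from a trained network rather than from scratch.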

DARTS: Differentiable Architecture Search

The proposed algorithm excels in discovering high-performance convolutional architectures for image classification and recurrent architectures for language modeling, while being orders of magnitude faster than state-of-the-art non-differentiable techniques.
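As a rough illustration of the continuous relaxation behind this approach, the sketch below mixes candidate operations on a single edge with softmax weights over architecture parameters and descends a toy validation loss by finite differences. The operations and loss are illustrative assumptions; real DARTS backpropagates through the architecture parameters and alternates between weight and architecture updates in a bilevel scheme.

```python
import numpy as np

# Candidate operations on one edge (toy 1-D ops; real DARTS mixes
# convolutions, pooling, skip connections, and a zero op).
OPS = {
    "identity": lambda x: x,
    "double":   lambda x: 2.0 * x,
    "zero":     lambda x: np.zeros_like(x),
}

def softmax(a):
    e = np.exp(a - a.max())
    return e / e.sum()

def mixed_op(x, alpha):
    # Continuous relaxation: weighted sum of every candidate op,
    # weights given by a softmax over architecture parameters alpha.
    w = softmax(alpha)
    return sum(wi * op(x) for wi, op in zip(w, OPS.values()))

rng = np.random.default_rng(0)
alpha = np.zeros(len(OPS))           # architecture parameters
x_val = rng.normal(size=8)
target = 2.0 * x_val                 # "double" is the best op here

def val_loss(a):
    return float(np.mean((mixed_op(x_val, a) - target) ** 2))

# Descend the validation loss w.r.t. alpha (finite differences stand in
# for the gradient through alpha that DARTS actually computes).
for _ in range(200):
    grad = np.zeros_like(alpha)
    for i in range(len(alpha)):
        e = np.zeros_like(alpha)
        e[i] = 1e-4
        grad[i] = (val_loss(alpha + e) - val_loss(alpha - e)) / 2e-4
    alpha -= grad

best_op = list(OPS)[int(np.argmax(alpha))]  # discretize: keep strongest op
```

The final discretization step, keeping only the highest-weighted operation per edge, is what turns the relaxed mixture back into an ordinary architecture.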

Searching for Efficient Multi-Scale Architectures for Dense Image Prediction

This work constructs a recursive search space for dense image prediction, focusing on scene parsing, person-part segmentation, and semantic image segmentation, and demonstrates that even with an efficient random search the discovered architectures can outperform human-invented ones.

Neural Architecture Search with Reinforcement Learning

This paper uses a recurrent network to generate the model descriptions of neural networks and trains this RNN with reinforcement learning to maximize the expected accuracy of the generated architectures on a validation set.
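A minimal REINFORCE sketch of this idea is below, with per-layer logits standing in for the recurrent controller and an assumed proxy in place of actually training each sampled child network; the search space and reward are illustrative, not the paper's.

```python
import numpy as np

CHOICES = [3, 5, 7]          # toy search space: one filter size per layer

def proxy_accuracy(arch):
    # Assumed stand-in for "train the child network, measure validation
    # accuracy"; here filter size 5 is simply defined to be best.
    return sum(1.0 if f == 5 else 0.4 for f in arch) / len(arch)

def softmax(a):
    e = np.exp(a - a.max())
    return e / e.sum()

rng = np.random.default_rng(0)
logits = np.zeros((2, len(CHOICES)))   # per-layer logits ~ the controller
baseline, lr = 0.0, 0.5

for _ in range(300):
    probs = [softmax(l) for l in logits]
    idx = [rng.choice(len(CHOICES), p=p) for p in probs]  # sample an arch
    arch = [CHOICES[i] for i in idx]
    reward = proxy_accuracy(arch)                 # accuracy as reward
    baseline = 0.9 * baseline + 0.1 * reward      # variance-reducing baseline
    adv = reward - baseline
    for layer, p in enumerate(probs):
        onehot = np.eye(len(CHOICES))[idx[layer]]
        logits[layer] += lr * adv * (onehot - p)  # REINFORCE update

probs = [softmax(l) for l in logits]
# after training, the controller should favour filter size 5 in each layer
```

The moving-average baseline plays the same variance-reduction role as the baseline in the original controller training.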

Understanding and Simplifying One-Shot Architecture Search

With careful experimental analysis, it is shown that it is possible to efficiently identify promising architectures from a complex search space without either hypernetworks or reinforcement learning controllers.

Progressive Neural Architecture Search

We propose a new method for learning the structure of convolutional neural networks (CNNs) that is more efficient than recent state-of-the-art methods based on reinforcement learning and evolutionary algorithms.

Efficient Architecture Search by Network Transformation

This paper proposes a new framework toward efficient architecture search by exploring the architecture space based on the current network and reusing its weights, and employs a reinforcement learning agent as the meta-controller, whose action is to grow the network depth or layer width with function-preserving transformations.

Towards Automated Deep Learning: Efficient Joint Neural Architecture and Hyperparameter Search

This work proposes to use a recent combination of Bayesian optimization and Hyperband for efficient joint neural architecture and hyperparameter search.
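The Hyperband half of that combination can be sketched as successive halving over jointly sampled architecture and hyperparameter configurations. The configuration space and scoring function below are illustrative assumptions (real BOHB additionally fits a Bayesian-optimization model over well-performing configurations to guide sampling).

```python
import math
import random

random.seed(0)

def sample_config():
    # Joint sample: an architectural choice plus a hyperparameter.
    return {"depth": random.choice([2, 4, 8]),
            "lr": 10 ** random.uniform(-4, -1)}

def proxy_score(cfg, budget):
    # Assumed surrogate for "validation accuracy after `budget` epochs":
    # deeper is better, lr near 1e-2 is better, noise shrinks with budget.
    base = 0.1 * cfg["depth"] - 0.2 * abs(math.log10(cfg["lr"]) + 2)
    return base + random.gauss(0.0, 0.3 / budget)

# Successive halving: evaluate many configs on a small budget, keep the
# top third, triple the budget, repeat until one survivor remains.
configs = [sample_config() for _ in range(27)]
budget = 1
while len(configs) > 1:
    configs.sort(key=lambda c: proxy_score(c, budget), reverse=True)
    configs = configs[: max(1, len(configs) // 3)]
    budget *= 3
best = configs[0]
```

Treating architecture choices and hyperparameters as one joint configuration space is what makes this a joint search rather than two nested ones.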

DeepArchitect: Automatically Designing and Training Deep Architectures

This paper proposes an extensible and modular framework that allows the human expert to compactly represent complex search spaces over architectures and their hyperparameters, and shows that search in this space achieves near state-of-the-art performance with only a few samples.