Progressive Neural Architecture Search

  • Chenxi Liu, Barret Zoph, Jonathon Shlens, Wei Hua, Li-Jia Li, Li Fei-Fei, Alan Loddon Yuille, Jonathan Huang, Kevin P. Murphy
  • Computer Science
    European Conference on Computer Vision
  • 2018
We propose a new method for learning the structure of convolutional neural networks (CNNs) that is more efficient than recent state-of-the-art methods based on reinforcement learning and evolutionary algorithms. Our approach uses a sequential model-based optimization (SMBO) strategy, in which we search for structures in order of increasing complexity, while simultaneously learning a surrogate model to guide the search through structure space. Direct comparison under the same search space shows…
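The SMBO strategy described here can be illustrated with a toy sketch (not the paper's implementation): structures grow one operation at a time, a surrogate trained on already-evaluated candidates ranks the expansions, and only a small beam is actually "trained". The op names and the `true_accuracy` proxy below are invented for illustration.

```python
OPS = ["conv3", "conv5", "pool"]

def true_accuracy(arch):
    # Hypothetical stand-in for training a candidate and measuring
    # validation accuracy: rewards conv3-heavy structures.
    return sum(1.0 if op == "conv3" else 0.3 for op in arch) / len(arch)

class Surrogate:
    """Toy surrogate model: predicts accuracy from per-op mean scores."""
    def __init__(self):
        self.scores = {op: [] for op in OPS}

    def update(self, arch, acc):
        for op in arch:
            self.scores[op].append(acc)

    def predict(self, arch):
        def op_score(op):
            s = self.scores[op]
            return sum(s) / len(s) if s else 0.5
        return sum(op_score(op) for op in arch) / len(arch)

def progressive_search(max_len=4, beam=3):
    surrogate = Surrogate()
    beam_set = [[op] for op in OPS]          # all length-1 structures
    for arch in beam_set:
        surrogate.update(arch, true_accuracy(arch))
    for _ in range(max_len - 1):
        # Expand by one op; rank cheaply with the surrogate, not by training.
        candidates = [arch + [op] for arch in beam_set for op in OPS]
        candidates.sort(key=surrogate.predict, reverse=True)
        beam_set = candidates[:beam]
        for arch in beam_set:                # evaluate survivors, refine surrogate
            surrogate.update(arch, true_accuracy(arch))
    return max(beam_set, key=true_accuracy)

best = progressive_search()
print(best)  # the surrogate steers the beam toward all-conv3 structures
```

The key efficiency claim is visible even at this scale: only `beam` architectures are evaluated exactly per complexity level, while the surrogate ranks all expansions for free.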

Randomized Search on a Grid of CNN Networks with Simplified Search Space

  • Sajad Ahmad Kawa, M. Arif Wani
  • Computer Science
    2021 8th International Conference on Computing for Sustainable Global Development (INDIACom)
  • 2021
This paper proposes a novel neural architecture search method built on a cell-based search space, where each cell contains multiple CNN operations together with multiple link options between the cell's operation nodes.

A Review of Meta-Reinforcement Learning for Deep Neural Networks Architecture Search

This survey focuses on reviewing and discussing the current progress in automating CNN architecture search.

A Study of the Learning Progress in Neural Architecture Search Techniques

The results indicate that one-shot architecture design is an efficient alternative to architecture search by ENAS, and that the learning curves are completely flat, i.e., there is no observable progress by the controller in terms of the performance of its generated architectures.

Simplified Space Based Neural Architecture Search

A simplified search space of convolutional operations with small kernels is designed, a large model based on it is constructed, and a Long Short-Term Memory network is used to sample child models from the large model via selective activation.

DARTS: Differentiable Architecture Search

The proposed algorithm excels in discovering high-performance convolutional architectures for image classification and recurrent architectures for language modeling, while being orders of magnitude faster than state-of-the-art non-differentiable techniques.
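The continuous relaxation behind DARTS can be sketched in miniature: each candidate op's output is blended with softmax-weighted architecture parameters, the loss gradient flows into those parameters, and the final architecture keeps the op with the largest weight. The scalar "ops" and the regression target below are invented stand-ins, not the paper's search space.

```python
import math

# Candidate operations on a scalar input (toy stand-ins for conv/pool/zero).
OPS = [lambda x: x, lambda x: 2 * x, lambda x: 0.0]

def softmax(a):
    m = max(a)
    e = [math.exp(v - m) for v in a]
    s = sum(e)
    return [v / s for v in e]

def search(target=2.0, x=1.0, lr=0.5, steps=200):
    """Gradient-descend the architecture parameters of a single mixed op."""
    alpha = [0.0, 0.0, 0.0]  # one architecture weight per candidate op
    for _ in range(steps):
        w = softmax(alpha)
        outs = [op(x) for op in OPS]
        y = sum(wi * o for wi, o in zip(w, outs))  # softmax-weighted mixture
        dy = 2 * (y - target)                      # d(squared loss)/dy
        # d(y)/d(alpha_k) through the softmax: w_k * (o_k - y).
        grads = [dy * w[k] * (outs[k] - y) for k in range(len(alpha))]
        alpha = [a - lr * g for a, g in zip(alpha, grads)]
    # Discretize: keep the candidate with the largest architecture weight.
    return max(range(len(alpha)), key=lambda k: alpha[k])

print(search())  # selects the op whose output best matches the target
```

Because the mixture is differentiable in `alpha`, ordinary gradient descent replaces the discrete search over ops, which is what makes DARTS orders of magnitude faster than RL or evolution.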

Improving Neural Architecture Search with Reinforcement Learning

  • Computer Science
  • 2019
This work investigates neural architecture search with reinforcement learning, in which a recurrent network, the controller, learns to sample better convolutional architectures, thereby improving the search strategy over successive iterations.

Multi-Branch Neural Architecture Search for Lightweight Image Super-resolution

A new search method for single-image super-resolution (SISR) that significantly reduces the overall design time by applying a weight-sharing scheme, and that employs a multi-branch structure to enlarge the search space for capturing multi-scale features, resulting in better reconstruction in textured regions.

Architecture Optimization for Multiple Instance Learning Neural Networks

It is concluded that while Neural Architecture Search provides a practical framework to optimize MIL network architectures, search space design is crucial to architecture optimization.

Task-Aware Neural Architecture Search

A novel framework for neural architecture search that uses a dictionary of models trained on base tasks and the similarity between the target task and the dictionary's atoms to generate an adaptive search space from the dictionary's base models.

Efficient Architecture Search by Network Transformation

This paper proposes a new framework toward efficient architecture search by exploring the architecture space based on the current network and reusing its weights, and employs a reinforcement learning agent as the meta-controller, whose action is to grow the network depth or layer width with function-preserving transformations.
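The function-preserving transformations mentioned here can be illustrated with a Net2Net-style widening step on a toy linear network (a sketch, not this paper's code): a hidden unit is duplicated and the outgoing weight of both copies is halved, so the widened network computes exactly the same function and can be trained further without starting from scratch.

```python
def forward(x, w1, w2):
    # Tiny one-hidden-layer linear net: y = sum_i w2[i] * (w1[i] * x).
    hidden = [w * x for w in w1]
    return sum(h * v for h, v in zip(hidden, w2))

def widen(w1, w2, unit):
    """Function-preserving widening: duplicate one hidden unit and halve the
    outgoing weight of both copies, so the wider network computes exactly
    the same function as the original."""
    new_w1 = w1 + [w1[unit]]
    new_w2 = w2[:]
    new_w2[unit] /= 2
    new_w2.append(new_w2[unit])
    return new_w1, new_w2

w1, w2 = [0.5, -1.2], [2.0, 0.7]
x = 3.0
before = forward(x, w1, w2)
w1b, w2b = widen(w1, w2, unit=0)
after = forward(x, w1b, w2b)
print(before, after)  # identical outputs: the transformation preserved the function
```

Reusing the parent's weights this way is what lets the meta-controller grow depth or width without paying for full retraining after every action.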

Reinforcement Learning for Architecture Search by Network Transformation

A novel reinforcement learning framework for automatic architecture design, where the action is to grow the network depth or layer width from the current architecture while preserving its function, saving a large amount of computational cost.

Evolving Deep Neural Networks

Simple And Efficient Architecture Search for Convolutional Neural Networks

Surprisingly, this simple method for automatically finding well-performing CNN architectures, a hill-climbing procedure whose operators apply network morphisms followed by short optimization runs with cosine annealing, yields competitive results.

Neural Architecture Search with Reinforcement Learning

This paper uses a recurrent network to generate the model descriptions of neural networks and trains this RNN with reinforcement learning to maximize the expected accuracy of the generated architectures on a validation set.
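A minimal sketch of this controller-plus-REINFORCE loop, with the recurrent controller simplified to independent per-position logits and validation accuracy replaced by an invented reward function:

```python
import math
import random

random.seed(0)
OPS = ["conv3", "conv5", "pool"]
ARCH_LEN = 3

def reward(arch):
    # Invented stand-in for validation accuracy after training the child net.
    return sum(1.0 if op == "conv3" else 0.2 for op in arch) / len(arch)

def softmax(a):
    m = max(a)
    e = [math.exp(v - m) for v in a]
    return [v / sum(e) for v in e]

# Controller: one categorical distribution per position (an RNN in the paper).
logits = [[0.0] * len(OPS) for _ in range(ARCH_LEN)]
baseline = 0.0

for _ in range(2000):
    arch, probs = [], []
    for pos in range(ARCH_LEN):          # sample an architecture
        p = softmax(logits[pos])
        arch.append(random.choices(range(len(OPS)), weights=p)[0])
        probs.append(p)
    r = reward([OPS[i] for i in arch])
    baseline = 0.9 * baseline + 0.1 * r  # moving-average baseline
    adv = r - baseline
    # REINFORCE: grad of log-prob of a categorical sample is (one_hot - p).
    for pos, i in enumerate(arch):
        for k in range(len(OPS)):
            logits[pos][k] += 0.5 * adv * ((1.0 if k == i else 0.0) - probs[pos][k])

best = [OPS[max(range(len(OPS)), key=lambda k: logits[pos][k])]
        for pos in range(ARCH_LEN)]
print(best)
```

The moving-average baseline reduces the variance of the policy-gradient estimate, the same role it plays in the actual paper's training of the controller.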

Genetic CNN

  • Lingxi Xie, A. Yuille
  • Computer Science
    2017 IEEE International Conference on Computer Vision (ICCV)
  • 2017
The core idea is to propose an encoding method to represent each network structure in a fixed-length binary string to efficiently explore this large search space.
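The fixed-length binary encoding idea can be sketched with a toy genetic algorithm. The fitness proxy below is invented: in Genetic CNN the bits encode inter-node connections and fitness is the decoded network's validation accuracy.

```python
import random

random.seed(1)
L = 10  # fixed-length binary string encoding the network structure

def fitness(bits):
    # Invented proxy: real Genetic CNN decodes the bits into a network and
    # uses its validation accuracy as fitness; here we just count set bits.
    return sum(bits)

def mutate(bits, rate=0.1):
    return [b ^ 1 if random.random() < rate else b for b in bits]

def crossover(a, b):
    cut = random.randrange(1, L)  # single-point crossover
    return a[:cut] + b[cut:]

pop = [[random.randint(0, 1) for _ in range(L)] for _ in range(20)]
for _ in range(30):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]  # elitist selection: keep the fitter half
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(10)]
    pop = parents + children

best = max(pop, key=fitness)
print(best)
```

The fixed-length representation is what makes standard crossover and mutation well-defined, which is the core idea this entry highlights.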

Hierarchical Representations for Efficient Architecture Search

This work efficiently discovers architectures that outperform a large number of manually designed models for image classification, obtaining top-1 error of 3.6% on CIFAR-10 and 20.3% when transferred to ImageNet, which is competitive with the best existing neural architecture search approaches.

Designing Neural Network Architectures using Reinforcement Learning

MetaQNN is introduced, a meta-modeling algorithm based on reinforcement learning that automatically generates high-performing CNN architectures for a given learning task; the generated networks beat existing networks designed with the same layer types and are competitive with state-of-the-art methods that use more complex layer types.
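MetaQNN's layer-by-layer Q-learning can be sketched as follows, with the state reduced to the current depth, a simplified Monte-Carlo-style Q-update, and an invented accuracy proxy in place of actually training each sampled network:

```python
import random

random.seed(0)
LAYERS = ["conv", "pool", "fc"]
DEPTH = 3

def accuracy(arch):
    # Invented proxy for the trained network's accuracy: exactly one
    # layer sequence is "good"; everything else performs poorly.
    return 1.0 if arch == ["conv", "pool", "fc"] else 0.1

# Q[state][action]: the state is the current depth, the action the next layer.
Q = [{a: 0.0 for a in LAYERS} for _ in range(DEPTH)]
eps, lr = 0.3, 0.2

for _ in range(2000):
    arch = []
    for d in range(DEPTH):  # epsilon-greedy layer-by-layer sampling
        if random.random() < eps:
            arch.append(random.choice(LAYERS))
        else:
            arch.append(max(Q[d], key=Q[d].get))
    r = accuracy(arch)
    # Simplified update: the terminal reward is propagated to every
    # (state, action) pair on the sampled trajectory.
    for d, a in enumerate(arch):
        Q[d][a] += lr * (r - Q[d][a])

best = [max(Q[d], key=Q[d].get) for d in range(DEPTH)]
print(best)
```

Epsilon-greedy exploration is what lets the agent escape the mediocre all-"conv" design it greedily settles on at first; the real MetaQNN additionally anneals epsilon and replays past architectures.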

Practical Network Blocks Design with Q-Learning

This work provides a solution to automatically and efficiently design high performance network architectures by focusing on constructing network blocks, which can be stacked to generate the whole network.

Efficient Neural Architecture Search via Parameter Sharing

Efficient Neural Architecture Search is a fast and inexpensive approach for automatic model design that establishes a new state-of-the-art among methods without post-training processing and delivers strong empirical performance using far fewer GPU-hours.