Corpus ID: 52071151

Neural Architecture Optimization

@inproceedings{Luo2018NeuralAO,
  title={Neural Architecture Optimization},
  author={Renqian Luo and Fei Tian and Tao Qin and Tie-Yan Liu},
  booktitle={NeurIPS},
  year={2018}
}
Automatic neural architecture design has shown its potential in discovering powerful neural network architectures. Existing methods, whether based on reinforcement learning (RL) or evolutionary algorithms (EA), conduct architecture search in a discrete space, which is highly inefficient. In this paper, we propose a simple and efficient method for automatic neural architecture design based on continuous optimization. We call this new approach neural architecture optimization (NAO). There are three…
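Per the paper, NAO's loop maps a discrete architecture into a continuous embedding via an encoder, improves the embedding by gradient ascent on a learned performance predictor, and decodes the result back into a discrete architecture. Below is a minimal numpy sketch of that loop, with toy stand-ins (a mean-of-embeddings encoder, a linear predictor, a nearest-op decoder) in place of the paper's learned LSTM components; all names and dimensions are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n_ops, seq_len, dim = 5, 4, 8

# Stand-in encoder: mean of per-op embeddings (the paper uses an LSTM encoder).
op_emb = rng.normal(size=(n_ops, dim))

def encode(arch):
    return op_emb[arch].mean(axis=0)

# Stand-in performance predictor f: a fixed linear map; the paper trains it
# jointly on (architecture, accuracy) pairs.
w = rng.normal(size=dim)

def predict(e):
    return float(w @ e)

# Stand-in decoder: nearest operation by dot product, repeated per position
# (crude; the paper uses an attention-equipped LSTM decoder).
def decode(e):
    return [int(np.argmax(op_emb @ e))] * seq_len

# The NAO step: move the embedding along the predictor's gradient, then decode.
arch = [0, 2, 1, 3]
e = encode(arch)
for _ in range(10):
    e = e + 0.1 * w                 # gradient of the linear predictor is w
print(arch, "->", decode(e), "predicted score:", round(predict(e), 3))
```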
Citations

A greedy constructive algorithm for the optimization of neural network architectures
TLDR: Generalizes the adjusted score from linear regression models to neural networks, and presents numerical experiments showing that the adjusted score steers the greedy search toward smaller architectures over larger ones without compromising predictive performance.
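The summary ties a parameter-count penalty to greedy growth. As an illustration only (the paper's exact adjusted score is not quoted here), the sketch below uses the familiar adjusted-R^2 form, accepting a larger candidate only if its raw fit improves enough to offset the extra parameters.

```python
def adjusted_score(r2, n_samples, n_params):
    """Adjusted-R^2-style score: a fit that uses more parameters
    must improve R^2 enough to pay for them."""
    return 1.0 - (1.0 - r2) * (n_samples - 1) / (n_samples - n_params - 1)

# Toy greedy loop over increasingly large candidate architectures:
# keep growing only while the adjusted score improves.
n = 1000
candidates = [(0.80, 50), (0.85, 200), (0.86, 900)]  # (raw R^2, #parameters)
best = float("-inf")
for r2, p in candidates:
    s = adjusted_score(r2, n, p)
    print(f"params={p:4d}  raw R^2={r2:.2f}  adjusted={s:.3f}")
    if s <= best:
        print("stop: the larger architecture no longer pays for itself")
        break
    best = s
```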
Neural Architecture Search in Embedding Space
TLDR: Experiments demonstrate that the final architecture found by the NASES procedure performs comparably with other popular NAS approaches on CIFAR-10 image classification.
NAT: Neural Architecture Transformer for Accurate and Compact Architectures
TLDR: Extensive experiments on two benchmark datasets demonstrate that architectures transformed by NAT significantly outperform both their original forms and architectures optimized by existing methods.
Neural Architecture Optimization with Graph VAE
TLDR: An efficient NAS approach that optimizes network architectures in a continuous space whose latent representation is built on a variational autoencoder (VAE) and graph neural networks (GNNs), which both yields suitable continuous representations and discovers powerful neural architectures.
Towards Accurate and Compact Architectures via Neural Architecture Transformer
TLDR: Proposes a Neural Architecture Transformer (NAT) that casts architecture optimization as a Markov decision process (MDP) and seeks to replace redundant operations with more efficient ones, such as skip or null connections.
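As a rough illustration of the operation-replacement idea (not NAT's actual learned policy), the sketch below greedily tries replacing each edge's operation with skip or null and keeps any change that improves a hypothetical accuracy-minus-cost score; the score function and cell format are invented for the example.

```python
OPS = ["conv3x3", "skip", "null"]   # candidate replacements, as in the summary

def score(cell):
    # Hypothetical evaluator: an accuracy stand-in minus a cost per
    # convolution, so redundant convolutions are worth replacing.
    acc = 0.9 - 0.01 * sum(op == "null" for op in cell.values())
    cost = 0.02 * sum(op == "conv3x3" for op in cell.values())
    return acc - cost

cell = {(0, 1): "conv3x3", (0, 2): "conv3x3", (1, 2): "conv3x3"}
# Greedy stand-in for NAT's learned MDP policy: try each single-edge
# replacement and keep any that improves the score.
for edge in list(cell):
    for op in OPS:
        trial = {**cell, edge: op}
        if score(trial) > score(cell):
            cell = trial
print(cell, "score:", round(score(cell), 3))
```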
Multi-objective Neural Architecture Search via Predictive Network Performance Optimization
TLDR: Inspired by the graph structure of neural networks, proposes BOGCN-NAS, a NAS algorithm that uses Bayesian optimization with a graph convolutional network (GCN) predictor to adaptively incorporate node structure and approximate an architecture's performance.
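A one-layer GCN predictor over a cell's adjacency matrix can be sketched in a few lines. This is a generic GCN readout used as an illustration, not BOGCN-NAS's exact model; inside the full method, such predictions would guide which architectures Bayesian optimization trains next.

```python
import numpy as np

rng = np.random.default_rng(2)

def gcn_predict(adj, feats, w1, w2):
    """One-layer GCN with mean readout as a stand-in accuracy predictor."""
    a_hat = adj + np.eye(len(adj))                     # add self-loops
    d_inv = 1.0 / a_hat.sum(axis=1, keepdims=True)     # row-normalize
    h = np.maximum(d_inv * (a_hat @ feats @ w1), 0.0)  # ReLU(D^-1 A X W1)
    return float(h.mean(axis=0) @ w2)

# Toy: score two candidate 3-node cells with random (untrained) weights.
feats = np.eye(3)                                      # one-hot op features
w1, w2 = rng.normal(size=(3, 4)), rng.normal(size=4)
chain = np.array([[0, 1, 0], [0, 0, 1], [0, 0, 0]], float)
dense = np.array([[0, 1, 1], [0, 0, 1], [0, 0, 0]], float)
for name, adj in [("chain", chain), ("dense", dense)]:
    print(name, round(gcn_predict(adj, feats, w1, w2), 3))
```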
A scalable algorithm for the optimization of neural network architectures
TLDR: Numerical results on benchmark datasets show that the proposed algorithm, Greedy Search for Neural Network Architecture, outperforms state-of-the-art hyperparameter optimization algorithms in both the predictive performance attainable by the selected architecture and the time-to-solution of the optimization.
Neural Architecture Search with an Efficient Multiobjective Evolutionary Framework
TLDR: EMONAS combines a search space covering both the macro- and micro-structure of the architecture with a surrogate-assisted multiobjective evolutionary algorithm that efficiently searches for the best hyperparameters using a Random Forest surrogate and guided selection probabilities.
Improving the Efficient Neural Architecture Search via Rewarding Modifications
TLDR: Proposes Improved-ENAS, a refinement of ENAS that augments the reinforcement-learning training by modifying the reward of each tested architecture according to the results of previously tested architectures.
Cyclic Differentiable Architecture Search
TLDR: A cyclic differentiable architecture search framework (CDARTS) that builds a cyclic feedback mechanism between the search and evaluation networks, enabling the topology to evolve to fit the final evaluation network.
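CDARTS builds on differentiable architecture search, whose core trick is a continuous relaxation: each edge outputs a softmax-weighted mixture of all candidate operations, making the architecture parameters trainable by gradient descent. A minimal sketch of that relaxation (generic DARTS-style, not CDARTS's full cyclic framework):

```python
import numpy as np

rng = np.random.default_rng(7)

# Candidate operations at one edge of the cell.
ops = [lambda v: v, np.tanh, lambda v: np.maximum(v, 0.0)]
alpha = np.zeros(len(ops))   # architecture parameters, learned by gradient

def mixed_edge(x, alpha):
    """Edge output = softmax(alpha)-weighted sum of all candidate ops,
    so the architecture choice becomes differentiable in alpha."""
    w = np.exp(alpha) / np.exp(alpha).sum()
    return sum(wi * op(x) for wi, op in zip(w, ops))

x = rng.normal(size=4)
print(mixed_edge(x, alpha).round(3))
```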

References

Showing 1-10 of 61 references.
Efficient Architecture Search by Network Transformation
TLDR: Proposes a framework for efficient architecture search that explores the architecture space starting from the current network and reusing its weights, employing a reinforcement learning agent as the meta-controller whose actions grow the network's depth or layer width with function-preserving transformations.
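The "function-preserving transformation" mentioned above can be made concrete with a Net2Net-style widening step: duplicate a hidden unit's incoming weights and split its outgoing weights so the widened network computes exactly the same function. A small numpy sketch with invented layer shapes:

```python
import numpy as np

rng = np.random.default_rng(6)
D, H, O = 2, 3, 2
W1, W2 = rng.normal(size=(H, D)), rng.normal(size=(O, H))
x = rng.normal(size=D)

def forward(W1, W2, x):
    return W2 @ np.maximum(W1 @ x, 0)   # two-layer ReLU network

def net2wider(W1, W2, idx):
    """Function-preserving widening: duplicate hidden unit `idx`
    and split its outgoing weights, leaving the output unchanged."""
    W1 = np.vstack([W1, W1[idx:idx + 1]])    # copy incoming weights
    W2 = np.hstack([W2, W2[:, idx:idx + 1]]) # copy outgoing column
    W2[:, idx] /= 2.0
    W2[:, -1] /= 2.0
    return W1, W2

y0 = forward(W1, W2, x)
y1 = forward(*net2wider(W1, W2, 1), x)
print(np.allclose(y0, y1))   # True: the transformation preserves the function
```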
Efficient Neural Architecture Search via Parameter Sharing
TLDR: Efficient Neural Architecture Search (ENAS) is a fast and inexpensive approach to automatic model design that establishes a new state of the art among methods without post-training processing and delivers strong empirical performance using far fewer GPU-hours.
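The parameter sharing at the heart of ENAS can be illustrated with a toy weight bank: every sampled child architecture indexes into one shared set of per-operation weights rather than training its own. A sketch under that simplification (the real method shares weights across a full computational DAG):

```python
import numpy as np

rng = np.random.default_rng(3)
n_ops, dim = 3, 4

# One shared weight matrix per candidate operation. Children reuse these
# instead of training from scratch, the source of ENAS's large speedup.
shared = rng.normal(size=(n_ops, dim, dim)) / np.sqrt(dim)

def child_forward(arch, x):
    for op in arch:              # a child here is just a sequence of op ids
        x = np.tanh(shared[op] @ x)
    return x

x = rng.normal(size=dim)
for _ in range(3):
    arch = [int(i) for i in rng.integers(0, n_ops, size=2)]
    print(arch, child_forward(arch, x).round(3))
```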
Designing Neural Network Architectures using Reinforcement Learning
TLDR: Introduces MetaQNN, a reinforcement-learning-based meta-modeling algorithm that automatically generates high-performing CNN architectures for a given learning task; the resulting networks beat existing networks designed with the same layer types and are competitive with state-of-the-art methods that use more complex layer types.
Hierarchical Representations for Efficient Architecture Search
TLDR: Efficiently discovers architectures that outperform a large number of manually designed models for image classification, obtaining a top-1 error of 3.6% on CIFAR-10 and 20.3% when transferred to ImageNet, competitive with the best existing neural architecture search approaches.
Reinforcement Learning for Architecture Search by Network Transformation
TLDR: A reinforcement learning framework for automatic architecture design in which the action grows the network's depth or layer width from the current architecture while preserving its function, saving a large amount of computational cost.
Neural Architecture Search with Reinforcement Learning
TLDR: Uses a recurrent network to generate model descriptions of neural networks and trains this RNN with reinforcement learning to maximize the expected accuracy of the generated architectures on a validation set.
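The controller-plus-policy-gradient loop this paper introduced is easy to sketch. Below, the RNN controller is simplified to independent per-position softmax logits, and the reward is a stand-in for validation accuracy; the update itself is the standard REINFORCE rule with a moving-average baseline.

```python
import numpy as np

rng = np.random.default_rng(1)
n_ops, seq_len, lr = 4, 3, 0.5

# Simplified controller: independent softmax logits per position
# (the paper uses an RNN that emits one decision per step).
logits = np.zeros((seq_len, n_ops))

def toy_reward(arch):
    # Stand-in for "train the child network, measure validation accuracy":
    # here op 2 is pretended to be best at every position.
    return sum(op == 2 for op in arch) / seq_len

baseline = 0.0
for step in range(200):
    probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    arch = [rng.choice(n_ops, p=p) for p in probs]
    r = toy_reward(arch)
    baseline = 0.9 * baseline + 0.1 * r        # moving-average baseline
    for t, op in enumerate(arch):              # REINFORCE update
        grad = -probs[t]                       # d log pi / d logits ...
        grad[op] += 1.0                        # ... = onehot - probs
        logits[t] += lr * (r - baseline) * grad
print("final argmax architecture:", list(np.argmax(logits, axis=1)))
```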
Progressive Neural Architecture Search
We propose a new method for learning the structure of convolutional neural networks (CNNs) that is more efficient than recent state-of-the-art methods based on reinforcement learning and evolutionary…
Accelerating Neural Architecture Search using Performance Prediction
TLDR: Standard frequentist regression models can predict the final performance of partially trained configurations from features based on the network architecture, hyperparameters, and time-series validation data; the proposed early-stopping method obtains speedups of up to 6x in both hyperparameter optimization and meta-modeling.
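A minimal version of the idea: fit an ordinary least-squares model from early learning-curve points to final accuracy, then stop configurations whose predicted final score falls below the best seen so far. Everything below (the feature choice, the threshold, the synthetic curves) is illustrative, not the paper's exact model.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic monotone "learning curves": 50 runs, 10 epochs each.
curves = np.sort(rng.uniform(0.3, 0.7, size=(50, 10)), axis=1)
final = curves[:, -1]                     # final validation accuracy
early = curves[:, :3]                     # first 3 epochs as features
X = np.hstack([early, np.ones((50, 1))])  # add a bias column
w, *_ = np.linalg.lstsq(X, final, rcond=None)

# Predict the outcome of a new, partially trained configuration.
new_curve = np.sort(rng.uniform(0.3, 0.7, size=10))
pred = np.hstack([new_curve[:3], 1.0]) @ w
best_so_far = 0.65
print(f"predicted final accuracy: {pred:.3f}")
if pred < best_so_far:                    # early-stop rule from the prediction
    print("early-stop this configuration")
```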
Regularized Evolution for Image Classifier Architecture Search
TLDR: Evolves an image classifier, AmoebaNet-A, that surpasses hand-designed models for the first time, and gives evidence that evolution can obtain results faster with the same hardware, especially in the earlier stages of the search.
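The paper's aging-evolution loop is compact enough to sketch directly: a fixed-size population held in a queue, tournament selection of a parent, one mutation, and removal of the oldest individual rather than the worst. The fitness function below is a stand-in for training and evaluating an image classifier.

```python
import collections
import random

random.seed(0)
n_ops, seq_len = 4, 6

def fitness(arch):
    # Stand-in for training and evaluating an image classifier.
    return sum(arch) / (n_ops * seq_len)

def mutate(arch):
    child = list(arch)
    child[random.randrange(seq_len)] = random.randrange(n_ops)
    return child

population = collections.deque(maxlen=20)   # aging: the oldest dies first
history = []
for _ in range(20):                         # random initial population
    arch = [random.randrange(n_ops) for _ in range(seq_len)]
    population.append((arch, fitness(arch)))

for _ in range(200):
    sample = random.sample(list(population), 5)       # tournament selection
    parent = max(sample, key=lambda x: x[1])[0]
    child = mutate(parent)
    population.append((child, fitness(child)))        # deque evicts the oldest
    history.append(max(population, key=lambda x: x[1]))

print("best found:", max(history, key=lambda x: x[1]))
```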
Genetic CNN
Lingxi Xie, A. Yuille. 2017 IEEE International Conference on Computer Vision (ICCV), 2017.
TLDR: The core idea is an encoding method that represents each network structure as a fixed-length binary string, enabling efficient exploration of the large search space.
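The fixed-length binary encoding can be sketched as one bit per ordered node pair within a stage, so a genome decodes to a DAG's edge list; the node count and decoding details here are illustrative.

```python
import random

random.seed(0)
n_nodes = 4   # nodes in one stage; edges go between ordered node pairs
code_len = n_nodes * (n_nodes - 1) // 2   # one bit per possible connection

def decode_edges(bits):
    """Map a fixed-length binary string to a DAG's edge list."""
    edges, k = [], 0
    for j in range(1, n_nodes):
        for i in range(j):
            if bits[k]:
                edges.append((i, j))
            k += 1
    return edges

genome = [random.randint(0, 1) for _ in range(code_len)]
print(genome, "->", decode_edges(genome))
```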