Neural Architecture Optimization with Graph VAE
@article{Li2020NeuralAO,
  title   = {Neural Architecture Optimization with Graph VAE},
  author  = {Jian Li and Y. Liu and J. Liu and Weiping Wang},
  journal = {ArXiv},
  year    = {2020},
  volume  = {abs/2006.10310}
}
Owing to their high computational efficiency in a continuous space, gradient-based optimization methods have shown great potential in the neural architecture search (NAS) domain. Mapping network representations from the discrete space to a latent space is the key to discovering novel architectures; however, existing gradient-based methods fail to fully characterize the networks. In this paper, we propose an efficient NAS approach that optimizes network architectures in a continuous space, where the…
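The abstract sketches the general recipe behind gradient-based NAS in a latent space: encode a discrete architecture graph with a VAE, move through the latent space along the gradient of a performance predictor, and decode the result back into a graph. Below is a minimal, self-contained sketch of that recipe, not the paper's implementation; the `GraphVAE` class, the MLP surrogate predictor, and all dimensions (a 7-node, 5-operation cell and a 16-dimensional latent) are illustrative assumptions.

```python
# Illustrative sketch of latent-space architecture optimization (NOT the
# paper's code): encode an architecture, ascend the gradient of a surrogate
# performance predictor in latent space, then decode the optimized point.
import torch
import torch.nn as nn

LATENT = 16
N_NODES, N_OPS = 7, 5  # assumed cell size, e.g. NAS-Bench-101-like

class GraphVAE(nn.Module):
    def __init__(self):
        super().__init__()
        # Flattened adjacency matrix plus one-hot operation labels.
        in_dim = N_NODES * N_NODES + N_NODES * N_OPS
        self.enc = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU(),
                                 nn.Linear(64, 2 * LATENT))
        self.dec = nn.Sequential(nn.Linear(LATENT, 64), nn.ReLU(),
                                 nn.Linear(64, in_dim))

    def encode(self, x):
        mu, logvar = self.enc(x).chunk(2, dim=-1)
        return mu, logvar

    def decode(self, z):
        # Logits over adjacency entries and operation choices.
        return self.dec(z)

# Surrogate predictor mapping a latent point to estimated accuracy.
predictor = nn.Sequential(nn.Linear(LATENT, 32), nn.ReLU(), nn.Linear(32, 1))

def optimize_architecture(vae, x, steps=50, lr=0.1):
    """Move a seed architecture through latent space toward higher
    predicted performance, then decode the result (illustrative only)."""
    mu, _ = vae.encode(x)
    z = mu.detach().clone().requires_grad_(True)
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = -predictor(z).sum()  # maximize predicted accuracy
        loss.backward()
        opt.step()
    return vae.decode(z.detach())

vae = GraphVAE()
seed = torch.randn(1, N_NODES * N_NODES + N_NODES * N_OPS)
new_arch_logits = optimize_architecture(vae, seed)
print(new_arch_logits.shape)
```

In practice the encoder and decoder would be graph neural networks trained with a reconstruction-plus-KL objective, and the predictor would be fit to measured accuracies; the sketch only shows the latent-space optimization loop itself.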
One Citation
Weight-Sharing Neural Architecture Search: A Battle to Shrink the Optimization Gap. ArXiv, 2020.