Corpus ID: 219792625

Neural Architecture Optimization with Graph VAE

  • Jian Li, Y. Liu, J. Liu, Weiping Wang
  • Published 2020
  • Computer Science, Mathematics
  • ArXiv
  • Due to their high computational efficiency on a continuous space, gradient optimization methods have shown great potential in the neural architecture search (NAS) domain. The mapping of network representations from the discrete space to a latent space is the key to discovering novel architectures; however, existing gradient-based methods fail to fully characterize the networks. In this paper, we propose an efficient NAS approach to optimize network architectures in a continuous space, where the…
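The latent-space search the abstract describes can be sketched end to end: encode a discrete architecture (a DAG's adjacency matrix) into a continuous code, take gradient steps against a differentiable performance surrogate, and decode the result back to a discrete architecture. The sketch below is a minimal, untrained NumPy stand-in under loud assumptions: the linear encoder/decoder replaces the paper's graph VAE, and the quadratic surrogate with optimum `z_star` replaces a learned accuracy predictor; it illustrates the pipeline shape, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 7          # nodes in the architecture DAG (one cell)
D = N * N      # flattened adjacency matrix
Z = 8          # latent dimensionality

# Untrained linear encoder/decoder standing in for the graph VAE
# (illustrative placeholders; the actual model would be a trained GNN).
W_mu = rng.normal(0, 0.1, (Z, D))
W_logvar = rng.normal(0, 0.1, (Z, D))
W_dec = rng.normal(0, 0.1, (D, Z))

def encode(adj):
    """Map a discrete adjacency matrix to a continuous latent code."""
    x = adj.flatten()
    mu, logvar = W_mu @ x, W_logvar @ x
    # Reparameterization trick: z = mu + sigma * eps
    return mu + np.exp(0.5 * logvar) * rng.normal(size=Z)

def decode(z):
    """Map a latent code back to a discrete DAG (upper-triangular edges)."""
    logits = (W_dec @ z).reshape(N, N)
    return np.triu((logits > 0).astype(int), k=1)

# Hypothetical differentiable surrogate for validation accuracy:
# -||z - z_star||^2, with z_star a placeholder for the predictor's optimum.
z_star = rng.normal(size=Z)
grad_perf = lambda z: -2.0 * (z - z_star)

# Gradient-based search in the continuous latent space.
adj0 = np.triu(rng.integers(0, 2, (N, N)), k=1)  # random starting architecture
z = encode(adj0)
for _ in range(100):
    z = z + 0.05 * grad_perf(z)  # ascend the surrogate

best_adj = decode(z)  # decode the optimized code to an architecture
```

Because the surrogate is quadratic, the ascent contracts `z` toward `z_star` geometrically; in the real setting the surrogate is a learned predictor and the encoder/decoder are trained jointly so that nearby latent codes decode to similar architectures.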
    1 Citation
    Weight-Sharing Neural Architecture Search: A Battle to Shrink the Optimization Gap (5 citations)


References

    Efficient Neural Architecture Search via Parameter Sharing (1,067 citations)
    Neural Architecture Optimization (282 citations; highly influential)
    D-VAE: A Variational Autoencoder for Directed Acyclic Graphs (33 citations; highly influential)
    DARTS: Differentiable Architecture Search (1,157 citations; highly influential)
    Neural Architecture Search with Reinforcement Learning (2,281 citations)
    SNAS: Stochastic Neural Architecture Search (362 citations)
    Efficient Neural Architecture Search via Proximal Iterations (30 citations)
    ProxylessNAS: Direct Neural Architecture Search on Target Task and Hardware (634 citations)