Corpus ID: 1879195

Concept Formation and Dynamics of Repeated Inference in Deep Generative Models

  • Yoshihiro Nagano, Ryo Karakida, M. Okada
  • Published 2017
  • Computer Science, Mathematics, Biology
  • ArXiv
  • Deep generative models are reported to be useful in broad applications including image generation. Repeated inference between data space and latent space in these models can denoise cluttered images and improve the quality of inferred results. However, previous studies only qualitatively evaluated image outputs in data space, and the mechanism behind the inference has not been investigated. The purpose of the current study is to numerically analyze changes in activity patterns of neurons in the …
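The "repeated inference between data space and latent space" that the abstract describes can be sketched with a toy linear model: encode a noisy input into a low-dimensional latent space, decode it back, and iterate. This is only an illustration under assumed names (`encode`, `decode`, an orthonormal matrix `W`), not the paper's actual network; here the round trip is a projection onto the model's latent subspace, so noise outside that subspace is removed and the iteration settles at a fixed point.

```python
import numpy as np

# Toy linear "generative model": W has orthonormal columns, so
# decode(encode(x)) = W W^T x projects x onto the latent subspace.
# All names here are illustrative assumptions, not from the paper.
rng = np.random.default_rng(0)

d, k = 8, 2                                        # data dim, latent dim
W = np.linalg.qr(rng.standard_normal((d, k)))[0]   # orthonormal decoder basis

def encode(x):
    return W.T @ x          # data space -> latent space

def decode(z):
    return W @ z            # latent space -> data space

# A clean "concept" lying in the model's subspace, plus additive noise.
x_clean = decode(rng.standard_normal(k))
x = x_clean + 0.5 * rng.standard_normal(d)
init_err = np.linalg.norm(x - x_clean)

errors = []
for _ in range(5):
    x = decode(encode(x))   # one round trip of repeated inference
    errors.append(np.linalg.norm(x - x_clean))
```

After the first round trip the error drops below the initial noise level and then stays constant, because the projection is idempotent: the denoised image is a fixed point of the inference dynamics. Nonlinear models behave more richly, which is precisely what the paper sets out to analyze.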

