Corpus ID: 220364453

Gradient Origin Networks

@article{BondTaylor2020GradientON,
  title={Gradient Origin Networks},
  author={Sam Bond-Taylor and Chris G. Willcocks},
  journal={ArXiv},
  year={2020},
  volume={abs/2007.02798}
}
This paper proposes a new type of generative model that quickly learns a latent representation without an encoder. This is achieved by initialising a latent vector with zeros, then using the gradient of the data-fitting loss with respect to this zero vector as the new latent point. The approach has similar characteristics to autoencoders but with a simpler, naturally balanced architecture, and is demonstrated in a variational autoencoder equivalent that permits sampling. This also allows…
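The latent-inference step described above can be sketched numerically. The following is a minimal illustration, not the paper's implementation: it assumes a linear decoder f(z) = Wz and a squared-error data-fitting loss, so the gradient at the origin has a closed form and no autodiff library is needed.

```python
import numpy as np

# Sketch of the Gradient Origin Network latent-inference step, assuming a
# linear decoder f(z) = W @ z and squared-error loss (the paper uses a
# neural decoder; the shape of the inference step is the same).

rng = np.random.default_rng(0)
W = rng.standard_normal((8, 3))   # decoder weights: latent dim 3 -> data dim 8
x = rng.standard_normal(8)        # a data point to represent

# Step 1: initialise the latent vector at the origin.
z0 = np.zeros(3)

# Step 2: gradient of L(z) = ||W z - x||^2 with respect to z, evaluated
# at z0 = 0. Analytically dL/dz = 2 W^T (W z - x), so at the origin this
# is -2 W^T x.
grad = 2 * W.T @ (W @ z0 - x)

# Step 3: the negative gradient itself serves as the latent code; the
# decoder is then trained so that f(z) reconstructs x.
z = -grad
```

Note how the latent code z = 2 Wᵀx is produced by a single gradient evaluation rather than a learned encoder, which is what gives the architecture its "naturally balanced" encoder-free character.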

