VAE with a VampPrior

@inproceedings{Tomczak2018VAEWA,
  title={VAE with a VampPrior},
  author={Jakub M. Tomczak and Max Welling},
  booktitle={AISTATS},
  year={2018}
}
Many different methods for training deep generative models have been introduced in the past. In this paper, we propose to extend the variational auto-encoder (VAE) framework with a new type of prior which we call "Variational Mixture of Posteriors" prior, or VampPrior for short. The VampPrior consists of a mixture distribution (e.g., a mixture of Gaussians) with components given by variational posteriors conditioned on learnable pseudo-inputs. We further extend this prior to a two-layer…
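Concretely, the abstract's construction takes the prior to be a uniform mixture of the variational posterior evaluated at K learnable pseudo-inputs u_k:

    p_\lambda(\mathbf{z}) = \frac{1}{K} \sum_{k=1}^{K} q_\phi(\mathbf{z} \mid \mathbf{u}_k)

Below is a minimal PyTorch sketch of this prior, assuming a diagonal-Gaussian encoder that returns a mean and log-variance per input. The class name, `n_pseudo`, and `input_dim` are illustrative choices, not the authors' reference implementation:

import math

import torch
import torch.nn as nn


class VampPrior(nn.Module):
    """Mixture-of-posteriors prior p(z) = (1/K) * sum_k q(z | u_k),
    with learnable pseudo-inputs u_k passed through the VAE encoder."""

    def __init__(self, encoder, n_pseudo=500, input_dim=784):
        super().__init__()
        self.encoder = encoder  # shared with the VAE's q(z|x)
        # Learnable pseudo-inputs, trained jointly with the rest of the model.
        self.pseudo_inputs = nn.Parameter(0.01 * torch.randn(n_pseudo, input_dim))

    def log_prob(self, z):
        # Encode the pseudo-inputs to obtain the K Gaussian mixture components.
        mu, logvar = self.encoder(self.pseudo_inputs)      # each (K, D)
        z = z.unsqueeze(1)                                 # (B, 1, D)
        mu, logvar = mu.unsqueeze(0), logvar.unsqueeze(0)  # (1, K, D)
        # Diagonal-Gaussian log density of z under each component, summed over D.
        log_comp = -0.5 * ((z - mu) ** 2 / logvar.exp() + logvar
                           + math.log(2 * math.pi)).sum(-1)  # (B, K)
        # Uniform mixture weights: log p(z) = logsumexp_k log q(z | u_k) - log K.
        K = self.pseudo_inputs.shape[0]
        return torch.logsumexp(log_comp, dim=1) - math.log(K)

In the ELBO, this log_prob would replace the standard-normal log p(z); the KL term is then estimated by Monte Carlo as E_q[log q(z|x) - log p_\lambda(z)] using samples from the encoder.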

Citations

[Figure: Citations per Year, 2017–2019]
Semantic Scholar estimates that this publication has 98 citations based on the available data.

