Corpus ID: 195767445

The Thermodynamic Variational Objective

@inproceedings{Masrani2019TheTV,
  title={The Thermodynamic Variational Objective},
  author={Vaden Masrani and Tuan Anh Le and Frank Wood},
  booktitle={NeurIPS},
  year={2019}
}
  • Vaden Masrani, Tuan Anh Le, Frank Wood
  • Published in NeurIPS 2019
  • Computer Science, Mathematics
  • We introduce the thermodynamic variational objective (TVO) for learning in both continuous and discrete deep generative models. The TVO arises from a key connection between variational inference and thermodynamic integration that results in a tighter lower bound on the log marginal likelihood than the standard variational evidence lower bound (ELBO), while remaining as broadly applicable. We provide a computationally efficient gradient estimator for the TVO that applies to continuous, discrete… (a sketch of the underlying thermodynamic-integration identity follows below)
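One way to make the connection between variational inference and thermodynamic integration concrete is sketched below. This is a reconstruction from the abstract's description, not a quotation from the paper; the symbols q_\phi(z|x) (inference network), p_\theta(x,z) (generative model), and the partition {\beta_k} of [0,1] follow common VAE conventions and are assumptions of this sketch.

Define the geometric path \(\pi_\beta(z) \propto q_\phi(z\mid x)^{1-\beta}\, p_\theta(x,z)^{\beta}\) with normalizing constant \(Z_\beta\), so that \(Z_0 = 1\) and \(Z_1 = p_\theta(x)\). Thermodynamic integration then gives

\[
\log p_\theta(x) = \log Z_1 - \log Z_0
= \int_0^1 \frac{\partial}{\partial \beta} \log Z_\beta \, d\beta
= \int_0^1 \mathbb{E}_{\pi_\beta}\!\left[\log \frac{p_\theta(x,z)}{q_\phi(z\mid x)}\right] d\beta .
\]

A left Riemann sum over a partition \(0 = \beta_0 < \beta_1 < \dots < \beta_K = 1\),

\[
\mathrm{TVO}(\theta,\phi,x) = \sum_{k=0}^{K-1} (\beta_{k+1}-\beta_k)\,
\mathbb{E}_{\pi_{\beta_k}}\!\left[\log \frac{p_\theta(x,z)}{q_\phi(z\mid x)}\right],
\]

lower-bounds the integral because the integrand is nondecreasing in \(\beta\) (its derivative is a variance, \(\partial^2_\beta \log Z_\beta \ge 0\)). With \(K = 1\) and \(\beta_0 = 0\) the sum reduces to the standard ELBO, which is consistent with the abstract's claim of a tighter bound.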

