Corpus ID: 208637478

Normalizing Flows for Probabilistic Modeling and Inference

@article{Papamakarios2021NormalizingFF,
  title={Normalizing Flows for Probabilistic Modeling and Inference},
  author={George Papamakarios and Eric T. Nalisnick and Danilo Jimenez Rezende and Shakir Mohamed and Balaji Lakshminarayanan},
  journal={J. Mach. Learn. Res.},
  year={2021},
  volume={22},
  pages={57:1-57:64}
}
Normalizing flows provide a general mechanism for defining expressive probability distributions, only requiring the specification of a (usually simple) base distribution and a series of bijective transformations. There has been much recent work on normalizing flows, ranging from improving their expressive power to expanding their application. We believe the field has now matured and is in need of a unified perspective. In this review, we attempt to provide such a perspective by describing flows… 
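The construction the abstract alludes to can be written compactly: if u is drawn from the base distribution p_u and T is an invertible, differentiable transformation, the density of x = T(u) follows from the change-of-variables formula (standard flow notation, not tied to this page's excerpt):

```latex
p_{\mathrm{x}}(\mathbf{x})
  = p_{\mathrm{u}}\bigl(T^{-1}(\mathbf{x})\bigr)\,
    \bigl|\det J_{T^{-1}}(\mathbf{x})\bigr|
  = p_{\mathrm{u}}(\mathbf{u})\,
    \bigl|\det J_{T}(\mathbf{u})\bigr|^{-1},
\qquad \mathbf{u} = T^{-1}(\mathbf{x}).
```

For a composition T = T_K ∘ … ∘ T_1, the log absolute Jacobian determinants of the individual layers add, which is what makes stacking many simple bijective transformations tractable.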

Graphical Normalizing Flows

TLDR
The graphical normalizing flow is proposed, a new invertible transformation with either a prescribed or a learnable graphical structure that provides a promising way to inject domain knowledge into normalizing flows while preserving both the interpretability of Bayesian networks and the representation capacity of normalizing flows.

Principled Interpolation in Normalizing Flows

TLDR
This paper changes the base distribution to a Dirichlet or von Mises-Fisher distribution to enforce a fixed norm and allow a principled way of interpolating, and shows superior performance in terms of bits per dimension, Fréchet Inception Distance (FID), and Kernel Inception Distance (KID) scores.
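For intuition on why a fixed norm helps: once all latents lie on a sphere, interpolating along the sphere rather than along straight lines keeps intermediate points on the support of the base distribution. A generic spherical linear interpolation (slerp) sketch follows; it is an illustration of the idea, not the paper's exact procedure, and the function name is ours:

```python
import numpy as np

def slerp(z0, z1, t):
    """Spherical linear interpolation between two equal-norm latent vectors.

    Generic illustration of interpolating on a fixed-norm latent space;
    not taken from the paper itself.
    """
    z0_n = z0 / np.linalg.norm(z0)
    z1_n = z1 / np.linalg.norm(z1)
    omega = np.arccos(np.clip(np.dot(z0_n, z1_n), -1.0, 1.0))  # angle between latents
    if np.isclose(omega, 0.0):
        return (1 - t) * z0 + t * z1  # nearly parallel: plain lerp is fine
    return (np.sin((1 - t) * omega) * z0 + np.sin(t * omega) * z1) / np.sin(omega)
```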

Mixture of Discrete Normalizing Flows for Variational Inference

TLDR
This work presents a novel algorithm for modeling the posterior distribution of models with discrete latent variables, based on boosting variational inference, and considers mixtures of discrete normalizing flows instead.

Automatic variational inference with cascading flows

TLDR
Cascading flows are introduced: a new family of variational programs that can be constructed automatically from an input probabilistic program, can also be amortized automatically, and achieve much higher performance than both normalizing flows and ASVI on a large set of structured inference problems.

Transforming Gaussian Processes With Normalizing Flows

TLDR
A variational approximation to the resulting Bayesian inference problem is derived, which is as fast as stochastic variational GP regression and makes the model a computationally efficient alternative to other hierarchical extensions of GP priors.

Stochastic Normalizing Flows

  • Hao Wu
  • Computer Science, Mathematics
  • 2020
TLDR
Stochastic Normalizing Flows (SNFs) are proposed: an arbitrary sequence of deterministic invertible functions and stochastic sampling blocks. Experiments illustrate the representational power, sampling efficiency, and asymptotic correctness of SNFs on several benchmarks, including applications to sampling molecular systems in equilibrium.

Approximate Probabilistic Inference with Composed Flows

TLDR
This work proposes a framework for probabilistic inference that trains a new generative model whose composition with the given model approximates the target conditional distribution; the new model can be trained efficiently using variational inference, and the framework also handles conditioning under arbitrary differentiable transformations.

Normalizing Flows: An Introduction and Review of Current Methods

TLDR
The goal of this survey article is to give a coherent and comprehensive review of the literature around the construction and use of Normalizing Flows for distribution learning, in order to provide context and explanation of the models.

Flexible Approximate Inference via Stratified Normalizing Flows

TLDR
An approximate inference procedure is developed that allows explicit control of the bias/variance tradeoff, interpolating between the sampling and the variational regime, and uses a normalizing flow to map the integrand onto a uniform distribution.
...

References

Showing 1-10 of 150 references

Variational Inference with Normalizing Flows

TLDR
It is demonstrated that the theoretical advantages of having posteriors that better match the true posterior, combined with the scalability of amortized variational approaches, provide a clear improvement in performance and applicability of variational inference.
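As a reminder of the kind of flow used in that paper, a single planar transformation has a Jacobian determinant computable in O(D). A minimal NumPy sketch in the spirit of that construction (the reparameterization of u that guarantees invertibility, w^T u >= -1, is omitted here):

```python
import numpy as np

def planar_flow(z, u, w, b):
    """One planar flow layer f(z) = z + u * tanh(w^T z + b).

    z: (batch, dim) samples; u, w: (dim,) parameters; b: scalar.
    Returns the transformed samples and log|det Jacobian| per sample.
    Sketch only; the constraint that makes the layer invertible is not enforced.
    """
    a = z @ w + b                               # (batch,) pre-activations
    f = z + np.outer(np.tanh(a), u)             # transformed samples
    psi = np.outer(1.0 - np.tanh(a) ** 2, w)    # h'(a) * w, shape (batch, dim)
    log_det = np.log(np.abs(1.0 + psi @ u))     # |det df/dz| = |1 + u^T psi(z)|
    return f, log_det
```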

Latent Normalizing Flows for Discrete Sequences

TLDR
A VAE-based generative model is proposed which jointly learns a normalizing flow-based distribution in the latent space and a stochastic mapping to an observed discrete space; in this setting, it is found to be crucial for the flow-based distribution to be highly multimodal.

Normalizing Flows: Introduction and Ideas

TLDR
A Normalizing Flow (NF) is a family of generative models which produces tractable distributions where both sampling and density evaluation can be efficient and exact.

Localised Generative Flows

TLDR
It is proved that a flow must become arbitrarily numerically noninvertible in order to approximate the target closely, and Continuously Indexed Flows (CIFs) are proposed, which replace the single bijection used by normalising flows with a continuously indexed family of bijections.

Residual Flows for Invertible Generative Modeling

TLDR
The resulting approach, called Residual Flows, achieves state-of-the-art performance on density estimation amongst flow-based models, and outperforms networks that use coupling blocks at joint generative and discriminative modeling.
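The invertibility of such residual blocks y = x + g(x) rests on g being a contraction (enforced with spectral normalization in the actual models); inversion is then a Banach fixed-point iteration. A minimal sketch under that assumption, with g standing in for the residual branch:

```python
import numpy as np

def invert_residual_block(y, g, n_iters=100):
    """Invert y = x + g(x) by the fixed-point iteration x <- y - g(x).

    Converges when g has Lipschitz constant < 1; g is any callable mapping
    arrays to arrays of the same shape. Illustration only.
    """
    x = np.array(y, copy=True)
    for _ in range(n_iters):
        x = y - g(x)
    return x
```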

Normalizing Flows: An Introduction and Review of Current Methods

TLDR
The goal of this survey article is to give a coherent and comprehensive review of the literature around the construction and use of Normalizing Flows for distribution learning, in order to provide context and explanation of the models.

Auto-Encoding Variational Bayes

TLDR
A stochastic variational inference and learning algorithm that scales to large datasets and, under some mild differentiability conditions, even works in the intractable case is introduced.
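The identity behind that algorithm is the reparameterized evidence lower bound; for a Gaussian encoder it reads (standard VAE notation, not the paper's exact symbols):

```latex
\mathcal{L}(\theta, \phi; \mathbf{x})
  = \mathbb{E}_{q_\phi(\mathbf{z} \mid \mathbf{x})}
      \bigl[\log p_\theta(\mathbf{x} \mid \mathbf{z})\bigr]
  - \mathrm{KL}\bigl(q_\phi(\mathbf{z} \mid \mathbf{x}) \,\|\, p(\mathbf{z})\bigr),
\qquad
\mathbf{z} = \boldsymbol{\mu}_\phi(\mathbf{x})
           + \boldsymbol{\sigma}_\phi(\mathbf{x}) \odot \boldsymbol{\epsilon},
\quad \boldsymbol{\epsilon} \sim \mathcal{N}(\mathbf{0}, \mathbf{I}),
```

so that low-variance gradients with respect to the variational parameters can be taken through the sampling step.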

Learning in Implicit Generative Models

TLDR
This work develops likelihood-free inference methods and highlights hypothesis testing as a principle for learning in implicit generative models, from which it is able to derive the objective function used by GANs, as well as many other related objectives.

Discrete Flows: Invertible Generative Models of Discrete Data

TLDR
It is shown that flows can in fact be extended to discrete events, under a simple change-of-variables formula that does not require log-determinant-Jacobian computations.
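The formula in question is the change of variables for a bijection f between discrete spaces: probability mass is relocated rather than stretched, so no Jacobian factor appears:

```latex
p_{\mathrm{y}}(\mathbf{y}) = p_{\mathrm{x}}\bigl(f^{-1}(\mathbf{y})\bigr).
```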

Sylvester Normalizing Flows for Variational Inference

TLDR
Sylvester normalizing flows remove the well-known single-unit bottleneck from planar flows, making a single transformation much more flexible, and are compared against planar flows and inverse autoregressive flows.
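The bottleneck removal comes from replacing the rank-one update of a planar flow with a rank-M update, with Sylvester's determinant identity keeping the Jacobian cheap. In generic notation (the paper further parameterizes A and B with orthogonal and triangular factors):

```latex
\mathbf{z}' = \mathbf{z} + \mathbf{A}\, h(\mathbf{B}\mathbf{z} + \mathbf{b}),
\qquad
\det\!\left(\frac{\partial \mathbf{z}'}{\partial \mathbf{z}}\right)
  = \det\bigl(\mathbf{I}_M
      + \operatorname{diag}\bigl(h'(\mathbf{B}\mathbf{z} + \mathbf{b})\bigr)\,
        \mathbf{B}\mathbf{A}\bigr),
```

with A ∈ R^{D×M} and B ∈ R^{M×D}, so only an M×M determinant is ever needed.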
...