Corpus ID: 240288661

Resampling Base Distributions of Normalizing Flows

@inproceedings{Stimper2021ResamplingBD,
  title={Resampling Base Distributions of Normalizing Flows},
  author={Vincent Stimper and Bernhard Sch{\"o}lkopf and Jos{\'e} Miguel Hern{\'a}ndez-Lobato},
  booktitle={International Conference on Artificial Intelligence and Statistics},
  year={2021}
}
Normalizing flows are a popular class of models for approximating probability distributions. However, their invertible nature limits their ability to model target distributions whose support has a complex topological structure, such as Boltzmann distributions. Several procedures have been proposed to solve this problem, but many of them sacrifice invertibility and, thereby, tractability of the log-likelihood as well as other desirable properties. To address these limitations, we introduce a…
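
Although the abstract is truncated here, the resampled-base idea it introduces can be sketched: keep a Gaussian base, learn an acceptance function a(z) in [0, 1], sample by rejection, and renormalize the density. The following is a minimal sketch under those assumptions; the class name ResampledGaussian, the network architecture, and the Monte Carlo estimate of the normalizer are illustrative choices, not the paper's code.

```python
import math
import torch
import torch.nn as nn

class ResampledGaussian(nn.Module):
    """Sketch of a resampled base distribution: a standard Gaussian
    proposal pi(z) reweighted by a learned acceptance probability
    a(z) in [0, 1], giving q(z) = a(z) * pi(z) / Z with Z = E_pi[a(z)]."""

    def __init__(self, dim, hidden=64, max_tries=100):
        super().__init__()
        self.dim = dim
        self.max_tries = max_tries
        # Acceptance network a(z); any network ending in a sigmoid works.
        self.accept_net = nn.Sequential(
            nn.Linear(dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 1), nn.Sigmoid(),
        )

    def _log_gauss(self, z):
        return -0.5 * (z ** 2).sum(-1) - 0.5 * self.dim * math.log(2 * math.pi)

    def sample(self, num_samples):
        """Rejection sampling: draw z ~ pi, accept with probability a(z);
        after max_tries the remaining proposals are kept as-is (truncation)."""
        out = torch.randn(num_samples, self.dim)
        pending = torch.ones(num_samples, dtype=torch.bool)
        for _ in range(self.max_tries):
            if not pending.any():
                break
            z = torch.randn(int(pending.sum()), self.dim)
            accepted = torch.rand(len(z)) < self.accept_net(z).squeeze(-1)
            idx = pending.nonzero(as_tuple=True)[0][accepted]
            out[idx] = z[accepted]
            pending[idx] = False
        return out

    def log_prob(self, z, num_mc=1024):
        """log q(z) = log a(z) + log pi(z) - log Z, with Z estimated by Monte Carlo."""
        Z = self.accept_net(torch.randn(num_mc, self.dim)).mean()
        return torch.log(self.accept_net(z).squeeze(-1) + 1e-12) \
            + self._log_gauss(z) - torch.log(Z + 1e-12)
```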

Nonlinear MCMC for Bayesian Machine Learning

Provides a convergence guarantee in total variation that uses novel results for long-time convergence and large-particle ("propagation of chaos") convergence.

Designing losses for data-free training of normalizing flows on Boltzmann distributions

This work shows for the first time that imperfect pre-trained models can be further optimized in the absence of training data, and proposes strategies to alleviate the resulting issues, most importantly a new loss function that is well grounded in theory and has suitable optimization properties.

Long-Time Convergence and Propagation of Chaos for Nonlinear MCMC

This paper studies the long-time convergence and uniform strong propagation of chaos for a class of nonlinear Markov chains introduced in Andrieu et al. (2011) for Markov chain Monte Carlo (MCMC), and shows that these nonlinear MCMC techniques are viable for use in real-world high-dimensional inference such as Bayesian neural networks.

Bootstrap Your Flow

This work combines importance sampling and MCMC in a method that leverages the advantages of both approaches: it uses annealed importance sampling (AIS), thereby preserving the ability to compute importance sampling estimates while lowering the variance of the estimate relative to using only the proposal.
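
Since this entry hinges on annealed importance sampling, a generic AIS routine helps make the weight bookkeeping concrete. This is not the paper's implementation; sample_proposal, proposal_log_prob, and target_log_prob are placeholder callables, and the target density may be unnormalized.

```python
import numpy as np

def annealed_importance_sampling(sample_proposal, proposal_log_prob,
                                 target_log_prob, n_samples=1000,
                                 n_steps=10, step_size=0.1, rng=None):
    """Generic AIS: anneal from the proposal to the (unnormalized) target
    through geometric intermediate densities, accumulating log importance
    weights, with one random-walk Metropolis move per temperature."""
    rng = rng or np.random.default_rng()
    betas = np.linspace(0.0, 1.0, n_steps + 1)
    x = sample_proposal(n_samples)                      # (n_samples, dim)
    log_w = np.zeros(n_samples)

    def log_gamma(x, beta):
        # Intermediate density: pi_beta ∝ proposal^(1-beta) * target^beta
        return (1 - beta) * proposal_log_prob(x) + beta * target_log_prob(x)

    for k in range(1, n_steps + 1):
        # Weight update: ratio of successive intermediate densities.
        log_w += log_gamma(x, betas[k]) - log_gamma(x, betas[k - 1])
        # One Metropolis step leaving pi_{beta_k} invariant.
        prop = x + step_size * rng.standard_normal(x.shape)
        log_acc = log_gamma(prop, betas[k]) - log_gamma(x, betas[k])
        accept = np.log(rng.uniform(size=n_samples)) < log_acc
        x[accept] = prop[accept]
    return x, log_w  # samples and their log importance weights
```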

References

SHOWING 1-10 OF 54 REFERENCES

Smooth Normalizing Flows

This work introduces a class of smooth mixture transformations working on both compact intervals and hypertori, and shows that parameter gradients and forces of their numerical inverses can be computed from forward evaluations via the inverse function theorem.
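
The "inverses from forward evaluations" step rests on the inverse function theorem: if y = f(x; θ) and x is recovered numerically, then dx/dy = 1/f'(x) and dx/dθ = −(∂f/∂θ)/f'(x). A minimal one-dimensional sketch with PyTorch autograd follows; the bisection solver and the example map are illustrative, not the paper's smooth mixture transformations.

```python
import torch

def invert_and_grad(f, y, theta, lo=-10.0, hi=10.0, iters=60):
    """1-D sketch: invert a monotonically increasing map y = f(x, theta) by
    bisection, then recover dx/dy and dx/dtheta from a single forward
    evaluation via the inverse function theorem, without backpropagating
    through the iterative solver."""
    # Bisection to find x with f(x, theta) = y (no gradients needed here).
    with torch.no_grad():
        a, b = torch.tensor(lo), torch.tensor(hi)
        for _ in range(iters):
            m = 0.5 * (a + b)
            go_right = f(m, theta) < y
            a = torch.where(go_right, m, a)
            b = torch.where(go_right, b, m)
        x = 0.5 * (a + b)

    # One forward pass at the solution, this time tracking gradients.
    x = x.clone().requires_grad_(True)
    theta = theta.clone().requires_grad_(True)
    df_dx, df_dtheta = torch.autograd.grad(f(x, theta), (x, theta))

    dx_dy = 1.0 / df_dx             # inverse function theorem
    dx_dtheta = -df_dtheta / df_dx  # implicit differentiation of f(x, theta) = y
    return x.detach(), dx_dy, dx_dtheta

# Example with an increasing map f(x, theta) = x + theta * tanh(x):
f = lambda x, t: x + t * torch.tanh(x)
x, dx_dy, dx_dtheta = invert_and_grad(f, torch.tensor(1.5), torch.tensor(0.7))
```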

Stochastic Normalizing Flows

Stochastic Normalizing Flows (SNFs) are proposed: arbitrary sequences of deterministic invertible functions and stochastic sampling blocks. Experiments illustrate the representational power, sampling efficiency, and asymptotic correctness of SNFs on several benchmarks, including applications to sampling molecular systems in equilibrium.
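
A rough sketch of the composition described above, under the assumption that each stochastic block is a Metropolis kernel leaving an intermediate density pi_t invariant: deterministic layers add their log-Jacobian to the running path weight, and MCMC moves add the log-density ratio between the state before and after the move, as in annealed importance sampling. This is a simplified reading, not the paper's code; log_pi_t is a placeholder callable.

```python
import torch

def snf_block(z, log_w, log_scale, shift, log_pi_t, step_size=0.1, n_mcmc=5):
    """One deterministic affine layer followed by one stochastic block.
    The full importance weight additionally requires
    log p_target(z_final) - log p_base(z_initial)."""
    # Deterministic invertible layer (elementwise affine); log|det J| = sum(log_scale).
    z = z * torch.exp(log_scale) + shift          # log_scale, shift: shape (dim,)
    log_w = log_w + log_scale.sum()

    # Stochastic block: random-walk Metropolis targeting the intermediate density pi_t.
    for _ in range(n_mcmc):
        old = log_pi_t(z)                                   # (batch,)
        prop = z + step_size * torch.randn_like(z)
        accept = torch.log(torch.rand(z.shape[0])) < (log_pi_t(prop) - old)
        z = torch.where(accept.unsqueeze(-1), prop, z)
        log_w = log_w + old - log_pi_t(z)                   # density-ratio weight term
    return z, log_w
```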

Normalizing Flows for Probabilistic Modeling and Inference

This review places special emphasis on the fundamental principles of flow design, discusses foundational topics such as expressive power and computational trade-offs, and summarizes the use of flows for tasks such as generative modeling, approximate inference, and supervised learning.
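
The review's central object is the change-of-variables formula log p(x) = log p_z(f(x)) + log|det ∂f/∂x|. A minimal RealNVP-style affine coupling layer makes the log-determinant term concrete; layer names and sizes here are illustrative.

```python
import math
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """Minimal RealNVP-style coupling layer: the first half of the input
    parameterizes an elementwise affine map of the second half, so the
    Jacobian is triangular and its log-determinant is just sum(log_scale)."""

    def __init__(self, dim, hidden=64):
        super().__init__()
        self.d = dim // 2
        self.net = nn.Sequential(
            nn.Linear(self.d, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.d)),
        )

    def forward(self, x):
        x1, x2 = x[:, :self.d], x[:, self.d:]
        log_s, t = self.net(x1).chunk(2, dim=-1)
        log_s = torch.tanh(log_s)                    # keep scales bounded
        z2 = x2 * torch.exp(log_s) + t
        return torch.cat([x1, z2], dim=-1), log_s.sum(dim=-1)

def flow_log_prob(x, layers):
    """Change of variables: log p(x) = log N(f(x); 0, I) + sum of log-dets."""
    z, total_log_det = x, torch.zeros(x.shape[0])
    for layer in layers:
        z, log_det = layer(z)
        total_log_det = total_log_det + log_det
    log_base = -0.5 * (z ** 2).sum(-1) - 0.5 * z.shape[-1] * math.log(2 * math.pi)
    return log_base + total_log_det
```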

Residual Flows for Invertible Generative Modeling

The resulting approach, called Residual Flows, achieves state-of-the-art performance on density estimation amongst flow-based models, and outperforms networks that use coupling blocks at joint generative and discriminative modeling.

SurVAE Flows: Surjections to Bridge the Gap between VAEs and Flows

SurVAE Flows bridge the gap between normalizing flows and VAEs with surjective transformations, wherein the transformations are deterministic in one direction (thereby allowing exact likelihood computation) and stochastic in the reverse direction (hence providing a lower bound on the corresponding likelihood).

Training Normalizing Flows with the Information Bottleneck for Competitive Generative Classification

This work develops the theory and methodology of IB-INNs, a class of conditional normalizing flows in which INNs are trained using the IB objective, and finds that the trade-off parameter in the IB objective controls a mix of generative capability and classification accuracy close to that of standard classifiers.

Normalizing Flows With Multi-Scale Autoregressive Priors

The mAR prior for models with split coupling flow layers (mAR-SCF) can better capture dependencies in complex multimodal data and achieves state-of-the-art density estimation results on MNIST, CIFAR-10, and ImageNet.

Variational Inference with Normalizing Flows

It is demonstrated that the theoretical advantages of having posteriors that better match the true posterior, combined with the scalability of amortized variational approaches, provide a clear improvement in the performance and applicability of variational inference.
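
One of the flows used there for variational posteriors is the planar flow f(z) = z + u h(wᵀz + b), whose log-det-Jacobian has a closed form. A minimal sketch follows; the invertibility constraint on u (wᵀu ≥ −1) is omitted for brevity.

```python
import torch
import torch.nn as nn

class PlanarFlow(nn.Module):
    """Planar flow f(z) = z + u * tanh(w^T z + b) with the closed-form
    log|det df/dz| = log|1 + u^T (1 - tanh^2(w^T z + b)) w|."""

    def __init__(self, dim):
        super().__init__()
        self.u = nn.Parameter(0.01 * torch.randn(dim))
        self.w = nn.Parameter(0.01 * torch.randn(dim))
        self.b = nn.Parameter(torch.zeros(1))

    def forward(self, z):
        lin = z @ self.w + self.b                         # (batch,)
        f = z + self.u * torch.tanh(lin).unsqueeze(-1)    # (batch, dim)
        psi = (1 - torch.tanh(lin) ** 2).unsqueeze(-1) * self.w
        log_det = torch.log(torch.abs(1 + psi @ self.u) + 1e-8)
        return f, log_det
```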

Variational Diffusion Models

A family of diffusion-based generative models is introduced that obtains state-of-the-art likelihoods on standard image density estimation benchmarks, outperforming autoregressive models that have dominated these benchmarks for many years, often with faster optimization.

Semi-Supervised Learning with Normalizing Flows

FlowGMM, an end-to-end approach to generative semi-supervised learning with normalizing flows using a latent Gaussian mixture model, is proposed; it is distinct in its simplicity, unified treatment of labelled and unlabelled data with an exact likelihood, interpretability, and broad applicability beyond image data.
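
The latent Gaussian mixture idea can be made concrete: push inputs through the flow, score the latent code under per-class Gaussians, and classify by the largest class-conditional log-density. A sketch under the assumption of isotropic class covariances; flow here is any invertible map returning (z, log_det), and the function name is illustrative.

```python
import math
import torch

def flowgmm_class_log_probs(x, flow, means, log_sigma):
    """Per-class joint log-density log p(x, y=k) under a flow with a
    Gaussian-mixture latent: log N(f(x); mu_k, sigma^2 I) + log|det df/dx|
    (a uniform class prior is dropped as a constant)."""
    z, log_det = flow(x)                          # z: (batch, dim)
    diff = z.unsqueeze(1) - means.unsqueeze(0)    # (batch, K, dim)
    d = z.shape[-1]
    log_gauss = (-0.5 * (diff ** 2).sum(-1) / torch.exp(2 * log_sigma)
                 - d * log_sigma - 0.5 * d * math.log(2 * math.pi))
    return log_gauss + log_det.unsqueeze(-1)      # (batch, K)

# Classification: pick the class with the highest class-conditional density;
# unlabelled data can be trained on the log-sum-exp marginal over classes.
# y_pred = flowgmm_class_log_probs(x, flow, means, log_sigma).argmax(dim=-1)
```
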
...