Corpus ID: 238856710

Diffusion Normalizing Flow

@article{Zhang2021DiffusionNF,
  title={Diffusion Normalizing Flow},
  author={Qinsheng Zhang and Yongxin Chen},
  journal={ArXiv},
  year={2021},
  volume={abs/2110.07579}
}
We present a novel generative modeling method called diffusion normalizing flow based on stochastic differential equations (SDEs). The algorithm consists of two neural SDEs: a forward SDE that gradually adds noise to the data to transform the data into Gaussian random noise, and a backward SDE that gradually removes the noise to sample from the data distribution. By jointly training the two neural SDEs to minimize a common cost function that quantifies the difference between the two, the… 
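The two-SDE construction described in the abstract can be illustrated with a small Euler-Maruyama simulation. The sketch below is not the authors' implementation: forward_drift and backward_drift are hypothetical placeholders for the two neural networks, and the toy data, noise scale, and step counts are arbitrary assumptions.

import numpy as np

def forward_drift(x, t):
    # hypothetical stand-in for the learned drift of the noising SDE
    return -0.5 * x

def backward_drift(x, t):
    # hypothetical stand-in for the learned drift of the denoising SDE
    return -0.5 * x

def simulate(x0, drift, sigma=1.0, n_steps=100, T=1.0, reverse=False):
    # Euler-Maruyama discretization: x <- x + drift*dt + sigma*sqrt(dt)*noise
    dt = T / n_steps
    x = np.array(x0, dtype=float)
    for i in range(n_steps):
        t = (n_steps - i) * dt if reverse else i * dt
        x = x + drift(x, t) * dt + sigma * np.sqrt(dt) * np.random.randn(*x.shape)
    return x

data = np.random.randn(16, 2) * 0.1 + 3.0        # toy "data" samples
noised = simulate(data, forward_drift)           # forward SDE: data pushed toward noise
samples = simulate(np.random.randn(16, 2), backward_drift, reverse=True)  # backward SDE: noise back toward data

In the method itself the two drifts are trained jointly against a common cost that quantifies the mismatch between the forward and backward processes, as the abstract states.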
Citations

Generalized Normalizing Flows via Markov Chains
TLDR
This chapter considers stochastic normalizing flows as a pair of Markov chains fulfilling some properties and shows how many state-of-the-art models for data generation fit into this framework.
A Unified Approach to Variational Autoencoders and Stochastic Normalizing Flows via Markov Chains
TLDR
This paper considers stochastic normalizing flows as a pair of Markov chains fulfilling some properties and shows that many state-of-the-art models for data generation fit into this framework.
Bayesian Learning via Neural Schrödinger-Föllmer Flows
TLDR
A new framework for approximate Bayesian inference in large datasets based on stochastic control is explored and the existing theoretical guarantees of this framework are discussed and adapted.

References

Showing 1-10 of 43 references
Improved Denoising Diffusion Probabilistic Models
TLDR
This work shows that with a few simple modifications, DDPMs can also achieve competitive log-likelihoods while maintaining high sample quality, and finds that learning variances of the reverse diffusion process allows sampling with an order of magnitude fewer forward passes with a negligible difference in sample quality.
Generative Modeling by Estimating Gradients of the Data Distribution
TLDR
A new generative model where samples are produced via Langevin dynamics using gradients of the data distribution estimated with score matching, which allows flexible model architectures, requires no sampling during training or the use of adversarial methods, and provides a learning objective that can be used for principled model comparisons.
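The sampling mechanism named in this summary, Langevin dynamics driven by an estimated score, can be sketched briefly. This is only an illustration: the analytic score of a one-dimensional Gaussian stands in for the score network that the paper trains with score matching, and the step size and iteration count are arbitrary choices.

import numpy as np

def score(x, mu=3.0, var=1.0):
    # grad_x log N(x; mu, var); placeholder for a learned score network
    return (mu - x) / var

def langevin_sample(x, score_fn, step=1e-2, n_steps=1000):
    # Langevin update: x <- x + (step/2) * score(x) + sqrt(step) * z
    for _ in range(n_steps):
        z = np.random.randn(*x.shape)
        x = x + 0.5 * step * score_fn(x) + np.sqrt(step) * z
    return x

samples = langevin_sample(np.random.randn(512, 1), score)  # samples concentrate near mu = 3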
Flow++: Improving Flow-Based Generative Models with Variational Dequantization and Architecture Design
TLDR
Flow++ is proposed, a new flow-based model that is now the state-of-the-art non-autoregressive model for unconditional density estimation on standard image benchmarks, and has begun to close the significant performance gap that has so far existed between autoregressive models and flow-based models.
Deep Unsupervised Learning using Nonequilibrium Thermodynamics
TLDR
This work develops an approach to systematically and slowly destroy structure in a data distribution through an iterative forward diffusion process, then learns a reverse diffusion process that restores structure in data, yielding a highly flexible and tractable generative model of the data.
Neural Autoregressive Flows
TLDR
It is demonstrated that the proposed neural autoregressive flows (NAF) are universal approximators for continuous probability distributions, and their greater expressivity allows them to better capture multimodal target distributions.
FFJORD: Free-form Continuous Dynamics for Scalable Reversible Generative Models
TLDR
This paper uses Hutchinson's trace estimator to give a scalable unbiased estimate of the log-density and demonstrates the approach on high-dimensional density estimation, image generation, and variational inference, achieving the state-of-the-art among exact likelihood methods with efficient sampling.
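Hutchinson's trace estimator mentioned here is tr(A) ≈ E[v^T A v] for random probes v with E[v v^T] = I. A minimal sketch follows, applied to an explicit matrix so it stays self-contained; FFJORD applies the same estimator to the Jacobian of the learned dynamics through autodiff vector-Jacobian products, which is not reproduced here.

import numpy as np

def hutchinson_trace(matvec, dim, n_probes=1000):
    # Monte Carlo trace estimate using Rademacher probe vectors
    est = 0.0
    for _ in range(n_probes):
        v = np.random.choice([-1.0, 1.0], size=dim)
        est += v @ matvec(v)
    return est / n_probes

A = np.random.randn(50, 50)
print(hutchinson_trace(lambda v: A @ v, 50), np.trace(A))  # the two values should be close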
Auto-Encoding Variational Bayes
TLDR
A stochastic variational inference and learning algorithm that scales to large datasets and, under some mild differentiability conditions, even works in the intractable case is introduced.
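The estimator summarized here relies on the reparameterization trick: writing z = mu + sigma * eps with eps ~ N(0, I) makes the sampling step differentiable in the variational parameters. A minimal sketch, with illustrative shapes rather than the paper's architecture:

import numpy as np

def reparameterize(mu, log_var):
    # z = mu + sigma * eps, differentiable with respect to mu and log_var
    eps = np.random.randn(*mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

mu, log_var = np.zeros(8), np.zeros(8)   # hypothetical encoder outputs
z = reparameterize(mu, log_var)          # latent sample passed to the decoder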
Residual Flows for Invertible Generative Modeling
TLDR
The resulting approach, called Residual Flows, achieves state-of-the-art performance on density estimation amongst flow-based models, and outperforms networks that use coupling blocks at joint generative and discriminative modeling.
Variational Inference with Normalizing Flows
TLDR
It is demonstrated that the theoretical advantages of having posteriors that better match the true posterior, combined with the scalability of amortized variational approaches, provides a clear improvement in performance and applicability of variational inference.
A RAD approach to deep mixture models
TLDR
This Real and Discrete (RAD) approach retains the desirable normalizing flow properties of exact sampling, exact inference, and analytically computable probabilities, while at the same time allowing simultaneous modeling of both continuous and discrete structure in a data distribution.