# Diffusion Normalizing Flow

    @article{Zhang2021DiffusionNF,
      title   = {Diffusion Normalizing Flow},
      author  = {Qinsheng Zhang and Yongxin Chen},
      journal = {ArXiv},
      year    = {2021},
      volume  = {abs/2110.07579}
    }

We present a novel generative modeling method called diffusion normalizing flow based on stochastic differential equations (SDEs). The algorithm consists of two neural SDEs: a forward SDE that gradually adds noise to transform the data into Gaussian random noise, and a backward SDE that gradually removes the noise to sample from the data distribution. By jointly training the two neural SDEs to minimize a common cost function that quantifies the difference between the two, the…
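
The forward/backward structure can be written in the SDE notation standard in score-based diffusion models; this is a generic sketch, not the paper's exact notation (the symbols $f_\theta$, $g$, and $s_\phi$ are illustrative):

```latex
% Forward SDE: a learnable drift f_theta gradually noises the data
dx = f_\theta(x, t)\,dt + g(t)\,dw

% Backward (reverse-time) SDE: the score model s_phi removes the noise
dx = \left[ f_\theta(x, t) - g(t)^2\, s_\phi(x, t) \right] dt + g(t)\,d\bar{w}
```

Here $s_\phi(x, t)$ plays the role of the score $\nabla_x \log p_t(x)$; what distinguishes this setup from diffusion models with a fixed forward process is that the forward drift is itself trained jointly with the backward one.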


## 3 Citations

Generalized Normalizing Flows via Markov Chains

- Computer Science, Mathematics
- 2021

This chapter considers stochastic normalizing flows as a pair of Markov chains fulfilling certain properties and shows how many state-of-the-art models for data generation fit into this framework.

A Unified Approach to Variational Autoencoders and Stochastic Normalizing Flows via Markov Chains

- Computer Science · ArXiv
- 2021

This paper considers stochastic normalizing flows as a pair of Markov chains fulfilling some properties and shows that many state-of-the-art models for data generation fit into this framework.

Bayesian Learning via Neural Schrödinger-Föllmer Flows

- Computer Science, Mathematics · ArXiv
- 2021

A new framework for approximate Bayesian inference on large datasets based on stochastic control is explored, and the existing theoretical guarantees of this framework are discussed and adapted.

## References

Showing 1–10 of 43 references

Improved Denoising Diffusion Probabilistic Models

- Computer Science, Mathematics · ICML
- 2021

This work shows that with a few simple modifications, DDPMs can achieve competitive log-likelihoods while maintaining high sample quality. It also finds that learning the variances of the reverse diffusion process allows sampling with an order of magnitude fewer forward passes at a negligible cost in sample quality.

Generative Modeling by Estimating Gradients of the Data Distribution

- Computer Science, Mathematics · NeurIPS
- 2019

A new generative model where samples are produced via Langevin dynamics using gradients of the data distribution estimated with score matching, which allows flexible model architectures, requires no sampling during training or the use of adversarial methods, and provides a learning objective that can be used for principled model comparisons.
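
The sampling procedure described above can be illustrated with plain (unadjusted) Langevin dynamics; `score_fn` stands in for the learned score network, and all names here are illustrative:

```python
import numpy as np

def langevin_sample(score_fn, x0, step=0.1, n_steps=500, rng=None):
    """Unadjusted Langevin dynamics:
    x <- x + (step / 2) * score(x) + sqrt(step) * z,  with z ~ N(0, I)."""
    rng = np.random.default_rng(rng)
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        z = rng.standard_normal(x.shape)
        x = x + 0.5 * step * score_fn(x) + np.sqrt(step) * z
    return x

# Toy check: the score of N(0, 1) is -x, so chains started far away
# should settle near a standard normal.
samples = langevin_sample(lambda x: -x, np.full(1000, 5.0), rng=0)
```

In the score-based models cited here, the hand-written score `-x` is replaced by a network trained with score matching, and the step size is annealed across a sequence of noise levels.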

Flow++: Improving Flow-Based Generative Models with Variational Dequantization and Architecture Design

- Computer Science, Mathematics · ICML
- 2019

Flow++ is proposed, a new flow-based model that is now the state-of-the-art non-autoregressive model for unconditional density estimation on standard image benchmarks, and has begun to close the significant performance gap that has so far existed between autoregressive and flow-based models.

Deep Unsupervised Learning using Nonequilibrium Thermodynamics

- Computer Science, Mathematics · ICML
- 2015

This work develops an approach to systematically and slowly destroy structure in a data distribution through an iterative forward diffusion process, then learns a reverse diffusion process that restores structure in data, yielding a highly flexible and tractable generative model of the data.
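
The iterative forward noising process described above can be sketched as a simple Gaussian diffusion; the variance schedule `betas` and all names are illustrative, not the paper's exact parameterization:

```python
import numpy as np

def forward_diffusion(x0, betas, rng=None):
    """Gradually destroy structure: x_t = sqrt(1 - b_t) * x_{t-1} + sqrt(b_t) * eps."""
    rng = np.random.default_rng(rng)
    x = np.array(x0, dtype=float)
    trajectory = [x]
    for beta in betas:
        x = np.sqrt(1.0 - beta) * x + np.sqrt(beta) * rng.standard_normal(x.shape)
        trajectory.append(x)
    return trajectory

# After enough steps the samples are driven toward N(0, 1),
# regardless of where they started.
traj = forward_diffusion(np.full(2000, 3.0), betas=[0.02] * 300, rng=0)
```

The generative model is then the learned reverse of this chain, restoring structure one step at a time.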

Neural Autoregressive Flows

- Computer Science, Mathematics · ICML
- 2018

It is demonstrated that the proposed neural autoregressive flows (NAF) are universal approximators for continuous probability distributions, and their greater expressivity allows them to better capture multimodal target distributions.

FFJORD: Free-form Continuous Dynamics for Scalable Reversible Generative Models

- Computer Science, Mathematics · ICLR
- 2019

This paper uses Hutchinson's trace estimator to give a scalable unbiased estimate of the log-density and demonstrates the approach on high-dimensional density estimation, image generation, and variational inference, achieving the state-of-the-art among exact likelihood methods with efficient sampling.
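
Hutchinson's trace estimator, the key trick mentioned here, replaces an exact trace with an average of quadratic forms over random probe vectors, needing only matrix-vector products. A minimal sketch (names are illustrative):

```python
import numpy as np

def hutchinson_trace(matvec, dim, n_probes=2000, rng=None):
    """Estimate tr(A) as the average of v^T A v over Rademacher probes v,
    touching A only through matrix-vector products."""
    rng = np.random.default_rng(rng)
    total = 0.0
    for _ in range(n_probes):
        v = rng.choice([-1.0, 1.0], size=dim)
        total += v @ matvec(v)
    return total / n_probes

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])   # true trace = 5
approx = hutchinson_trace(lambda v: A @ v, dim=2, rng=0)
```

In FFJORD the matrix is the Jacobian of the learned dynamics and `matvec` is a vector-Jacobian product from automatic differentiation, so the log-density estimate never forms the Jacobian explicitly.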

Auto-Encoding Variational Bayes

- Mathematics, Computer Science · ICLR
- 2014

A stochastic variational inference and learning algorithm is introduced that scales to large datasets and, under some mild differentiability conditions, even works in the intractable case.
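
A key device behind this algorithm is the reparameterization trick: a sample is rewritten as a deterministic function of the parameters plus exogenous noise, so gradients can flow through the sampling step. A minimal sketch (function and variable names are illustrative):

```python
import numpy as np

def reparameterize(mu, log_var, rng=None):
    """Draw z ~ N(mu, sigma^2) as z = mu + sigma * eps with eps ~ N(0, I).
    Because mu and log_var enter deterministically, the draw is
    differentiable with respect to them."""
    rng = np.random.default_rng(rng)
    eps = rng.standard_normal(np.shape(mu))
    return np.asarray(mu) + np.exp(0.5 * np.asarray(log_var)) * eps

# mu = 0, log_var = 0 should yield approximately standard-normal samples.
z = reparameterize(np.zeros(10000), np.zeros(10000), rng=0)
```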

Residual Flows for Invertible Generative Modeling

- Mathematics, Computer Science · NeurIPS
- 2019

The resulting approach, called Residual Flows, achieves state-of-the-art performance on density estimation amongst flow-based models, and outperforms networks that use coupling blocks at joint generative and discriminative modeling.

Variational Inference with Normalizing Flows

- Computer Science, Mathematics · ICML
- 2015

It is demonstrated that the theoretical advantages of having posteriors that better match the true posterior, combined with the scalability of amortized variational approaches, provide a clear improvement in the performance and applicability of variational inference.

A RAD approach to deep mixture models

- Computer Science, Mathematics · DGS@ICLR
- 2019

This Real and Discrete (RAD) approach retains the desirable normalizing flow properties of exact sampling, exact inference, and analytically computable probabilities, while at the same time allowing simultaneous modeling of both continuous and discrete structure in a data distribution.