DPM-Solver: A Fast ODE Solver for Diffusion Probabilistic Model Sampling in Around 10 Steps

@article{Lu2022DPMSolverAF,
  title={DPM-Solver: A Fast ODE Solver for Diffusion Probabilistic Model Sampling in Around 10 Steps},
  author={Cheng Lu and Yuhao Zhou and Fan Bao and Jianfei Chen and Chongxuan Li and Jun Zhu},
  journal={ArXiv},
  year={2022},
  volume={abs/2206.00927}
}
Diffusion probabilistic models (DPMs) are an emerging class of powerful generative models. Despite their high-quality generation performance, DPMs still suffer from slow sampling, as they generally need hundreds or thousands of sequential function evaluations (steps) of large neural networks to draw a sample. Sampling from DPMs can alternatively be viewed as solving the corresponding diffusion ordinary differential equations (ODEs). In this work, we propose an exact formulation of the solution of…
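
For concreteness, the core identity behind the proposed solver can be sketched as follows (our summary in the paper's notation, where x_t = α_t x_0 + σ_t ε and λ_t is the half log-SNR; the higher-order DPM-Solvers refine the first-order step below with additional Taylor terms):

```latex
% Diffusion ODE in the noise-prediction parameterization:
%   dx_t/dt = f(t) x_t + (g(t)^2 / (2 sigma_t)) eps_theta(x_t, t).
% Exact solution from time s to time t, with lambda_t := log(alpha_t / sigma_t):
x_t = \frac{\alpha_t}{\alpha_s}\, x_s
      - \alpha_t \int_{\lambda_s}^{\lambda_t} e^{-\lambda}\,
        \hat{\epsilon}_\theta(\hat{x}_\lambda, \lambda)\, \mathrm{d}\lambda
% First-order discretization (DPM-Solver-1), with h = lambda_t - lambda_s:
x_t \approx \frac{\alpha_t}{\alpha_s}\, x_s
      - \sigma_t \left(e^{h} - 1\right) \epsilon_\theta(x_s, s)
```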

A Survey on Generative Diffusion Model

TLDR
A diverse range of advanced techniques for speeding up diffusion models is presented, covering training schedules, training-free sampling, mixed modeling, and score & diffusion unification.

Diffusion Models: A Comprehensive Survey of Methods and Applications

TLDR
A comprehensive review of existing variants of the diffusion models and a thorough investigation into the applications of diffusion models, including computer vision, natural language processing, waveform signal processing, multi-modal modeling, molecular graph generation, time series modeling, and adversarial purification.

Diffusion Models in Vision: A Survey

TLDR
A multi-perspective categorization of diffusion models applied in computer vision is introduced, together with their relations to other deep generative models, including variational auto-encoders, generative adversarial networks, energy-based models, autoregressive models, and normalizing flows.

Flow Straight and Fast: Learning to Generate and Transfer Data with Rectified Flow

We present rectified flow, a surprisingly simple approach to learning (neural) ordinary differential equation (ODE) models to transport between two empirically observed distributions π0 and π1, hence…
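
As a quick illustration of the idea (our summary, following the paper's setup), rectified flow fits a velocity field to the straight-line interpolations between coupled samples:

```latex
% Rectified flow objective: regress a velocity field onto the straight-line
% direction between coupled samples (x_0, x_1) drawn from pi_0 and pi_1:
\min_v \int_0^1 \mathbb{E}\big[\, \| (x_1 - x_0) - v(x_t, t) \|^2 \,\big]\, \mathrm{d}t,
\qquad x_t = t\, x_1 + (1 - t)\, x_0
% Samples are then drawn by integrating the ODE dz_t = v(z_t, t) dt from z_0 ~ pi_0.
```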

Dual Diffusion Implicit Bridges for Image-to-Image Translation

TLDR
Dual Diffusion Implicit Bridges (DDIBs), an image translation method based on diffusion models that circumvents training on domain pairs, is presented; the method can be interpreted as a concatenation of source-to-latent and latent-to-target Schrödinger bridges, a form of entropy-regularized optimal transport.
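
A minimal sketch of the DDIB cycle, assuming two pretrained diffusion models exposed through their probability-flow ODE drifts (the `ode_solve` Euler integrator and the drift callables below are illustrative placeholders, not the paper's implementation):

```python
import torch

@torch.no_grad()
def ode_solve(drift, x, t0, t1, steps=100):
    """Euler integration of the probability-flow ODE dx/dt = drift(x, t).

    `drift` stands in for a pretrained model's ODE drift; any fixed-step
    or adaptive solver could be substituted here.
    """
    dt = (t1 - t0) / steps
    t = t0
    for _ in range(steps):
        x = x + dt * drift(x, t)
        t = t + dt
    return x

def ddib_translate(source_drift, target_drift, x_source):
    # Encode: source model's ODE, data (t=0) -> shared Gaussian latent (t=1).
    latent = ode_solve(source_drift, x_source, t0=0.0, t1=1.0)
    # Decode: target model's ODE run backward, latent (t=1) -> target data (t=0).
    return ode_solve(target_drift, latent, t0=1.0, t1=0.0)
```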

References

Showing 1-10 of 42 references.

Gotta Go Fast When Generating Data with Score-Based Models

TLDR
This work carefully devises, piece by piece, an SDE solver with adaptive step sizes tailored to score-based generative models; it generates data 2 to 10 times faster than Euler-Maruyama while achieving better or equal sample quality.
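
The accept/reject step-size control at the heart of such a solver follows the standard adaptive-integration recipe; a minimal sketch, assuming hypothetical lower- and higher-order stepping functions (the tolerances and exponent mirror common practice, not the paper's exact constants):

```python
import numpy as np

def adaptive_sampler(step_low, step_high, x_init, t_start, t_end,
                     abs_tol=1e-2, rel_tol=1e-2, h_init=0.01, theta=0.9):
    """Accept/reject step-size control for reverse-time integration.

    `step_low(x, t, h)` and `step_high(x, t, h)` are placeholder lower- and
    higher-order integration steps; their disagreement estimates local error.
    """
    x, x_prev, t, h = x_init, x_init, t_start, h_init
    while t > t_end:
        h = min(h, t - t_end)
        x_lo = step_low(x, t, h)
        x_hi = step_high(x, t, h)
        # Mixed absolute/relative tolerance, elementwise.
        scale = np.maximum(abs_tol, rel_tol * np.maximum(np.abs(x_lo), np.abs(x_prev)))
        err = np.sqrt(np.mean(((x_hi - x_lo) / scale) ** 2))
        if err <= 1.0:                 # accept: advance with the better estimate
            x_prev, x, t = x, x_hi, t - h
        # Shrink on reject, grow (with a cap) on accept.
        h = theta * h * min(max(err, 1e-8) ** -0.9, 10.0)
    return x
```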

Analytic-DPM: an Analytic Estimate of the Optimal Reverse Variance in Diffusion Probabilistic Models

TLDR
Analytic-DPM is proposed, a training-free inference framework that estimates the analytic forms of the variance and KL divergence of a DPM using the Monte Carlo method and a pretrained score-based model, with a correction for the potential bias caused by the score-based model.

Learning Fast Samplers for Diffusion Models by Differentiating Through Sample Quality

TLDR
This work introduces Differentiable Diffusion Sampler Search (DDSS), a method that optimizes fast samplers for any pre-trained diffusion model by differentiating through sample quality scores, and shows that optimizing the degrees of freedom of GGDM (Generalized Gaussian Diffusion Model) samplers by maximizing sample quality scores via gradient descent leads to improved sample quality.

Variational Diffusion Models

TLDR
A family of diffusion-based generative models is presented that obtains state-of-the-art likelihoods on standard image density estimation benchmarks, outperforming autoregressive models that have dominated these benchmarks for many years, often with faster optimization.

Diffusion Models Beat GANs on Image Synthesis

TLDR
It is shown that diffusion models can achieve image sample quality superior to the current state-of-the-art generative models, and that classifier guidance combines well with upsampling diffusion models, further improving FID to 3.94 on ImageNet 256 × 256 and 3.85 on ImageNet 512 × 512.
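
The classifier-guidance rule can be summarized as shifting the noise prediction along the gradient of a classifier trained on noisy images (our rendering; s is the guidance scale):

```latex
% Classifier-guided noise prediction with guidance scale s and a noisy-image
% classifier p_phi(y | x_t):
\hat{\epsilon}(x_t, y) = \epsilon_\theta(x_t)
  - s\, \sqrt{1 - \bar{\alpha}_t}\; \nabla_{x_t} \log p_\phi(y \mid x_t)
```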

Improved Denoising Diffusion Probabilistic Models

TLDR
This work shows that with a few simple modifications, DDPMs can also achieve competitive log-likelihoods while maintaining high sample quality, and finds that learning variances of the reverse diffusion process allows sampling with an order of magnitude fewer forward passes with a negligible difference in sample quality.
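
The learned reverse variance is parameterized as an interpolation between the two analytic extremes (our summary of the paper's parameterization):

```latex
% Per-dimension interpolation (v is a network output) between beta_t and the
% posterior variance beta-tilde_t:
\Sigma_\theta(x_t, t) = \exp\!\big(v \log \beta_t + (1 - v) \log \tilde{\beta}_t\big),
\qquad \tilde{\beta}_t = \frac{1 - \bar{\alpha}_{t-1}}{1 - \bar{\alpha}_t}\, \beta_t
```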

Score-Based Generative Modeling through Stochastic Differential Equations

TLDR
This work presents a stochastic differential equation (SDE) that smoothly transforms a complex data distribution to a known prior distribution by slowly injecting noise, and a corresponding reverse-time SDE that transforms the prior distribution back into the data distribution by slowly removing the noise.
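
The forward/reverse SDE pair, together with the probability-flow ODE that the diffusion-ODE view above builds on, can be written as follows (standard notation from this framework):

```latex
% Forward (noising) SDE:
\mathrm{d}x = f(x, t)\, \mathrm{d}t + g(t)\, \mathrm{d}w
% Reverse-time (denoising) SDE:
\mathrm{d}x = \big[f(x, t) - g(t)^2\, \nabla_x \log p_t(x)\big]\, \mathrm{d}t
  + g(t)\, \mathrm{d}\bar{w}
% Probability-flow ODE sharing the same marginals p_t:
\frac{\mathrm{d}x}{\mathrm{d}t} = f(x, t) - \tfrac{1}{2}\, g(t)^2\, \nabla_x \log p_t(x)
```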

Denoising Diffusion Implicit Models

TLDR
Denoising diffusion implicit models (DDIMs) are presented, a more efficient class of iterative implicit probabilistic models with the same training procedure as DDPMs that can produce high quality samples faster and perform semantically meaningful image interpolation directly in the latent space.
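
The generative step in this family can be written as below (our rendering, with ᾱ the cumulative product of the αs); setting σ_t = 0 gives the deterministic DDIM sampler:

```latex
% Non-Markovian generative step; sigma_t = 0 is deterministic DDIM, while a
% particular choice of sigma_t > 0 recovers the DDPM sampler:
x_{t-1} = \sqrt{\bar{\alpha}_{t-1}}\,
  \frac{x_t - \sqrt{1 - \bar{\alpha}_t}\, \epsilon_\theta(x_t, t)}{\sqrt{\bar{\alpha}_t}}
  + \sqrt{1 - \bar{\alpha}_{t-1} - \sigma_t^2}\; \epsilon_\theta(x_t, t)
  + \sigma_t z_t, \qquad z_t \sim \mathcal{N}(0, I)
```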

Denoising Diffusion Probabilistic Models

TLDR
High quality image synthesis results are presented using diffusion probabilistic models, a class of latent variable models inspired by considerations from nonequilibrium thermodynamics, which naturally admit a progressive lossy decompression scheme that can be interpreted as a generalization of autoregressive decoding.
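
The training objective reduces to a simple noise-prediction MSE; a minimal sketch, assuming a noise-prediction network `model(x_t, t)` and a precomputed `alpha_bar` schedule (both placeholders):

```python
import torch

def ddpm_simple_loss(model, x0, alpha_bar):
    """L_simple: predict the injected noise at a uniformly sampled timestep.

    `model(x_t, t)` is a placeholder noise-prediction network; `alpha_bar`
    is the (T,) tensor of cumulative products of (1 - beta_t).
    """
    batch = x0.shape[0]
    t = torch.randint(0, alpha_bar.shape[0], (batch,), device=x0.device)
    eps = torch.randn_like(x0)
    a = alpha_bar[t].view(batch, *([1] * (x0.dim() - 1)))
    x_t = a.sqrt() * x0 + (1.0 - a).sqrt() * eps   # q(x_t | x_0) in closed form
    return torch.mean((eps - model(x_t, t)) ** 2)  # noise-prediction MSE
```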

Pseudo Numerical Methods for Diffusion Models on Manifolds

TLDR
A fresh perspective is provided in which DDPMs are treated as solving differential equations on manifolds, and pseudo numerical methods for diffusion models (PNDMs) are proposed; the pseudo linear multi-step method is found to be the best in most situations.
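
The pseudo linear multi-step update combines recent noise predictions with Adams-Bashforth weights before applying a DDIM-style transfer; a sketch of the fourth-order variant (our rendering of the scheme, where φ denotes the transfer step and δ the step size):

```latex
% Fourth-order pseudo linear multi-step: combine the last four noise
% predictions with Adams-Bashforth weights, then apply the DDIM transfer phi:
e_t = \tfrac{1}{24}\big(55\, \epsilon_\theta(x_t, t)
    - 59\, \epsilon_\theta(x_{t+\delta}, t+\delta)
    + 37\, \epsilon_\theta(x_{t+2\delta}, t+2\delta)
    - 9\, \epsilon_\theta(x_{t+3\delta}, t+3\delta)\big),
\qquad x_{t-\delta} = \phi(x_t, e_t, t, t-\delta)
```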