Understanding DDPM Latent Codes Through Optimal Transport
@article{Khrulkov2022UnderstandingDL,
  title   = {Understanding DDPM Latent Codes Through Optimal Transport},
  author  = {Valentin Khrulkov and I. Oseledets},
  journal = {ArXiv},
  year    = {2022},
  volume  = {abs/2202.07477}
}
Diffusion models have recently outperformed alternative approaches to model the distribution of natural images, such as GANs. Such diffusion models allow for deterministic sampling via the probability flow ODE, giving rise to a latent space and an encoder map. While this map has important practical applications, such as estimation of the likelihood, its theoretical properties are not yet fully understood. In the present work, we partially address this question for the popular case of the…
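The encoder map mentioned in the abstract can be made concrete in a few lines. Below is a minimal sketch of encoding via the probability flow ODE with explicit Euler steps, assuming a pretrained score model `score_fn(x, t)` and a linear variance-preserving noise schedule (both are assumptions of this sketch, not artifacts of the paper):

```python
import torch

def encode_via_probability_flow(x0, score_fn, t_grid):
    """Map a data sample to its latent code by integrating the probability
    flow ODE forward in time with explicit Euler steps:

        dx/dt = f(x, t) - 0.5 * g(t)^2 * score_fn(x, t)

    Here f(x, t) = -0.5 * beta(t) * x and g(t)^2 = beta(t) (variance-preserving
    form); `score_fn` is a hypothetical pretrained score network.
    """
    def beta(t):  # linear noise schedule, an assumption for this sketch
        return 0.1 + (20.0 - 0.1) * t

    x = x0.clone()
    for t, t_next in zip(t_grid[:-1], t_grid[1:]):
        drift = -0.5 * beta(t) * x - 0.5 * beta(t) * score_fn(x, t)
        x = x + (t_next - t) * drift  # deterministic: no noise is injected
    return x  # the latent code; integrating backward inverts the map
```

Because no noise is injected, the map from x0 to its latent code is deterministic, and sampling is its inverse: integrate the same ODE backward in time.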
3 Citations
Maximum Likelihood Training of Implicit Nonlinear Diffusion Models
- Computer Science, ArXiv
- 2022
A data-adaptive, nonlinear diffusion process for score-based diffusion models (INDM) is introduced, together with a sampling-friendly latent diffusion, and the sample trajectory of INDM is shown to be closer to an optimal transport than the trajectories of previous approaches.
PriorGrad: Improving Conditional Denoising Diffusion Models with Data-Dependent Adaptive Prior
- Computer Science
- 2021
PriorGrad is proposed to improve the efficiency of conditional diffusion models for speech synthesis by applying an adaptive prior derived from data statistics of the conditioning information. It achieves faster convergence and inference with superior performance, improving perceptual quality and robustness to smaller network capacities, thereby demonstrating the efficiency of a data-dependent adaptive prior.
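For intuition, here is a hedged sketch of the core idea: draw the initial noise from a data-dependent Gaussian rather than N(0, I). The input `cond_stats` (per-dimension variance estimates computed from the conditioner, e.g. mel-spectrogram energy) is a hypothetical interface, not PriorGrad's actual API:

```python
import torch

def sample_adaptive_prior(cond_stats):
    """PriorGrad-style data-dependent prior (a sketch; names are hypothetical).
    Instead of starting the reverse process from N(0, I), draw the initial
    noise from N(0, diag(sigma^2)) with sigma estimated from the conditioning
    information, so e.g. low-energy speech regions start closer to silence."""
    sigma = cond_stats.clamp(min=1e-4).sqrt()  # per-dim std from data statistics
    return sigma * torch.randn_like(sigma)     # x_T ~ N(0, diag(sigma^2))
```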
The flow map of the Fokker–Planck equation does not provide optimal transport
- Mathematics, Applied Mathematics Letters
- 2022
References
Showing 1–10 of 41 references
Improved Denoising Diffusion Probabilistic Models
- Computer Science, ICML
- 2021
This work shows that with a few simple modifications, DDPMs can also achieve competitive log-likelihoods while maintaining high sample quality, and finds that learning variances of the reverse diffusion process allows sampling with an order of magnitude fewer forward passes with a negligible difference in sample quality.
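The learned-variance trick summarized above has a compact form in the paper: the network predicts an interpolation coefficient v, and the reverse-process variance is taken between the two analytic extremes in log space. A small sketch:

```python
import torch

def learned_variance(v, beta_t, beta_tilde_t):
    """Improved-DDPM variance parameterization: the network output v in [0, 1]
    interpolates, in log space, between the two analytic choices for the
    reverse-process variance, beta_t and the posterior variance beta_tilde_t."""
    log_var = v * torch.log(beta_t) + (1.0 - v) * torch.log(beta_tilde_t)
    return torch.exp(log_var)
```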
Denoising Diffusion Implicit Models
- Computer Science, ICLR
- 2021
Denoising diffusion implicit models (DDIMs) are presented, a more efficient class of iterative implicit probabilistic models with the same training procedure as DDPMs that can produce high quality samples faster and perform semantically meaningful image interpolation directly in the latent space.
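The deterministic DDIM update that enables this can be sketched in a few lines (the `abar_*` arguments are tensor-valued cumulative alpha products, and `eps` is the pretrained DDPM's noise prediction at step t):

```python
def ddim_step(x_t, eps, abar_t, abar_prev):
    """One deterministic DDIM update (eta = 0): first recover the predicted
    clean sample from the noise estimate, then re-noise it to the previous
    step's noise level using the same predicted noise."""
    x0_pred = (x_t - (1 - abar_t).sqrt() * eps) / abar_t.sqrt()
    return abar_prev.sqrt() * x0_pred + (1 - abar_prev).sqrt() * eps
```

Since no fresh noise is drawn, running this map repeatedly defines the deterministic latent-to-image decoder, which is why DDIM interpolation in latent space is semantically meaningful.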
ILVR: Conditioning Method for Denoising Diffusion Probabilistic Models
- Computer Science, 2021 IEEE/CVF International Conference on Computer Vision (ICCV)
- 2021
This work proposes Iterative Latent Variable Refinement (ILVR), a method to guide the generative process in DDPM to generate high-quality images based on a given reference image, which allows adaptation of a single DDPM without any additional learning in various image generation tasks.
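The refinement step admits a short sketch: at each reverse step, the low-frequency content of the sample is replaced by that of the equally noised reference. The bilinear low-pass filter below is one plausible choice of the filter phi_N, not necessarily the paper's exact implementation:

```python
import torch.nn.functional as F

def ilvr_refine(x_prev, y_ref_prev, scale=8):
    """One ILVR conditioning step on (N, C, H, W) tensors: keep the sample's
    high frequencies but take the low frequencies from the noised reference,
    i.e. x' = phi(y) + x - phi(x)."""
    def phi(img):  # low-pass filter: downsample by `scale`, then upsample back
        h, w = img.shape[-2:]
        low = F.interpolate(img, scale_factor=1 / scale, mode="bilinear")
        return F.interpolate(low, size=(h, w), mode="bilinear")
    return phi(y_ref_prev) + x_prev - phi(x_prev)
```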
Progressive Distillation for Fast Sampling of Diffusion Models
- Computer Science, ArXiv
- 2022
A method is presented to distill a trained deterministic diffusion sampler that uses many steps into a new diffusion model that takes half as many sampling steps, and it is shown that the full progressive distillation procedure does not take more time than training the original model.
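The distillation target can be sketched directly: run the deterministic teacher for two steps and train the student to reach the same point in one. Here `teacher_step` is a hypothetical one-step DDIM sampler used only for illustration:

```python
def distillation_target(x_t, teacher_step, t, t_mid, t_next):
    """Progressive-distillation target (a sketch): two teacher steps,
    t -> t_mid -> t_next, define the point the student must reach in a
    single step t -> t_next. `teacher_step(x, t_from, t_to)` is assumed
    to be a deterministic (DDIM-style) sampler step."""
    x_mid = teacher_step(x_t, t, t_mid)
    return teacher_step(x_mid, t_mid, t_next)  # student regresses onto this
```

Repeating the procedure halves the step count each round, which is how a many-step sampler is compressed into a few-step one.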
Denoising Diffusion Probabilistic Models
- Computer Science, NeurIPS
- 2020
High quality image synthesis results are presented using diffusion probabilistic models, a class of latent variable models inspired by considerations from nonequilibrium thermodynamics, which naturally admit a progressive lossy decompression scheme that can be interpreted as a generalization of autoregressive decoding.
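The simplified training objective behind these results (the epsilon-prediction loss, often written L_simple) fits in a few lines; `abar` is the 1-D tensor of cumulative alpha products and `model` is a hypothetical noise-prediction network:

```python
import torch

def ddpm_loss(model, x0, abar):
    """Simplified DDPM objective: noise a clean sample to a random step t
    using the closed-form forward process, then regress the model's noise
    prediction onto the true noise."""
    t = torch.randint(0, len(abar), (x0.shape[0],), device=x0.device)
    a = abar[t].view(-1, *([1] * (x0.dim() - 1)))   # broadcast alpha-bar_t
    eps = torch.randn_like(x0)
    x_t = a.sqrt() * x0 + (1 - a).sqrt() * eps       # q(x_t | x_0) sample
    return torch.mean((model(x_t, t) - eps) ** 2)
```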
Deep Unsupervised Learning using Nonequilibrium Thermodynamics
- Computer Science, ICML
- 2015
This work develops an approach to systematically and slowly destroy structure in a data distribution through an iterative forward diffusion process, then learns a reverse diffusion process that restores structure in data, yielding a highly flexible and tractable generative model of the data.
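In later notation, the two processes this entry describes are usually written as a fixed forward chain that destroys structure and a learned reverse chain that restores it:

```latex
% Forward (fixed): Gaussian noise gradually destroys structure in x_0
q(x_{1:T} \mid x_0) = \prod_{t=1}^{T} q(x_t \mid x_{t-1}), \qquad
q(x_t \mid x_{t-1}) = \mathcal{N}\!\bigl(x_t;\ \sqrt{1-\beta_t}\, x_{t-1},\ \beta_t I\bigr)

% Reverse (learned): a parameterized chain restores structure step by step
p_\theta(x_{0:T}) = p(x_T) \prod_{t=1}^{T} p_\theta(x_{t-1} \mid x_t)
```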
Label-Efficient Semantic Segmentation with Diffusion Models
- Computer Science, ArXiv
- 2021
This paper investigates the intermediate activations from the networks that perform the Markov step of the reverse diffusion process and shows that these activations effectively capture the semantic information from an input image and appear to be excellent pixel-level representations for the segmentation problem.
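A hedged sketch of the feature-extraction recipe: noise the image to a chosen step, run the denoising U-Net once, and upsample selected intermediate activations to pixel resolution. The `return_activations` keyword below is hypothetical; real implementations typically collect activations with forward hooks:

```python
import torch
import torch.nn.functional as F

def pixel_features(unet, x0, abar_t, t, blocks=(5, 6, 7, 8)):
    """Sketch of DDPM-based pixel representations (interface is assumed):
    noise the image to step t, run the U-Net once, and concatenate selected
    decoder activations upsampled to image resolution."""
    eps = torch.randn_like(x0)
    x_t = abar_t.sqrt() * x0 + (1 - abar_t).sqrt() * eps
    acts = unet(x_t, t, return_activations=blocks)  # hypothetical kwarg
    feats = [F.interpolate(a, size=x0.shape[-2:], mode="bilinear") for a in acts]
    return torch.cat(feats, dim=1)  # per-pixel features for an MLP classifier
```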
Generative Modeling by Estimating Gradients of the Data Distribution
- Computer Science, NeurIPS
- 2019
A new generative model where samples are produced via Langevin dynamics using gradients of the data distribution estimated with score matching, which allows flexible model architectures, requires no sampling during training or the use of adversarial methods, and provides a learning objective that can be used for principled model comparisons.
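The Langevin dynamics sampler this entry refers to is short enough to state directly; `score_fn` stands for a network trained with score matching (an assumption of this sketch):

```python
import torch

def langevin_sample(score_fn, x, step_size=1e-4, n_steps=100):
    """Langevin dynamics: starting from an initial point, repeatedly move
    along the estimated score of the data distribution while injecting
    Gaussian noise, so iterates approach samples from the modeled density."""
    for _ in range(n_steps):
        z = torch.randn_like(x)
        x = x + 0.5 * step_size * score_fn(x) + (step_size ** 0.5) * z
    return x
```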
Score-Based Generative Modeling through Stochastic Differential Equations
- Computer Science, ICLR
- 2021
This work presents a stochastic differential equation (SDE) that smoothly transforms a complex data distribution to a known prior distribution by slowly injecting noise, and a corresponding reverse-time SDE that transforms the prior distribution back into the data distribution by slowly removing the noise.
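In the paper's notation, with drift f and diffusion coefficient g, the two SDEs read:

```latex
% Forward SDE: noise is injected gradually as t runs from 0 to T
\mathrm{d}\mathbf{x} = f(\mathbf{x}, t)\,\mathrm{d}t + g(t)\,\mathrm{d}\mathbf{w}

% Reverse-time SDE: denoising requires only the score \nabla_{\mathbf{x}} \log p_t(\mathbf{x})
\mathrm{d}\mathbf{x} = \bigl[f(\mathbf{x}, t) - g(t)^2\, \nabla_{\mathbf{x}} \log p_t(\mathbf{x})\bigr]\mathrm{d}t + g(t)\,\mathrm{d}\bar{\mathbf{w}}
```

Dropping the noise term from the reverse-time SDE yields the probability flow ODE used as the encoder map in the paper above.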
Diffusion Models Beat GANs on Image Synthesis
- Computer Science, NeurIPS
- 2021
It is shown that diffusion models can achieve image sample quality superior to the current state-of-the-art generative models, and that classifier guidance combines well with upsampling diffusion models, further improving FID to 3.94 on ImageNet 256 × 256 and 3.85 on ImageNet 512 × 512.
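Classifier guidance itself reduces to one line of score arithmetic: add the scaled gradient of a noise-aware classifier's log-probability to the diffusion score. A sketch, with `classifier_logp` as a hypothetical classifier trained on noisy inputs:

```python
import torch

def guided_score(score_fn, classifier_logp, x, t, y, scale=1.0):
    """Classifier guidance: steer sampling toward class y by adding the
    (scaled) gradient of log p(y | x_t) to the unconditional score."""
    x = x.detach().requires_grad_(True)
    logp = classifier_logp(x, t, y)                 # log p(y | x_t)
    grad = torch.autograd.grad(logp.sum(), x)[0]    # d log p(y | x_t) / dx
    return score_fn(x, t) + scale * grad
```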