Restoration-Degradation Beyond Linear Diffusions: A Non-Asymptotic Analysis For DDIM-Type Samplers

@article{Chen2023RestorationDegradationBL,
  title={Restoration-Degradation Beyond Linear Diffusions: A Non-Asymptotic Analysis For DDIM-Type Samplers},
  author={Sitan Chen and Giannis Daras and Alexandros G. Dimakis},
  journal={ArXiv},
  year={2023},
  volume={abs/2303.03384}
}
We develop a framework for non-asymptotic analysis of deterministic samplers used for diffusion generative modeling. Several recent works have analyzed stochastic samplers using tools like Girsanov's theorem and a chain rule variant of the interpolation argument. Unfortunately, these techniques give vacuous bounds when applied to deterministic samplers. We give a new operational interpretation for deterministic sampling by showing that one step along the probability flow ODE can be expressed as… 
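
For reference, the probability flow ODE mentioned above is usually written as follows (a standard form from the score-SDE framework, with drift $f$, diffusion coefficient $g$, and score $\nabla_x \log p_t$; the notation is ours, not quoted from the paper):

$$ \mathrm{d}x_t = \Big[ f(x_t, t) - \tfrac{1}{2}\, g(t)^2\, \nabla_x \log p_t(x_t) \Big]\, \mathrm{d}t. $$

Discretizing this ODE yields a deterministic sampler with the same marginals $p_t$ as the reverse SDE, but with no injected noise.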

Error Bounds for Flow Matching Methods

Score-based generative models are a popular class of generative modelling techniques relying on stochastic differential equations (SDEs). From their inception, it was realized that it was also possible to perform generation using ordinary differential equations (ODEs) rather than SDEs…

The probability flow ODE is provably fast

This work provides the first polynomial-time convergence guarantees for the probability flow ODE implementation (together with a corrector step) of score-based generative modeling and obtains better dimension dependence than prior works on DDPM.
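
As a concrete picture of the algorithm analyzed, here is a minimal sketch of a predictor-corrector sampler, assuming a VP-style SDE with $f(x,t) = -x/2$ and $g(t) = 1$; the `score(x, t)` interface and all names are our assumptions, not the paper's code.

```python
import torch

def predictor_corrector_sample(score, x, ts, n_corrector=1, snr=0.1):
    """Sketch: Euler steps along the probability flow ODE (predictor)
    interleaved with Langevin corrector steps. Assumes a VP-style SDE
    with f(x, t) = -x/2 and g(t) = 1."""
    for t, t_next in zip(ts[:-1], ts[1:]):
        dt = t_next - t  # negative when integrating backward in time
        # Predictor: Euler step on dx = [f(x, t) - (1/2) g(t)^2 * score] dt
        x = x + (-0.5 * x - 0.5 * score(x, t)) * dt
        # Corrector: a few Langevin steps targeting the marginal at t_next
        for _ in range(n_corrector):
            g = score(x, t_next)
            z = torch.randn_like(x)
            step = 2 * (snr * z.norm() / g.norm()) ** 2  # step-size heuristic
            x = x + step * g + (2 * step) ** 0.5 * z
    return x
```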

Diffusion Schrödinger Bridge with Applications to Score-Based Generative Modeling

Diffusion Schrödinger Bridge (DSB), a novel approximation of the Iterative Proportional Fitting (IPF) procedure for solving the Schrödinger bridge (SB) problem, is presented, together with theoretical analysis and generative modeling experiments.
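
For readers unfamiliar with IPF: it alternates Kullback-Leibler projections onto the two marginal constraints of the SB problem. In a standard formulation (notation assumed),

$$ \pi^{2n+1} = \arg\min_{\pi} \big\{ \mathrm{KL}(\pi \,\|\, \pi^{2n}) : \pi_T = p_{\mathrm{prior}} \big\}, \qquad \pi^{2n+2} = \arg\min_{\pi} \big\{ \mathrm{KL}(\pi \,\|\, \pi^{2n+1}) : \pi_0 = p_{\mathrm{data}} \big\}, $$

starting from a reference path measure $\pi^0$ given by the forward noising diffusion.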

Convergence of denoising diffusion models under the manifold hypothesis

This paper provides the first convergence results for diffusion models in this more general setting by providing quantitative bounds on the Wasserstein distance of order one between the target data distribution and the generative distribution of the diffusion model.
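
For reference, the Wasserstein distance of order one between the data distribution $p$ and the model distribution $q$ is

$$ W_1(p, q) = \inf_{\gamma \in \Gamma(p, q)} \int \|x - y\| \, \mathrm{d}\gamma(x, y), $$

where $\Gamma(p, q)$ denotes the set of couplings of $p$ and $q$. Unlike KL or total variation, this metric remains finite and meaningful when the data lie on a lower-dimensional manifold.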

Generative Modeling with Denoising Auto-Encoders and Langevin Sampling

It is shown that both denoising auto-encoders (DAE) and denoising score matching (DSM) provide estimates of the score of the Gaussian-smoothed population density, allowing the machinery of empirical processes to apply to the homotopy method of arXiv:1907.05600.
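
The DSM objective in question, for Gaussian smoothing at noise level $\sigma$ (a standard statement of Vincent's identity, not a quotation from this paper), is

$$ \mathbb{E}_{x \sim p,\; \tilde{x} \sim \mathcal{N}(x, \sigma^2 I)} \left\| s_\theta(\tilde{x}) + \frac{\tilde{x} - x}{\sigma^2} \right\|^2, $$

whose population minimizer is exactly the score $\nabla \log p_\sigma$ of the smoothed density $p_\sigma = p * \mathcal{N}(0, \sigma^2 I)$.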

gDDIM: Generalized denoising diffusion implicit models

This work examines the mechanism of DDIM from a numerical perspective and discovers that DDIM can be obtained by using specific approximations of the score when solving the corresponding stochastic differential equation.
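
Concretely, the score is tied to the learned noise predictor by $\nabla_x \log p_t(x) \approx -\epsilon_\theta(x, t) / \sigma_t$; treating $\epsilon_\theta$ as constant over each integration step then recovers the DDIM update. In symbols (our paraphrase, notation assumed):

$$ \epsilon_\theta(x_\tau, \tau) \approx \epsilon_\theta(x_t, t) \quad \text{for } \tau \in [t - \Delta t,\, t]. $$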

Score-Based Generative Modeling through Stochastic Differential Equations

This work presents a stochastic differential equation (SDE) that smoothly transforms a complex data distribution into a known prior distribution by slowly injecting noise, and a corresponding reverse-time SDE that transforms the prior distribution back into the data distribution by slowly removing the noise.
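
In symbols, the forward and reverse-time SDEs are (standard notation for this framework):

$$ \mathrm{d}x = f(x, t)\, \mathrm{d}t + g(t)\, \mathrm{d}w, \qquad \mathrm{d}x = \big[ f(x, t) - g(t)^2\, \nabla_x \log p_t(x) \big]\, \mathrm{d}t + g(t)\, \mathrm{d}\bar{w}, $$

where $\bar{w}$ is a reverse-time Brownian motion and the score $\nabla_x \log p_t$ is the only unknown, estimated by a neural network.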

Denoising Diffusion Implicit Models

Denoising diffusion implicit models (DDIMs) are presented, a more efficient class of iterative implicit probabilistic models with the same training procedure as DDPMs that can produce high-quality samples faster and perform semantically meaningful image interpolation directly in the latent space.
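
The deterministic DDIM update (the $\eta = 0$ case, written with the cumulative schedule $\bar{\alpha}_t$; a standard statement, not quoted from the paper) is

$$ x_{t-1} = \sqrt{\bar{\alpha}_{t-1}} \left( \frac{x_t - \sqrt{1 - \bar{\alpha}_t}\, \epsilon_\theta(x_t, t)}{\sqrt{\bar{\alpha}_t}} \right) + \sqrt{1 - \bar{\alpha}_{t-1}}\, \epsilon_\theta(x_t, t): $$

predict the clean sample from the current noise estimate, then re-degrade it to the next noise level.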

Cold Diffusion: Inverting Arbitrary Image Transforms Without Noise

It is observed that the generative behavior of diffusion models is not strongly dependent on the choice of image degradation, and in fact an entire family of generative models can be constructed by varying this choice.
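
A minimal sketch of the generalized sampling loop this enables, with restoration operator `R` and degradation operator `D` as assumed interfaces (the names are illustrative, not the paper's API):

```python
def cold_diffusion_sample(R, D, x_T, T):
    """Sketch of Cold Diffusion-style sampling with an arbitrary
    deterministic degradation.
      R(x, s)  -- learned restoration: degraded x at severity s -> clean estimate
      D(x0, s) -- deterministic degradation (blur, masking, ...) to severity s,
                  with D(x0, 0) = x0
    """
    x = x_T
    for s in range(T, 0, -1):
        x0_hat = R(x, s)  # restore an estimate of the clean image
        # Step the degradation down one severity level; the paired D terms
        # cancel restoration error to first order.
        x = x - D(x0_hat, s) + D(x0_hat, s - 1)
    return x
```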

Sampling is as easy as learning the score: theory for diffusion models with minimal data assumptions

This work provides theoretical convergence guarantees for score-based generative models (SGMs) such as denoising diffusion probabilistic models (DDPMs), and provides evidence that the use of the critically-damped Langevin diffusion (CLD) does not reduce the complexity of SGMs.
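
The "minimal" assumption behind such guarantees is typically an $L^2$ score-estimation bound (a standard statement, with notation assumed):

$$ \mathbb{E}_{x \sim q_t} \big\| s_\theta(x, t) - \nabla \log q_t(x) \big\|^2 \le \varepsilon^2 \quad \text{for all } t, $$

together with mild moment conditions on the data distribution, rather than log-concavity or smoothness of the data itself.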

Improved Denoising Diffusion Probabilistic Models

Denoising diffusion probabilistic models are a class of generative models which have recently been shown to produce excellent samples; it is found that learning the variances of the reverse diffusion process allows sampling with an order of magnitude fewer forward passes at a negligible cost in sample quality.
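
The variance is parameterized as a per-dimension interpolation, in log space, between the two analytically natural extremes ($v$ is the network output):

$$ \Sigma_\theta(x_t, t) = \exp\big( v \log \beta_t + (1 - v) \log \tilde{\beta}_t \big), \qquad \tilde{\beta}_t = \frac{1 - \bar{\alpha}_{t-1}}{1 - \bar{\alpha}_t}\, \beta_t. $$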

Progressive Distillation for Fast Sampling of Diffusion Models

This work presents a method to distill a trained deterministic diffusion sampler that uses many steps into a new diffusion model that takes half as many sampling steps, and shows that the full progressive distillation procedure takes no more time than training the original model.
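
A minimal sketch of one distillation training step, assuming a helper `ddim_step(model, x, t_from, t_to)` that applies one deterministic sampler update (the names and the plain MSE loss are our simplifications, not the paper's exact objective):

```python
import torch

def progressive_distillation_loss(teacher, student, ddim_step, x_t, t):
    """Sketch: two deterministic teacher steps define the target that
    the student must reach in a single step."""
    with torch.no_grad():
        x_mid = ddim_step(teacher, x_t, t, t - 0.5)             # teacher: t -> t - 1/2
        x_target = ddim_step(teacher, x_mid, t - 0.5, t - 1.0)  # teacher: t - 1/2 -> t - 1
    x_student = ddim_step(student, x_t, t, t - 1.0)             # student: t -> t - 1 directly
    return torch.mean((x_student - x_target) ** 2)              # simplified MSE surrogate
```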