Convergence for score-based generative modeling with polynomial complexity
@article{Lee2022ConvergenceFS,
  title   = {Convergence for score-based generative modeling with polynomial complexity},
  author  = {Holden Lee and Jianfeng Lu and Yixin Tan},
  journal = {ArXiv},
  year    = {2022},
  volume  = {abs/2206.06227}
}
Score-based generative modeling (SGM) is a highly successful approach for learning a probability distribution from data and generating further samples. We prove the first polynomial convergence guarantees for the core mechanic behind SGM: drawing samples from a probability density p given a score estimate (an estimate of ∇ ln p) that is accurate in L²(p). Compared to previous works, we do not incur error that grows exponentially in time or that suffers from a curse of dimensionality. Our…
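The sampling primitive in question can be pictured as unadjusted Langevin dynamics driven by a plug-in score estimate. Below is a minimal illustrative sketch, assuming a hypothetical `score_est` in place of a learned network; it is not the paper's exact algorithm, whose analysis also accounts for discretization and initialization error.

```python
import numpy as np

def langevin_with_score_estimate(score_est, x0, step=1e-3, n_steps=10_000, rng=None):
    """Unadjusted Langevin dynamics driven by an estimated score.

    `score_est(x)` approximates grad log p(x); if the estimate is accurate
    in L^2(p) and p is sufficiently regular, the iterates approach samples
    from (a small perturbation of) p.
    """
    rng = np.random.default_rng(rng)
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x + step * score_est(x) + np.sqrt(2 * step) * noise
    return x

# Toy usage: a slightly corrupted score of N(0, I) still lands near the target.
noisy_score = lambda x: -x + 0.01 * np.sin(x)   # hypothetical L^2-accurate estimate
print(langevin_with_score_estimate(noisy_score, x0=np.zeros(2), rng=0))
```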
16 Citations
Convergence of score-based generative modeling for general data distributions
- Computer Science, Mathematics · ALT
- 2023
This work considers a popular kind of SGM, denoising diffusion models, and gives polynomial convergence guarantees for general data distributions, with no assumptions related to functional inequalities or smoothness.
Sampling is as easy as learning the score: theory for diffusion models with minimal data assumptions
- Computer Science · ArXiv
- 2022
It is shown that score-based generative models such as denoising diffusion probabilistic models (DDPMs) can efficiently sample from essentially any realistic data distribution, with theoretical convergence guarantees that hold for an L²-accurate score estimate.
Convergence in KL Divergence of the Inexact Langevin Algorithm with Application to Score-based Generative Models
- Computer Science, Mathematics · ArXiv
- 2022
Motivated by score-based generative modeling (SGM), this work studies the Inexact Langevin Algorithm, which samples using an estimated score function when the target distribution satisfies a log-Sobolev inequality (LSI), and proves long-term convergence in Kullback-Leibler divergence.
Improved Analysis of Score-based Generative Modeling: User-Friendly Bounds under Minimal Smoothness Assumptions
- Computer Science, Mathematics · ArXiv
- 2022
Under an L²-accurate score estimate, convergence guarantees with polynomial complexity are provided for any data distribution with finite second moment, by either employing an early-stopping technique or assuming a smoothness condition on the score function of the data distribution.
Statistical Efficiency of Score Matching: The View from Isoperimetry
- Computer Science, Mathematics · ArXiv
- 2022
This paper shows that the score matching estimator is statistically comparable to maximum likelihood when the distribution has a small isoperimetric constant, and shows a direct parallel in the discrete setting, connecting the statistical properties of pseudolikelihood estimation with approximate tensorization of entropy and the Glauber dynamics.
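For reference, the score matching estimator in question minimizes Hyvärinen's objective, which by integration by parts agrees, up to an additive constant independent of θ, with the squared L²(p) error of the model score:

$$
J_{\text{SM}}(\theta) \;=\; \mathbb{E}_{x\sim p}\!\left[\operatorname{tr}\nabla_x s_\theta(x) + \tfrac{1}{2}\lVert s_\theta(x)\rVert^{2}\right] \;=\; \tfrac{1}{2}\,\mathbb{E}_{x\sim p}\!\left[\lVert s_\theta(x) - \nabla \ln p(x)\rVert^{2}\right] + \text{const}.
$$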
Proposal of a Score Based Approach to Sampling Using Monte Carlo Estimation of Score and Oracle Access to Target Density
- Computer Science · ArXiv
- 2022
This work considers the setting with no initial samples from the target density, but rather zeroth- and first-order oracle access to the log-likelihood, and proposes a Monte Carlo method that estimates the score empirically as a particular expectation of a random variable.
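One natural instantiation of such an estimator (a sketch under assumptions, not necessarily the paper's construction) targets the score of the Gaussian-smoothed density p_σ = p * N(0, σ²I) via the identity ∇ ln p_σ(x) = E_z[p(x−z) ∇ ln p(x−z)] / E_z[p(x−z)] with z ∼ N(0, σ²I), estimated by self-normalized importance sampling against the two oracles:

```python
import numpy as np

def smoothed_score(x, log_p, grad_log_p, sigma, n_samples=1000, rng=None):
    """Monte Carlo estimate of the score of the Gaussian-smoothed density.

    Uses the identity
        grad log p_sigma(x) = E_z[p(x - z) grad log p(x - z)] / E_z[p(x - z)],
    with z ~ N(0, sigma^2 I), evaluated by self-normalized importance
    sampling with shifted log-weights for numerical stability.
    """
    rng = np.random.default_rng(rng)
    z = sigma * rng.standard_normal((n_samples, x.shape[0]))
    y = x - z                                   # points where the oracles are queried
    log_w = np.array([log_p(yi) for yi in y])   # zeroth-order oracle calls
    log_w -= log_w.max()                        # stabilize the exponentials
    w = np.exp(log_w)
    g = np.array([grad_log_p(yi) for yi in y])  # first-order oracle calls
    return (w[:, None] * g).sum(axis=0) / w.sum()

# Toy check against a standard Gaussian target, where the smoothed score
# is available in closed form: grad log p_sigma(x) = -x / (1 + sigma^2).
d = 3
x = np.ones(d)
est = smoothed_score(x, lambda v: -0.5 * v @ v, lambda v: -v, sigma=0.5,
                     n_samples=20_000, rng=0)
print(est, -x / (1 + 0.5**2))
```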
Convergence of denoising diffusion models under the manifold hypothesis
- Mathematics, Computer Science · ArXiv
- 2022
This paper gives the first convergence results for diffusion models under the manifold hypothesis, providing quantitative bounds on the Wasserstein distance of order one between the target data distribution and the generative distribution of the diffusion model.
How to Trust Your Diffusion Model: A Convex Optimization Approach to Conformal Risk Control
- Computer Science · ArXiv
- 2023
This work focuses on image-to-image regression tasks and presents a generalization of the Risk-Controlling Prediction Sets procedure that provides entrywise calibrated intervals for future samples of any diffusion model, and controls a certain notion of risk with respect to a ground-truth image with minimal mean interval length.
Restoration-Degradation Beyond Linear Diffusions: A Non-Asymptotic Analysis For DDIM-Type Samplers
- Mathematics · ArXiv
- 2023
We develop a framework for non-asymptotic analysis of deterministic samplers used for diffusion generative modeling. Several recent works have analyzed stochastic samplers using tools like Girsanov's…
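For context, the deterministic DDIM update that such samplers build on can be written in the usual ᾱ-notation with noise predictor ε_θ (this is the standard sampler, not necessarily the exact family analyzed here):

$$
x_{t-1} \;=\; \sqrt{\bar\alpha_{t-1}}\;\frac{x_t - \sqrt{1-\bar\alpha_t}\,\epsilon_\theta(x_t,t)}{\sqrt{\bar\alpha_t}} \;+\; \sqrt{1-\bar\alpha_{t-1}}\,\epsilon_\theta(x_t,t);
$$

each step forms a point estimate of $x_0$ and deterministically re-noises it to the next level.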
Boundary Guided Mixing Trajectory for Semantic Control with Diffusion Models
- Computer Science · ArXiv
- 2023
This work achieves state-of-the-art semantic control performance across various application settings by optimizing the denoising trajectory solely with frozen DDMs, building on a more comprehensive understanding of the intermediate high-dimensional latent spaces gained by theoretically and empirically analyzing their probabilistic and geometric behavior along the Markov chain.
References
Showing 1-10 of 36 references
Score-Based Generative Modeling with Critically-Damped Langevin Diffusion
- Computer Science · ArXiv
- 2021
A novel critically-damped Langevin diffusion (CLD) is proposed; CLD outperforms previous SGMs in synthesis quality for similar network architectures and sampling compute budgets, and a sampler tailored to CLD significantly outperforms generic solvers such as Euler–Maruyama.
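CLD runs the forward diffusion in an extended space of data and auxiliary velocity variables. For orientation only (this is the classical dynamic underlying CLD, not the paper's exact forward SDE), underdamped Langevin dynamics with friction γ targeting a density p reads

$$
\mathrm{d}x_t = v_t\,\mathrm{d}t, \qquad
\mathrm{d}v_t = \nabla \ln p(x_t)\,\mathrm{d}t - \gamma v_t\,\mathrm{d}t + \sqrt{2\gamma}\,\mathrm{d}B_t,
$$

and critical damping refers to choosing γ so that, for a quadratic potential, the linear dynamics contract as fast as possible without oscillating.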
Generative Modeling by Estimating Gradients of the Data Distribution
- Computer Science · NeurIPS
- 2019
A new generative model is introduced in which samples are produced via Langevin dynamics using gradients of the data distribution estimated with score matching; it allows flexible model architectures, requires no sampling during training or use of adversarial methods, and provides a learning objective that can be used for principled model comparisons.
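The sampler described is annealed Langevin dynamics: Langevin updates run at a decreasing sequence of noise levels with the score model conditioned on the current level, and step sizes scaled so the update's signal-to-noise ratio stays roughly level-independent. A minimal sketch, with a toy closed-form score standing in for the learned network:

```python
import numpy as np

def annealed_langevin(score, sigmas, x0, eps=2e-5, steps_per_level=100, rng=None):
    """Annealed Langevin dynamics in the style of Song & Ermon (2019).

    `score(x, sigma)` estimates the score of the data distribution smoothed
    at noise level `sigma`; `sigmas` is a decreasing sequence of levels.
    """
    rng = np.random.default_rng(rng)
    x = np.array(x0, dtype=float)
    for sigma in sigmas:
        alpha = eps * sigma**2 / sigmas[-1]**2   # step size for this level
        for _ in range(steps_per_level):
            z = rng.standard_normal(x.shape)
            x = x + 0.5 * alpha * score(x, sigma) + np.sqrt(alpha) * z
    return x

# Toy usage with the exact smoothed score of a standard Gaussian target,
# where score(x, sigma) = -x / (1 + sigma**2).
sigmas = np.geomspace(10.0, 0.01, 10)
sample = annealed_langevin(lambda x, s: -x / (1 + s**2), sigmas,
                           x0=np.zeros(2), rng=0)
print(sample)
```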
Score-Based Generative Modeling through Stochastic Differential Equations
- Computer Science · ICLR
- 2021
This work presents a stochastic differential equation (SDE) that smoothly transforms a complex data distribution into a known prior distribution by slowly injecting noise, and a corresponding reverse-time SDE that transforms the prior distribution back into the data distribution by slowly removing the noise.
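Concretely, the construction pairs a forward noising SDE with its time reversal (Anderson, 1982), which requires exactly the score ∇_x ln p_t of the noised marginal p_t:

$$
\mathrm{d}x = f(x,t)\,\mathrm{d}t + g(t)\,\mathrm{d}w
\qquad\text{and}\qquad
\mathrm{d}x = \bigl[f(x,t) - g(t)^{2}\,\nabla_x \ln p_t(x)\bigr]\mathrm{d}t + g(t)\,\mathrm{d}\bar w,
$$

where $\bar w$ is a reverse-time Brownian motion.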
Generative Modeling with Denoising Auto-Encoders and Langevin Sampling
- Computer Science · ArXiv
- 2020
It is shown that both denoising auto-encoders (DAE) and denoising score matching (DSM) provide estimates of the score of the Gaussian-smoothed population density, allowing the machinery of empirical processes to apply to the homotopy method of arXiv:1907.05600.
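The identity at work here is standard (Vincent, 2011): for a noise level σ, the population minimizer of the denoising score matching objective

$$
\mathbb{E}_{x\sim p,\;\tilde x\sim\mathcal N(x,\sigma^2 I)}\left[\Bigl\lVert s_\theta(\tilde x) + \frac{\tilde x - x}{\sigma^2}\Bigr\rVert^{2}\right]
$$

is $s_\theta(\tilde x) = \nabla \ln p_\sigma(\tilde x)$, the score of the Gaussian-smoothed density $p_\sigma = p * \mathcal N(0, \sigma^2 I)$.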
Theoretical guarantees for sampling and inference in generative models with latent diffusions
- Computer Science · COLT
- 2019
It is shown that one can efficiently sample from a wide class of terminal target distributions by choosing the drift of the latent diffusion from the class of multilayer feedforward neural nets, with the accuracy of sampling measured by the Kullback-Leibler divergence to the target distribution.
Diffusion Schrödinger Bridge with Applications to Score-Based Generative Modeling
- Computer Science · NeurIPS
- 2021
The supplementary material presents details on likelihood computation for generative models obtained with Schrödinger bridges and shows that the convergence rate of Iterative Proportional Fitting (IPF) is geometric in this case.
Generative Adversarial Nets
- Computer Science · NIPS
- 2014
We propose a new framework for estimating generative models via an adversarial process, in which we simultaneously train two models: a generative model G that captures the data distribution, and a…
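The adversarial process referred to is the two-player minimax game

$$
\min_G \max_D \;\; \mathbb{E}_{x\sim p_{\text{data}}}\bigl[\ln D(x)\bigr] \;+\; \mathbb{E}_{z\sim p_z}\bigl[\ln\bigl(1 - D(G(z))\bigr)\bigr],
$$

in which the discriminator D estimates the probability that its input came from the data rather than from the generator G.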
Improved Techniques for Training Score-Based Generative Models
- Computer Science · NeurIPS
- 2020
This work provides a new theoretical analysis of learning and sampling from score models in high-dimensional spaces, explaining existing failure modes and motivating new solutions that generalize across datasets.
Sliced Score Matching: A Scalable Approach to Density and Score Estimation
- Computer Science · UAI
- 2019
It is demonstrated that sliced score matching can learn deep energy-based models effectively, and can produce accurate score estimates for applications such as variational inference with implicit distributions and training Wasserstein Auto-Encoders.
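For reference, sliced score matching replaces the Jacobian trace in Hyvärinen's objective with random projection directions $v$ (e.g., standard Gaussian or Rademacher), so only Jacobian-vector products are needed:

$$
J_{\text{SSM}}(\theta) \;=\; \mathbb{E}_{v}\,\mathbb{E}_{x\sim p}\!\left[v^{\top}\nabla_x s_\theta(x)\,v \;+\; \tfrac{1}{2}\bigl(v^{\top} s_\theta(x)\bigr)^{2}\right].
$$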
Subspace Diffusion Generative Models
- Computer Science · ECCV
- 2022
This framework restricts the diffusion via projections onto subspaces as the data distribution evolves toward noise, which improves sample quality and reduces the computational cost of inference for the same number of denoising steps.