Corpus ID: 235294278

Diffusion Schrödinger Bridge with Applications to Score-Based Generative Modeling

@inproceedings{Bortoli2021DiffusionSB,
  title={Diffusion Schr{\"o}dinger Bridge with Applications to Score-Based Generative Modeling},
  author={Valentin De Bortoli and James Thornton and Jeremy Heng and A. Doucet},
  booktitle={Neural Information Processing Systems},
  year={2021}
}
The supplementary is organized as follows. We define our notation in Section S2. In Section S3, we prove Theorem 1 and draw links between our approach to score-based generative modeling (SGM) and existing works. We recall the classical formulation of Iterative Proportional Fitting (IPF), prove Proposition 2 and draw links with autoencoders in Section S4. In Section S5 we present alternative variational formulas for Algorithm 1 and prove Proposition 3. We gather the proofs of our theoretical study of Schrödinger bridges (Proposition 4 and Proposition 5) in…
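
For readers unfamiliar with the terminology, the Schrödinger bridge problem and the IPF recursion mentioned above can be written as follows. This is the standard formulation, stated here from general knowledge rather than quoted from the supplementary; p_data and p_prior denote the data and prior marginals, and π_ref is the reference path measure induced by the forward noising diffusion:

\[
\pi^\star = \arg\min_{\pi} \big\{ \mathrm{KL}(\pi \,\|\, \pi_{\mathrm{ref}}) \;:\; \pi_0 = p_{\mathrm{data}},\ \pi_T = p_{\mathrm{prior}} \big\},
\]
and, starting from \(\pi^0 = \pi_{\mathrm{ref}}\), IPF alternates the half-bridge projections
\[
\pi^{2n+1} = \arg\min_{\pi} \big\{ \mathrm{KL}(\pi \,\|\, \pi^{2n}) \;:\; \pi_T = p_{\mathrm{prior}} \big\}, \qquad
\pi^{2n+2} = \arg\min_{\pi} \big\{ \mathrm{KL}(\pi \,\|\, \pi^{2n+1}) \;:\; \pi_0 = p_{\mathrm{data}} \big\}.
\]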

Conditional Simulation Using Diffusion Schrödinger Bridges (Supplementary Material)

The supplementary is organized as follows. We recall the DSB algorithm for unconditional simulation from De Bortoli et al. [2021] in Appendix B. The proofs of our propositions are given in Appendix

Maximum Likelihood Training of Implicit Nonlinear Diffusion Models

A data-adaptive, nonlinear diffusion process for score-based diffusion models that brings the training of INDM close to Maximum Likelihood Estimation (MLE), in contrast to the non-MLE training of DDPM++.

Neural Lagrangian Schrödinger Bridge

Population dynamics is the study of temporal and spatial variation in the size of populations of organisms and is a major part of population ecology. One of the main difficulties in analyzing

Shooting Schrödinger’s Cat

This work presents a variational inference scheme to learn a model that solves the Schrödinger Bridge Problem, and shows that the model can learn the transformation between the Gaussian distribution and arbitrary data, as well as dynamics that follow a potential function.

Recovering Stochastic Dynamics via Gaussian Schrödinger Bridges

We propose a new framework to reconstruct a stochastic process {P_t : t ∈ [0, T]} using only samples from its marginal distributions, observed at start and end times 0 and T. This reconstruction is

Likelihood Training of Schrödinger Bridge using Forward-Backward SDEs Theory

This work presents a novel computational framework for likelihood training of SB models, grounded in Forward-Backward Stochastic Differential Equations (FBSDE) theory, a mathematical methodology from stochastic optimal control that transforms the optimality condition of SB into a set of SDEs.
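
As a rough sketch of the "set of SDEs" referred to here (standard dynamic Schrödinger bridge notation, not taken verbatim from the paper; f and g are the drift and diffusion of the reference SDE dX_t = f(X_t, t) dt + g(t) dW_t):

\[
\mathrm{d}X_t = \big[f(X_t, t) + g(t)^2 \nabla_x \log \Psi(X_t, t)\big]\,\mathrm{d}t + g(t)\,\mathrm{d}W_t, \qquad X_0 \sim p_{\mathrm{data}},
\]
\[
\mathrm{d}X_t = \big[f(X_t, t) - g(t)^2 \nabla_x \log \widehat{\Psi}(X_t, t)\big]\,\mathrm{d}t + g(t)\,\mathrm{d}\bar{W}_t, \qquad X_T \sim p_{\mathrm{prior}},
\]
where the second SDE is integrated backward in time and the potentials satisfy the boundary coupling \(\Psi(x,0)\,\widehat{\Psi}(x,0) = p_{\mathrm{data}}(x)\) and \(\Psi(x,T)\,\widehat{\Psi}(x,T) = p_{\mathrm{prior}}(x)\).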

Convergence of denoising diffusion models under the manifold hypothesis

This paper provides the first convergence results for diffusion models in this setting by providing quantitative bounds on the Wasserstein distance of order one between the target data distribution and the generative distribution of the diffusion model.

Conditional Simulation Using Diffusion Schrödinger Bridges

Denoising diffusion models have recently emerged as a powerful class of generative models. They provide state-of-the-art results, not only for unconditional simulation, but also when used to solve

Riemannian Score-Based Generative Modeling

Riemannian Score-based Generative Models (RSGMs) are introduced, a class of generative models extending SGMs to compact Riemannian manifolds; the approach is demonstrated on a variety of manifolds, in particular on spherical data from earth and climate science.
...

References

SHOWING 1-10 OF 99 REFERENCES

Score-Based Generative Modeling through Stochastic Differential Equations

This work presents a stochastic differential equation (SDE) that smoothly transforms a complex data distribution to a known prior distribution by slowly injecting noise, and a corresponding reverse-time SDE that transforms the prior distribution back into the data distribution by slowly removing the noise.
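
Concretely, the forward noising SDE and its reverse-time counterpart take the following form (standard notation from that line of work, stated from general knowledge):

\[
\mathrm{d}x = f(x, t)\,\mathrm{d}t + g(t)\,\mathrm{d}w, \qquad
\mathrm{d}x = \big[f(x, t) - g(t)^2 \nabla_x \log p_t(x)\big]\,\mathrm{d}t + g(t)\,\mathrm{d}\bar{w},
\]
where p_t is the marginal density at time t, the second SDE is integrated backward in time, and \(\nabla_x \log p_t\) is approximated by a learned score network \(s_\theta(x, t)\).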

Denoising Diffusion Probabilistic Models

High quality image synthesis results are presented using diffusion probabilistic models, a class of latent variable models inspired by considerations from nonequilibrium thermodynamics, which naturally admit a progressive lossy decompression scheme that can be interpreted as a generalization of autoregressive decoding.
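
The training objective typically used for such models is the simplified epsilon-prediction loss. Below is a minimal PyTorch-style sketch, assuming a user-supplied noise-prediction network eps_model(x_t, t); the schedule values and names are illustrative, not taken from the paper.

    # Minimal sketch of the simplified DDPM training objective (epsilon prediction).
    # eps_model(x_t, t) is an assumed user-supplied network predicting the injected noise.
    import torch

    T = 1000
    betas = torch.linspace(1e-4, 0.02, T)            # illustrative linear noise schedule
    alphas_bar = torch.cumprod(1.0 - betas, dim=0)   # cumulative products of (1 - beta_t)

    def ddpm_loss(eps_model, x0):
        t = torch.randint(0, T, (x0.shape[0],), device=x0.device)     # random timesteps
        eps = torch.randn_like(x0)                                     # Gaussian noise
        a_bar = alphas_bar.to(x0.device)[t].view(-1, *([1] * (x0.dim() - 1)))
        x_t = a_bar.sqrt() * x0 + (1.0 - a_bar).sqrt() * eps           # sample q(x_t | x_0)
        return torch.mean((eps - eps_model(x_t, t)) ** 2)              # simple MSE loss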

Improved Techniques for Training Score-Based Generative Models

This work provides a new theoretical analysis of learning and sampling from score models in high dimensional spaces, explaining existing failure modes and motivating new solutions that generalize across datasets.

Generative Modeling by Estimating Gradients of the Data Distribution

A new generative model in which samples are produced via Langevin dynamics using gradients of the data distribution estimated with score matching; the approach allows flexible model architectures, requires neither sampling during training nor adversarial methods, and provides a learning objective that can be used for principled model comparisons.
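
A minimal sketch of the Langevin-dynamics sampler this describes, assuming a learned score function score_fn(x) approximating grad_x log p(x); the step size and iteration count are illustrative, and the paper itself anneals the procedure over decreasing noise levels.

    # Minimal sketch of (unadjusted) Langevin dynamics with a learned score function.
    # score_fn(x) is assumed to approximate grad_x log p(x).
    import torch

    def langevin_sample(score_fn, x_init, step_size=1e-3, n_steps=1000):
        x = x_init.clone()
        for _ in range(n_steps):
            noise = torch.randn_like(x)
            x = x + 0.5 * step_size * score_fn(x) + (step_size ** 0.5) * noise
        return x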

Statistics of Random Processes

I-Divergence Geometry of Probability Distributions and Minimization Problems

Analysis and Geometry of Markov Diffusion Operators

Table of contents (excerpt): Introduction. Part I, Markov semigroups, basics and examples: 1. Markov semigroups; 2. Model examples; 3. General setting. Part II, Three model functional inequalities: 4. Poincaré inequalities; …

The Eigenvalues of Mega-dimensional Matrices

Often, we need to know some integral property of the eigenvalues {x} of a large N × N symmetric matrix A. For example, determinants det(A) = exp(∑ log(x)) play a role in the classic maximum entropy
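
The identity quoted here, det(A) = exp(∑ log x) over the eigenvalues x of A, is easy to check numerically for a symmetric positive-definite matrix; a small NumPy illustration follows (matrix size arbitrary).

    # Check that log det(A) equals the sum of log-eigenvalues for a symmetric PD matrix.
    import numpy as np

    n = 200
    B = np.random.randn(n, n)
    A = B @ B.T + n * np.eye(n)             # symmetric positive definite
    log_eig_sum = np.sum(np.log(np.linalg.eigvalsh(A)))
    sign, logdet = np.linalg.slogdet(A)     # numerically stable log-determinant
    assert sign > 0 and np.isclose(log_eig_sum, logdet)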

Neural Ordinary Differential Equations

This work shows how to scalably backpropagate through any ODE solver, without access to its internal operations, which allows ODE-based components to be trained end-to-end within larger models.
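
The phrase "without access to its internal operations" refers to the adjoint sensitivity method: gradients are obtained by solving a second ODE backward in time rather than by differentiating through the solver's steps. In that paper's notation (stated from general knowledge), with state z(t) following dz/dt = f(z(t), t, θ) and a scalar loss L(z(t_1)):

\[
a(t) = \frac{\partial L}{\partial z(t)}, \qquad
\frac{\mathrm{d}a(t)}{\mathrm{d}t} = -\,a(t)^{\top} \frac{\partial f(z(t), t, \theta)}{\partial z}, \qquad
\frac{\mathrm{d}L}{\mathrm{d}\theta} = -\int_{t_1}^{t_0} a(t)^{\top} \frac{\partial f(z(t), t, \theta)}{\partial \theta}\,\mathrm{d}t.
\]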

Adam: A Method for Stochastic Optimization

This work introduces Adam, an algorithm for first-order gradient-based optimization of stochastic objective functions, based on adaptive estimates of lower-order moments, and provides a regret bound on the convergence rate that is comparable to the best known results under the online convex optimization framework.
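
For reference, the "adaptive estimates of lower-order moments" are the standard Adam updates, where g_t is the gradient at step t and α, β_1, β_2, ε are hyperparameters:

\[
m_t = \beta_1 m_{t-1} + (1-\beta_1)\, g_t, \qquad
v_t = \beta_2 v_{t-1} + (1-\beta_2)\, g_t^2,
\]
\[
\hat{m}_t = \frac{m_t}{1-\beta_1^{t}}, \qquad
\hat{v}_t = \frac{v_t}{1-\beta_2^{t}}, \qquad
\theta_t = \theta_{t-1} - \alpha\,\frac{\hat{m}_t}{\sqrt{\hat{v}_t} + \epsilon}.
\]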
...