Non-asymptotic convergence analysis for the Unadjusted Langevin Algorithm

@article{durmus-moulines-ula,
  title={Non-asymptotic convergence analysis for the Unadjusted Langevin Algorithm},
  author={Alain Durmus and {\'E}ric Moulines},
  journal={arXiv: Statistics Theory}
}
In this paper, we study a method to sample from a target distribution $\pi$ over $\mathbb{R}^d$ having a positive density with respect to the Lebesgue measure, known up to a normalisation factor. This method is based on the Euler discretization of the overdamped Langevin stochastic differential equation associated with $\pi$. For both constant and decreasing step sizes in the Euler discretization, we obtain non-asymptotic bounds for the convergence to the target distribution $\pi$ in total… 
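The method described in the abstract admits a short sketch. This is an illustrative implementation, not the paper's code: the step size $\gamma$, the Gaussian target ($U(x) = |x|^2/2$, so $\nabla U(x) = x$), and all numerical choices below are assumptions made for the example.

```python
# Unadjusted Langevin Algorithm (ULA): Euler discretization of the
# overdamped Langevin SDE  dX_t = -grad U(X_t) dt + sqrt(2) dB_t,
# whose invariant distribution is pi(x) proportional to exp(-U(x)).
import numpy as np

def ula(grad_U, x0, gamma, n_steps, rng):
    """Run ULA with constant step size gamma; return all iterates."""
    x = np.array(x0, dtype=float)
    samples = np.empty((n_steps, x.size))
    for k in range(n_steps):
        noise = rng.standard_normal(x.size)
        # one Euler step: drift -gamma * grad U(x) plus Gaussian noise
        x = x - gamma * grad_U(x) + np.sqrt(2.0 * gamma) * noise
        samples[k] = x
    return samples

rng = np.random.default_rng(0)
# Standard Gaussian target: grad U(x) = x.
samples = ula(lambda x: x, x0=[5.0], gamma=0.05, n_steps=20000, rng=rng)
# After burn-in, the empirical mean and variance approach 0 and 1,
# up to a discretization bias controlled by gamma (the object of the
# non-asymptotic bounds studied in the paper).
print(samples[5000:].mean(), samples[5000:].var())
```

Because the chain is never Metropolis-adjusted, its invariant law differs from $\pi$ by a bias that shrinks with $\gamma$, which is why both constant and decreasing step sizes are analysed.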

Tables from this paper

Improved bounds for discretization of Langevin diffusions: Near-optimal rates without convexity
An improved analysis of the Euler-Maruyama discretization of the Langevin diffusion that does not require global contractivity, yields polynomial dependence on the time horizon, and simultaneously improves on methods based on Dalalyan's approach.
Mean-Square Analysis with An Application to Optimal Dimension Dependence of Langevin Monte Carlo
This work provides a general framework for the non-asymptotic analysis of sampling error in the 2-Wasserstein distance, which also yields a mixing-time bound that is optimal in both the dimension $d$ and the accuracy tolerance for target measures satisfying the aforementioned assumptions.
Analysis of Langevin Monte Carlo via Convex Optimization
It is shown that the Unadjusted Langevin Algorithm can be formulated as a first order optimization algorithm of an objective functional defined on the Wasserstein space of order $2$ and a non-asymptotic analysis of this method to sample from logconcave smooth target distribution is given.
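For context, the objective functional in this optimization viewpoint (stated here as a standard sketch from this literature, not quoted from the paper) is the relative entropy with respect to $\pi \propto e^{-U}$:

$$\mathcal{F}(\mu) = \int_{\mathbb{R}^d} U \,\mathrm{d}\mu + \int_{\mathbb{R}^d} \mu \log \mu \,\mathrm{d}x = \mathrm{KL}(\mu \,\|\, \pi) + \text{const}.$$

One ULA iteration then reads as a first-order step on the potential term $\int U \,\mathrm{d}\mu$ followed by an exact flow of the entropy term (adding Gaussian noise), all on the Wasserstein space of order $2$.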
Wasserstein distance estimates for the distributions of numerical approximations to ergodic stochastic differential equations
We present a framework that allows for the non-asymptotic study of the 2-Wasserstein distance between the invariant distribution of an ergodic stochastic differential equation and the distribution of
Asymptotic bias of inexact Markov Chain Monte Carlo methods in high dimension
This paper establishes non-asymptotic bounds on Wasserstein distances between the invariant probability measures of inexact MCMC methods and their target distribution and shows that the dimension dependence relies on some key quantities.
The Forward-Backward Envelope for Sampling with the Overdamped Langevin Algorithm
In this paper, we analyse a proximal method based on the idea of forward-backward splitting for sampling from distributions with densities that are not necessarily smooth. In particular, we study the
Higher order Langevin Monte Carlo algorithm
A new Langevin Monte Carlo (LMC) algorithm with improved rates in total variation and in Wasserstein distance is presented and non-asymptotic bounds are obtained for the convergence to stationarity of the new sampling method.
Implicit Langevin Algorithms for Sampling From Log-concave Densities
This work proves geometric ergodicity and stability of the resulting methods for all step sizes and shows that obtaining subsequent samples amounts to solving a strongly convex optimization problem, which is readily achievable using one of numerous existing methods.
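The implicit update alluded to above can be sketched as follows. This is a hedged illustration, not the paper's method: the implicit Euler step $x_{k+1} = x_k - \gamma \nabla U(x_{k+1}) + \sqrt{2\gamma}\,\xi$ is the proximal point of $U$ at $y = x_k + \sqrt{2\gamma}\,\xi$, i.e. the minimizer of the strongly convex objective $U(z) + |z - y|^2/(2\gamma)$; here the inner problem is solved by a simple fixed-point iteration (a stand-in for the optimization solvers the entry refers to), which contracts when $\gamma L < 1$ for $L$-smooth $U$.

```python
# One step of an implicit (proximal) Langevin scheme, assuming a
# smooth convex potential U with gradient U_grad.
import numpy as np

def implicit_langevin_step(U_grad, x, gamma, rng, inner_iters=50):
    # noisy forward point y = x + sqrt(2*gamma) * xi
    y = x + np.sqrt(2.0 * gamma) * rng.standard_normal(x.size)
    z = y.copy()                      # warm start the inner solve at y
    # fixed-point iteration z <- y - gamma * grad U(z), converging to
    # the implicit solution z = y - gamma * grad U(z) when gamma*L < 1
    for _ in range(inner_iters):
        z = y - gamma * U_grad(z)
    return z
```

For a standard Gaussian target (`U_grad = lambda z: z`, a choice made only for this example), the step reduces to $x_{k+1} = (x_k + \sqrt{2\gamma}\,\xi)/(1+\gamma)$, which is stable for every step size, illustrating the claimed unconditional stability.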
Uniform minorization condition and convergence bounds for discretizations of kinetic Langevin dynamics
The convergence in total variation and $V$-norm of discretization schemes of the underdamped Langevin dynamics is studied to derive geometric convergence bounds and estimates on norms of solutions to Poisson equations associated with a given scheme.
Sampling from a strongly log-concave distribution with the Unadjusted Langevin Algorithm
We consider in this paper the problem of sampling a probability distribution $\pi$ having a density w.r.t. the Lebesgue measure on $\mathbb{R}^d$, known up to a normalisation factor.


On Some non Asymptotic Bounds for the Euler Scheme
Non-asymptotic bounds for the Monte Carlo algorithm associated with the Euler discretization of some diffusion processes are obtained from a Gaussian upper bound on the density of the scheme and a modification of the so-called "Herbst argument" used to prove logarithmic Sobolev inequalities.
Rates of convergence of stochastically monotone and continuous time Markov models
In this paper we give bounds on the total variation distance from convergence of a continuous time positive recurrent Markov process on an arbitrary state space, based on Foster-Lyapunov drift and
Theoretical guarantees for approximate sampling from smooth and log-concave densities
This work establishes non-asymptotic bounds for the error of approximating the target distribution by the distribution obtained by the Langevin Monte Carlo method and its variants and illustrates the effectiveness of the established guarantees.
Trends to equilibrium in total variation distance
This paper presents different approaches, based on functional inequalities, to study the speed of convergence in total variation distance of ergodic diffusion processes with initial law satisfying a
The geometry of logconcave functions and sampling algorithms
These results are applied to analyze two efficient algorithms for sampling from a logconcave distribution in n dimensions, with no assumptions on the local smoothness of the density function.
Expansion of the global error for numerical schemes solving stochastic differential equations
Given the solution $(X_t)$ of a stochastic differential system, two situations are considered: computation of $\mathbb{E} f(X_t)$ by a Monte Carlo method and, in the ergodic case, integration of a function $f$
Reflection couplings and contraction rates for diffusions
We consider contractivity for diffusion semigroups w.r.t. Kantorovich ($L^1$ Wasserstein) distances based on appropriately chosen concave functions. These distances are in between total variation