Exponential convergence of Langevin distributions and their discrete approximations

@article{Roberts1996ExponentialCO,
  title={Exponential convergence of Langevin distributions and their discrete approximations},
  author={Gareth O. Roberts and Richard L. Tweedie},
  journal={Bernoulli},
  year={1996},
  volume={2},
  pages={341-363}
}
In this paper we consider a continuous-time method of approximating a given distribution π using the Langevin diffusion dL_t = dW_t + (1/2)∇log π(L_t) dt. We find conditions under which this diffusion converges exponentially quickly to π or does not: in one dimension, these are essentially that, for distributions with exponential tails of the form π(x) ∝ exp(−γ|x|^β), 0 < β < ∞, exponential convergence occurs if and only if β ≥ 1. We then consider conditions under which the discrete approximations to the diffusion converge. We…
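The paper's subject is the Euler discretization of this diffusion and its Metropolis-adjusted version (MALA). The sketch below is not taken from the paper; it is a minimal illustration, assuming the target is available through user-supplied log_pi and grad_log_pi functions, with the step size h and all names chosen for exposition.

```python
import numpy as np

def mala_sample(log_pi, grad_log_pi, x0, h, n_iter, rng=None):
    """Metropolis-adjusted Langevin algorithm (MALA).

    Euler-Maruyama proposal for dL_t = dW_t + (1/2) grad log pi(L_t) dt,
    corrected by a Metropolis-Hastings accept/reject step so that the
    discrete chain has pi as its invariant distribution.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.atleast_1d(np.asarray(x0, dtype=float))
    d = x.size
    samples = np.empty((n_iter, d))
    for i in range(n_iter):
        # Langevin proposal: Gaussian with mean x + (h/2) grad log pi(x), variance h
        mean_x = x + 0.5 * h * grad_log_pi(x)
        y = mean_x + np.sqrt(h) * rng.standard_normal(d)
        mean_y = y + 0.5 * h * grad_log_pi(y)
        # log proposal densities q(y|x), q(x|y) up to a common constant
        log_q_forward = -np.sum((y - mean_x) ** 2) / (2.0 * h)
        log_q_backward = -np.sum((x - mean_y) ** 2) / (2.0 * h)
        log_alpha = log_pi(y) - log_pi(x) + log_q_backward - log_q_forward
        if np.log(rng.uniform()) < log_alpha:
            x = y  # accept; otherwise retain the current state
        samples[i] = x
    return samples

# Example usage: standard Gaussian target, pi(x) proportional to exp(-|x|^2 / 2)
if __name__ == "__main__":
    chain = mala_sample(log_pi=lambda x: -0.5 * np.sum(x ** 2),
                        grad_log_pi=lambda x: -x,
                        x0=np.zeros(2), h=0.5, n_iter=5000)
    print(chain.mean(axis=0), chain.var(axis=0))
```

Dropping the accept/reject step gives the unadjusted discretization; a point the paper develops is that such naive chains can fail to converge even when the diffusion itself is exponentially ergodic.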
Exponential ergodicity of the bouncy particle sampler
Non-reversible Markov chain Monte Carlo schemes based on piecewise deterministic Markov processes have been recently introduced in applied probability, automatic control, physics and statistics.
Metropolis-Hastings algorithms for perturbations of Gaussian measures in high dimensions: Contraction properties and error bounds in the logconcave case
TLDR: Upper bounds for the contraction rate in Kantorovich-Rubinstein-Wasserstein distance of the MALA chain with semi-implicit Euler proposals applied to log-concave probability measures that have a density w.r.t. a Gaussian reference measure are derived.
Convergence Rates for Langevin Monte Carlo in the Nonconvex Setting
TLDR: Surprisingly, the iteration complexity for both overdamped and underdamped Langevin MCMC is only polynomial in the dimension d and the target accuracy ε; however, it is exponential in the problem parameter LR, which is a measure of non-log-concavity of the target distribution.
On sampling from a log-concave density using kinetic Langevin diffusions
TLDR: It is proved that the kinetic Langevin diffusion has a geometric mixing property, with a mixing rate that is, in the overdamped regime, optimal in terms of its dependence on the condition number.
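As a rough companion to the entry above, the following is a minimal sketch (not the cited paper's scheme) of a naive Euler-Maruyama step for the kinetic (underdamped) Langevin diffusion; grad_log_pi, the step size h and the friction gamma are illustrative assumptions, and practical samplers use more accurate splitting integrators.

```python
import numpy as np

def kinetic_langevin_step(x, v, grad_log_pi, h, gamma, rng):
    """One Euler-Maruyama step of the kinetic (underdamped) Langevin diffusion
    dX_t = V_t dt,  dV_t = grad log pi(X_t) dt - gamma * V_t dt + sqrt(2*gamma) dW_t,
    whose position marginal targets pi with a Gaussian velocity marginal."""
    x_new = x + h * v
    v_new = (v + h * grad_log_pi(x) - h * gamma * v
             + np.sqrt(2.0 * gamma * h) * rng.standard_normal(np.shape(x)))
    return x_new, v_new
```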
Continuous-time Random Walks for the Numerical Solution of Stochastic Differential Equations
This paper introduces time-continuous numerical schemes to simulate stochastic differential equations (SDEs) arising in mathematical finance, population dynamics, chemical kinetics, epidemiology, …
Error bounds for Metropolis–Hastings algorithms applied to perturbations of Gaussian measures in high dimensions
TLDR: Upper bounds for the contraction rate in Kantorovich-Rubinstein-Wasserstein distance of the MALA chain with semi-implicit Euler proposals applied to log-concave probability measures that have a density w.r.t. a Gaussian reference measure are derived.
Mean-Field Langevin Dynamics: Exponential Convergence and Annealing
TLDR: This work studies the annealed dynamics, and shows that for a noise decaying at a logarithmic rate, the dynamics converges in value to the global minimizer of the unregularized objective function.
Wasserstein distance estimates for the distributions of numerical approximations to ergodic stochastic differential equations
We present a framework that allows for the non-asymptotic study of the 2-Wasserstein distance between the invariant distribution of an ergodic stochastic differential equation and the distribution of …
Stochastic Runge-Kutta Accelerates Langevin Monte Carlo and Beyond
TLDR: The convergence rate of sampling algorithms obtained by discretizing smooth Ito diffusions exhibiting fast Wasserstein-2 contraction is established, based on local deviation properties of the integration scheme.
Asynchronous Distributed Gibbs Sampling (Preprint Version 0.1)
TLDR: This work presents a novel scheme that allows us to approximate any integral in a parallel fashion with no synchronization or locking, avoiding the typical performance bottlenecks of parallel algorithms.
...

References

Rates of convergence of the Hastings and Metropolis algorithms
TLDR: Recent results in Markov chain theory are applied to Hastings and Metropolis algorithms with either independent or symmetric candidate distributions, and it is shown that geometric convergence essentially occurs if and only if π has geometric tails.
Optimal scaling of discrete approximations to Langevin diffusions
TLDR: An asymptotic diffusion limit theorem is proved and it is shown that, as a function of dimension n, the complexity of the algorithm is O(n^(1/3)), which compares favourably with the O(n) complexity of random walk Metropolis algorithms.
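For illustration only (not from the cited paper): a sketch of how that scaling is typically exploited in practice, choosing the MALA step size as h = ℓ² n^(-1/3) and adjusting ℓ so that the observed acceptance rate is near the asymptotically optimal value of about 0.574. Here acc_rate_fn is a hypothetical callback that runs a short pilot chain at a given step size and returns its acceptance rate.

```python
import numpy as np

def tune_mala_step(acc_rate_fn, n_dim, ell0=1.0, target=0.574, iters=20):
    """Crude Robbins-Monro style tuning of the MALA step size using the
    dimension scaling h = ell^2 * n_dim**(-1/3)."""
    ell = ell0
    for k in range(1, iters + 1):
        h = ell ** 2 * n_dim ** (-1.0 / 3.0)
        acc = acc_rate_fn(h)  # acceptance rate of a short pilot MALA run at step size h
        ell *= np.exp((acc - target) / np.sqrt(k))  # enlarge ell if accepting too often
    return ell ** 2 * n_dim ** (-1.0 / 3.0)
```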
Geometric convergence and central limit theorems for multidimensional Hastings and Metropolis algorithms
We develop results on geometric ergodicity of Markov chains and apply these and other recent results in Markov chain theory to multidimensional Hastings and Metropolis algorithms. For those based on …
Stability of Markovian processes II: continuous-time processes and sampled chains
In this paper we extend the results of Meyn and Tweedie (1992b) from discrete-time parameter to continuous-parameter Markovian processes Φ evolving on a topological space. We consider a number of …
Monte Carlo Sampling Methods Using Markov Chains and Their Applications
A generalization of the sampling method introduced by Metropolis et al. (1953) is presented along with an exposition of the relevant theory, techniques of application and methods and …
Topological conditions enabling use of harris methods in discrete and continuous time
This paper describes the role of continuous components in linking the topological and measure-theoretic (or regenerative) analysis of Markov chains and processes. Under Condition …
Markov Chains and Stochastic Stability
TLDR: This second edition reflects the same discipline and style that marked out the original and helped it to become a classic: proofs are rigorous and concise, the range of applications is broad and knowledgeable, and key ideas are accessible to practitioners with limited mathematical background.
Spatial Statistics and Bayesian Computation
TLDR: The early development of MCMC in Bayesian inference is traced, some recent computational progress in statistical physics is reviewed, based on the introduction of auxiliary variables, and its current and future relevance in Bayesian applications is discussed.
Shift-coupling and convergence rates of ergodic averages
We study convergence of Markov chains to their stationary distributions π(·). Much recent work has used coupling to get quantitative bounds on the total variation distance between the law of the chain at time n and π(·). In this …
Markov Chains for Exploring Posterior Distributions
Several Markov chain methods are available for sampling from a posterior distribution. Two important examples are the Gibbs sampler and the Metropolis algorithm. In addition, several strategies are …
...