Spectral gaps for a Metropolis–Hastings algorithm in infinite dimensions

@article{Hairer2014SpectralGF,
  title={Spectral gaps for a Metropolis–Hastings algorithm in infinite dimensions},
  author={Martin Hairer and Andrew M. Stuart and Sebastian J. Vollmer},
  journal={Annals of Applied Probability},
  year={2014},
  volume={24},
  pages={2455--2490}
}
We study the problem of sampling high- and infinite-dimensional target measures arising in applications such as conditioned diffusions and inverse problems. We focus on those that arise from approximating measures on Hilbert spaces defined via a density with respect to a Gaussian reference measure. We consider the Metropolis–Hastings algorithm that adds an accept–reject mechanism to a Markov chain proposal in order to make the chain reversible with respect to the target measure. We focus on …
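The algorithm analyzed in this setting is the preconditioned Crank–Nicolson (pCN) Metropolis–Hastings sampler. The following is a minimal finite-dimensional sketch, not the authors' code: the function names (`pcn_mh`, `sample_prior`) and the toy setup are illustrative. The target is assumed to have density proportional to exp(−Φ(u)) with respect to a Gaussian reference measure N(0, C).

```python
import numpy as np

def pcn_mh(phi, sample_prior, u0, beta=0.2, n_steps=10000, rng=None):
    """Preconditioned Crank-Nicolson (pCN) Metropolis-Hastings sampler.

    Targets mu(du) proportional to exp(-phi(u)) mu0(du), where
    mu0 = N(0, C) is the Gaussian reference measure and sample_prior
    draws xi ~ N(0, C). The pCN proposal preserves mu0, so the
    accept-reject step depends only on phi.
    """
    rng = np.random.default_rng(rng)
    u = np.asarray(u0, dtype=float)
    phi_u = phi(u)
    samples = np.empty((n_steps, u.size))
    n_accept = 0
    for k in range(n_steps):
        xi = sample_prior(rng)                        # xi ~ N(0, C)
        v = np.sqrt(1.0 - beta**2) * u + beta * xi    # pCN proposal
        phi_v = phi(v)
        # Accept with probability min(1, exp(phi(u) - phi(v)))
        if np.log(rng.uniform()) < phi_u - phi_v:
            u, phi_u = v, phi_v
            n_accept += 1
        samples[k] = u
    return samples, n_accept / n_steps
```

Because the proposal is reversible with respect to the Gaussian reference measure, the acceptance probability min(1, exp(Φ(u) − Φ(v))) involves only the change of measure Φ and remains well defined as the dimension grows; this dimension-robustness is what makes a spectral-gap analysis on the infinite-dimensional space possible.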

Error bounds for Metropolis–Hastings algorithms applied to perturbations of Gaussian measures in high dimensions
The Metropolis-adjusted Langevin algorithm (MALA) is a Metropolis–Hastings method for approximate sampling from continuous distributions. We derive upper bounds for the contraction rate in …
Metropolis-Hastings algorithms for perturbations of Gaussian measures in high dimensions: Contraction properties and error bounds in the logconcave case
The Metropolis-adjusted Langevin algorithm (MALA) is a Metropolis–Hastings method for approximate sampling from continuous distributions. We derive upper bounds for the contraction rate in …
Mixing Rates for Hamiltonian Monte Carlo Algorithms in Finite and Infinite Dimensions
We establish the geometric ergodicity of the preconditioned Hamiltonian Monte Carlo (HMC) algorithm defined on an infinite-dimensional Hilbert space, as developed in [Beskos et al., Stochastic …
Algorithms for Kullback-Leibler Approximation of Probability Measures in Infinite Dimensions
This paper introduces a computational algorithm which is well adapted to the required minimization, seeking to find the mean as a function, and parameterizing the covariance in two different ways: through low-rank perturbations of the reference covariance and through Schrödinger-potential perturbations of the inverse reference covariance.
Spectral gaps and error estimates for infinite-dimensional Metropolis-Hastings with non-Gaussian priors
We study a class of Metropolis–Hastings algorithms for target measures that are absolutely continuous with respect to a large class of non-Gaussian prior measures on Banach spaces. The algorithm is …
Optimal dimension dependence of the Metropolis-Adjusted Langevin Algorithm
The upper-bound proof introduces a new technique based on a projection characterization of the Metropolis adjustment, which reduces the study of MALA to the well-studied discretization analysis of the Langevin SDE and bypasses direct computation of the acceptance probability.
Dimension-Independent MCMC Sampling for Inverse Problems with Non-Gaussian Priors
A simple recipe is provided to obtain dimension-independent bounds on the Monte Carlo error of MCMC sampling for Gaussian prior measures, and for non-Gaussian prior measures with dimension-independent performance.
On the accept-reject mechanism for Metropolis-Hastings algorithms
This work develops a powerful and versatile framework for determining acceptance ratios in Metropolis–Hastings type Markov kernels widely used in statistical sampling problems. Our approach allows us …
Consistency and Fluctuations For Stochastic Gradient Langevin Dynamics
This article proves that, under verifiable assumptions, the SGLD algorithm is consistent, satisfies a central limit theorem (CLT), and that its asymptotic bias–variance decomposition can be characterized by an explicit functional of the step-size sequence (δ_m)_{m≥0}.
MCMC Methods for Functions: Modifying Old Algorithms to Make Them Faster
Many problems arising in applications result in the need to probe a probability distribution for functions. Examples include Bayesian nonparametric statistics and conditioned diffusion processes. …

References

Showing 1–10 of 71 references
Error bounds for Metropolis–Hastings algorithms applied to perturbations of Gaussian measures in high dimensions
The Metropolis-adjusted Langevin algorithm (MALA) is a Metropolis–Hastings method for approximate sampling from continuous distributions. We derive upper bounds for the contraction rate in …
Hybrid Monte Carlo on Hilbert spaces
The Hybrid Monte Carlo (HMC) algorithm provides a framework for sampling from complex, high-dimensional target distributions. In contrast with standard Markov chain Monte Carlo (MCMC) algorithms, it …
Diffusion limits of the random walk metropolis algorithm in high dimensions
Diffusion limits of MCMC methods in high dimensions provide a useful theoretical tool for studying computational complexity. In particular, they lead directly to precise estimates of the number of …
Optimal scalings for local Metropolis--Hastings chains on nonproduct targets in high dimensions
We investigate local MCMC algorithms, namely the random-walk Metropolis and the Langevin algorithms, and identify the optimal choice of the local step-size as a function of the dimension $n$ of the …
Dimension-Independent MCMC Sampling for Inverse Problems with Non-Gaussian Priors
A simple recipe is provided to obtain dimension-independent bounds on the Monte Carlo error of MCMC sampling for Gaussian prior measures, and for non-Gaussian prior measures with dimension-independent performance.
Geometric convergence and central limit theorems for multidimensional Hastings and Metropolis algorithms
We develop results on geometric ergodicity of Markov chains and apply these and other recent results in Markov chain theory to multidimensional Hastings and Metropolis algorithms. For those based on …
Advanced MCMC methods for sampling on diffusion pathspace
An advanced version of familiar Markov chain Monte Carlo algorithms that sample from target distributions defined as changes of measure from Gaussian laws on general Hilbert spaces is studied.
MCMC Methods for Functions: Modifying Old Algorithms to Make Them Faster
Many problems arising in applications result in the need to probe a probability distribution for functions. Examples include Bayesian nonparametric statistics and conditioned diffusion processes. …
MCMC methods for diffusion bridges
We present and study a Langevin MCMC approach for sampling nonlinear diffusion bridges. The method is based on recent theory concerning stochastic partial differential equations (SPDEs) reversible …
Asymptotic coupling and a general form of Harris’ theorem with applications to stochastic delay equations
There are many Markov chains on infinite-dimensional spaces whose one-step transition kernels are mutually singular when starting from different initial conditions. We give results which prove unique …