Riemann manifold Langevin and Hamiltonian Monte Carlo methods

@article{Girolami2011RiemannML,
  title={Riemann manifold Langevin and Hamiltonian Monte Carlo methods},
  author={Mark A. Girolami and Ben Calderhead},
  journal={Journal of the Royal Statistical Society: Series B (Statistical Methodology)},
  year={2011},
  volume={73}
}
  • Published 1 March 2011
Summary.  The paper proposes Metropolis-adjusted Langevin and Hamiltonian Monte Carlo sampling methods defined on the Riemann manifold to resolve the shortcomings of existing Monte Carlo algorithms when sampling from target densities that may be high dimensional and exhibit strong correlations. The methods provide fully automated adaptation mechanisms that circumvent the costly pilot runs required to tune proposal densities for Metropolis–Hastings or indeed Hamiltonian Monte Carlo and… 
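To make the idea concrete, here is a minimal sketch of a *simplified* manifold MALA step: the Langevin proposal is preconditioned by a fixed metric G (for example, the Fisher information of the model), so the proposal automatically matches the correlation structure of the target. This is an illustration under the assumption of a constant metric; the full Girolami–Calderhead algorithm also includes terms involving the derivatives of G, which vanish when G is constant. All function and variable names here are illustrative, not from the paper.

```python
import numpy as np

def simplified_mmala_step(theta, log_post, grad_log_post, G, eps, rng):
    """One simplified manifold-MALA step with a *fixed* metric G.
    Proposal: N(theta + (eps^2/2) G^{-1} grad, eps^2 G^{-1})."""
    G_inv = np.linalg.inv(G)
    L = np.linalg.cholesky(G_inv)  # eps * L @ xi  ~  N(0, eps^2 G^{-1})

    def mean(t):  # drift preconditioned by the inverse metric
        return t + 0.5 * eps**2 * G_inv @ grad_log_post(t)

    def log_q(b, a):  # log proposal density, up to an additive constant
        d = b - mean(a)
        return -d @ G @ d / (2.0 * eps**2)

    prop = mean(theta) + eps * L @ rng.standard_normal(theta.size)
    log_alpha = (log_post(prop) - log_post(theta)
                 + log_q(theta, prop) - log_q(prop, theta))
    return prop if np.log(rng.uniform()) < log_alpha else theta

# Toy target: a strongly correlated Gaussian; setting G to its precision
# makes the proposal adapt to the correlation without any pilot tuning.
prec = np.array([[2.0, 1.8], [1.8, 2.0]])
log_post = lambda t: -0.5 * t @ prec @ t
grad = lambda t: -prec @ t
rng = np.random.default_rng(0)
theta = np.zeros(2)
for _ in range(1000):
    theta = simplified_mmala_step(theta, log_post, grad, prec, 0.8, rng)
```

With a position-dependent metric the drift acquires extra curvature terms; this constant-metric version is only the degenerate case.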
Geometrically adapted Langevin dynamics for Markov chain Monte Carlo simulations
TLDR
This work makes use of concepts from differential geometry and stochastic calculus on Riemannian manifolds to geometrically adapt a stochastic differential equation with a non-trivial drift term, arriving at geometrically adapted Langevin dynamics.
Differential geometric MCMC methods and applications
TLDR
The methods developed provide generalisations of the Metropolis-adjusted Langevin algorithm and the Hybrid Monte Carlo algorithm for Bayesian statistical inference, and resolve many shortcomings of existing Monte Carlo algorithms when sampling from target densities that may be high dimensional and exhibit strong correlation structure.
Wasserstein Control of Mirror Langevin Monte Carlo
TLDR
A non-asymptotic upper bound is established on the sampling error of the resulting Hessian Riemannian Langevin Monte Carlo algorithm, which can cope with a wide variety of Hessian metrics related to highly non-flat geometries.
Stochastic Quasi-Newton Langevin Monte Carlo
TLDR
This study proposes a novel SG-MCMC method that takes the local geometry into account by using ideas from Quasi-Newton optimization methods, and achieves fast convergence rates similar to Riemannian approaches while at the same time having low computational requirements similar to diagonal preconditioning approaches.
Robust Monte Carlo Sampling using Riemannian Nosé-Poincaré Hamiltonian Dynamics
TLDR
This work proposes dynamics based on a modified Nosé-Poincaré Hamiltonian augmented with Riemannian manifold corrections, and a stochastic variant using additional terms in the Hamiltonian to correct for the noise from the stochastic gradients.
Hamiltonian Monte Carlo sampling in Bayesian empirical likelihood computation
We consider Bayesian empirical likelihood estimation and develop an efficient Hamiltonian Monte Carlo method for sampling from the posterior distribution of the parameters of interest. The method
Hamiltonian Monte Carlo methods for efficient parameter estimation in steady state dynamical systems
TLDR
A novel approach for efficiently calculating the required geometric quantities by tracking steady states across the Hamiltonian trajectories using a Newton-Raphson method and employing local sensitivity information is presented.
Information Geometry and Sequential Monte Carlo
TLDR
It is demonstrated that compared to employing a standard adaptive random walk kernel, the SMC sampler with an information geometric kernel design attains a higher level of statistical robustness in the inferred parameters of the dynamical systems.
Scalable Hamiltonian Monte Carlo via Surrogate Methods
TLDR
A random network surrogate architecture is proposed which can effectively capture the collective properties of large data sets or complex models with scalability, flexibility and efficiency and an approximate inference framework that combines the advantages of both variational Bayes and Markov chain Monte Carlo methods is proposed.
A SCALED STOCHASTIC NEWTON ALGORITHM FOR MARKOV CHAIN MONTE CARLO SIMULATIONS
TLDR
The optimal scaling analysis of the proposed scaled stochastic Newton algorithm (sSN) shows that, for an inhomogeneous product target distribution at stationarity, the sSN proposal variance scales like O(n^{-1/3}) for the average acceptance rate to be bounded away from zero as the dimension n approaches infinity.

References

Showing 1–10 of 275 references
Efficient cosmological parameter estimation with Hamiltonian Monte Carlo technique
TLDR
This paper demonstrates how HMC can be efficiently used in cosmological parameter estimation; it improves the acceptance rate by a factor of 4 and boosts efficiency by at least a factor of D in a D-dimensional parameter space.
Markov chain Monte Carlo posterior sampling with the Hamiltonian method
TLDR
The Hamiltonian MCMC technique introduces a momentum variable for each parameter of the target pdf; the efficiency of the Hamiltonian method for multidimensional isotropic Gaussian pdfs remains constant at around 7% for up to several hundred dimensions.
Particle Markov chain Monte Carlo methods
TLDR
It is shown here how it is possible to build efficient high dimensional proposal distributions by using sequential Monte Carlo methods, which allows not only to improve over standard Markov chain Monte Carlo schemes but also to make Bayesian inference feasible for a large class of statistical models where this was not previously so.
MCMC Using Hamiltonian Dynamics
Hamiltonian dynamics can be used to produce distant proposals for the Metropolis algorithm, thereby avoiding the slow exploration of the state space that results from the diffusive behaviour of
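The mechanism described above can be sketched as a textbook leapfrog HMC step with an identity mass matrix: resample a momentum, simulate Hamiltonian dynamics with the leapfrog integrator to obtain a distant proposal, then accept or reject on the change in total energy. This is a generic illustration in the spirit of Neal's exposition, not the exact code of any cited paper; all names are illustrative.

```python
import numpy as np

def hmc_step(theta, log_post, grad_log_post, eps, n_leapfrog, rng):
    """One HMC step with identity mass matrix (a minimal sketch)."""
    p = rng.standard_normal(theta.size)       # resample momentum
    th, mom = theta.copy(), p.copy()
    mom += 0.5 * eps * grad_log_post(th)      # initial half step for momentum
    for _ in range(n_leapfrog - 1):
        th += eps * mom                       # full step for position
        mom += eps * grad_log_post(th)        # full step for momentum
    th += eps * mom
    mom += 0.5 * eps * grad_log_post(th)      # final half step for momentum
    # Metropolis accept/reject on the Hamiltonian (potential + kinetic energy)
    h_old = -log_post(theta) + 0.5 * p @ p
    h_new = -log_post(th) + 0.5 * mom @ mom
    return th if np.log(rng.uniform()) < h_old - h_new else theta

# Usage on a 2-D standard normal target.
log_post = lambda t: -0.5 * t @ t
grad = lambda t: -t
rng = np.random.default_rng(1)
theta = np.zeros(2)
draws = []
for _ in range(500):
    theta = hmc_step(theta, log_post, grad, 0.2, 20, rng)
    draws.append(theta)
draws = np.asarray(draws)
```

Because the trajectory moves coherently rather than diffusively, consecutive draws are far apart in state space, which is exactly the advantage the abstract describes.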
Optimal tuning of the hybrid Monte Carlo algorithm
TLDR
It is proved that, to obtain an O(1) acceptance probability as the dimension d of the state space tends to infinity, the leapfrog step size h should be scaled as h = l × d^{-1/4}, which means that in high dimensions HMC requires O(d^{1/4}) steps to traverse the state space.
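Read as a tuning heuristic, the result says: fix a target trajectory length T, set the step size to l × d^{-1/4}, and the number of leapfrog steps per trajectory then grows like d^{1/4}. A toy calculation (the constants l and T are illustrative, not from the paper):

```python
import math

def leapfrog_schedule(d, l=1.0, T=1.0):
    """Step size and step count for dimension d under the d^{-1/4} scaling."""
    eps = l * d ** -0.25
    return eps, math.ceil(T / eps)

for d in (1, 16, 256, 4096):
    print(d, leapfrog_schedule(d))
```

Quadrupling the dimension by a factor of 16 halves the step size and doubles the work per trajectory, a far milder cost growth than random-walk Metropolis exhibits.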
MCMC methods for diffusion bridges
TLDR
The novel algorithmic idea of the paper is that proposed moves for the MCMC algorithm are determined by discretising the SPDEs in the time direction using an implicit scheme, parametrised by θ ∈ [0,1].
Monte Carlo Sampling Methods Using Markov Chains and Their Applications
Summary.  A generalization of the sampling method introduced by Metropolis et al. (1953) is presented along with an exposition of the relevant theory, techniques of application and methods and
Reversible jump Markov chain Monte Carlo computation and Bayesian model determination
Markov chain Monte Carlo methods for Bayesian computation have until recently been restricted to problems where the joint distribution of all variables has a density with respect to some fixed