Optimal tuning of the hybrid Monte Carlo algorithm
@article{Beskos2010OptimalTO, title={Optimal tuning of the hybrid Monte Carlo algorithm}, author={Alexandros Beskos and Natesh S. Pillai and Gareth O. Roberts and Jes{\'u}s Mar{\'i}a Sanz-Serna and Andrew M. Stuart}, journal={Bernoulli}, year={2010}, volume={19}, pages={1501-1534} }
We investigate the properties of the Hybrid Monte Carlo algorithm (HMC) in high dimensions.
HMC develops a Markov chain reversible w.r.t. a given target distribution $\pi$ by using separable Hamiltonian dynamics with potential $-\log\pi$. The additional momentum variables are chosen at random from the Boltzmann distribution and the continuous-time Hamiltonian dynamics are then discretised using the leapfrog scheme. The induced bias is removed via a Metropolis-Hastings accept/reject rule. In the…
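The mechanism described above (momentum refreshment, leapfrog discretisation, Metropolis-Hastings correction) can be summarised in a short numerical sketch. The code below is an illustrative single HMC transition with an identity mass matrix and assumed callables `log_pi` and `grad_log_pi`; it is a generic textbook version of the algorithm, not code from the paper.

```python
import numpy as np

def hmc_step(x, log_pi, grad_log_pi, step_size, n_leapfrog, rng):
    """One HMC transition targeting pi, with identity mass matrix (illustrative sketch)."""
    # Resample momentum from the Boltzmann (standard Gaussian) distribution.
    p = rng.standard_normal(x.shape)

    # Hamiltonian H(x, p) = -log pi(x) + |p|^2 / 2.
    def hamiltonian(x_, p_):
        return -log_pi(x_) + 0.5 * np.sum(p_ * p_)

    x_new, p_new = x.copy(), p.copy()
    h_old = hamiltonian(x_new, p_new)

    # Leapfrog discretisation of the continuous-time Hamiltonian dynamics.
    p_new += 0.5 * step_size * grad_log_pi(x_new)
    for _ in range(n_leapfrog - 1):
        x_new += step_size * p_new
        p_new += step_size * grad_log_pi(x_new)
    x_new += step_size * p_new
    p_new += 0.5 * step_size * grad_log_pi(x_new)

    # Metropolis-Hastings accept/reject removes the discretisation bias.
    log_accept = h_old - hamiltonian(x_new, p_new)
    if np.log(rng.uniform()) < log_accept:
        return x_new, True
    return x, False
```

For a standard Gaussian target one would pass, e.g., `log_pi = lambda x: -0.5 * x @ x` and `grad_log_pi = lambda x: -x`; the tuning question studied in the paper is how the leapfrog step size should scale with dimension so that the acceptance rate remains stable.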
285 Citations
Entropy-based adaptive Hamiltonian Monte Carlo
- Computer Science · NeurIPS
- 2021
A gradient-based algorithm is developed that allows for the adaptation of the mass matrix of Hamiltonian Monte Carlo by encouraging the leapfrog integrator to have high acceptance rates while also exploring all dimensions jointly.
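As context for the entry above, the snippet below only indicates where a (diagonal) mass matrix $M$ enters an HMC transition; the entropy-based, gradient-driven adaptation of $M$ proposed in the cited paper is not reproduced, and the function name and interface are illustrative assumptions.

```python
import numpy as np

def draw_momentum_and_kinetic(dim, m_diag, rng):
    """Where a diagonal mass matrix M = diag(m_diag) enters HMC: momentum ~ N(0, M)
    and kinetic energy p^T M^{-1} p / 2. Illustrative only; the cited paper's
    adaptation of M is not reproduced here."""
    p = rng.standard_normal(dim) * np.sqrt(m_diag)   # momentum drawn from N(0, M)
    kinetic = 0.5 * np.sum(p * p / m_diag)           # kinetic energy uses M^{-1}
    return p, kinetic
```

In the corresponding leapfrog update the position step becomes $x \leftarrow x + \epsilon M^{-1} p$, so the choice of $M$ rescales how fast each coordinate is explored.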
Metropolis Adjusted Langevin Trajectories: a robust alternative to Hamiltonian Monte Carlo
- Computer Science
- 2022
This work presents the Langevin diffusion as an alternative to control these ACFs by inducing randomness in Hamiltonian trajectories through a continuous refreshment of the velocities, and introduces a robust alternative to HMC built upon these dynamics, named Metropolis Adjusted Langevin Trajectories (MALT).
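A minimal sketch of the dynamics this entry refers to, assuming the now-standard OBABO splitting of the kinetic (underdamped) Langevin diffusion: a leapfrog step bracketed by partial velocity refreshments. The trajectory-level Metropolis adjustment that defines MALT in the cited paper is deliberately omitted; names and parameters below are illustrative.

```python
import numpy as np

def obabo_step(x, v, grad_log_pi, step_size, friction, rng):
    """One OBABO step of kinetic Langevin dynamics: leapfrog (BAB) bracketed by
    partial velocity refreshments (O). Sketch of the dynamics MALT builds on; the
    trajectory-level Metropolis correction of the cited paper is omitted here."""
    eta = np.exp(-friction * step_size / 2.0)

    # O: partially refresh the velocity towards N(0, I).
    v = eta * v + np.sqrt(1.0 - eta**2) * rng.standard_normal(x.shape)
    # B, A, B: one leapfrog step of the Hamiltonian part.
    v = v + 0.5 * step_size * grad_log_pi(x)
    x = x + step_size * v
    v = v + 0.5 * step_size * grad_log_pi(x)
    # O: second partial refreshment.
    v = eta * v + np.sqrt(1.0 - eta**2) * rng.standard_normal(x.shape)
    return x, v
```

With `friction = 0` this reduces to plain leapfrog (no refreshment); large `friction` approaches a full velocity refreshment at every step.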
Compressible generalized hybrid Monte Carlo.
- Computer Science · The Journal of chemical physics
- 2014
This work presents a general framework for constructing hybrid Monte Carlo methods under relaxed conditions: the only geometric property needed is (weakened) reversibility; volume preservation is not needed.
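The relaxed conditions mentioned above translate into a modified acceptance rule: when the deterministic proposal map is reversible but not volume-preserving, the Metropolis test picks up the Jacobian of the map. The snippet below is a generic sketch of that principle under assumed callables, not the specific schemes constructed in the cited paper.

```python
import numpy as np

def jacobian_corrected_accept(z, log_rho, proposal_map, log_abs_det_jac, rng):
    """Metropolis test for a deterministic, reversible but non-volume-preserving map psi:
    accept with probability min(1, rho(psi(z)) / rho(z) * |det D psi(z)|).
    Generic sketch of the principle; not the specific schemes of the cited paper."""
    z_new = proposal_map(z)
    log_ratio = log_rho(z_new) - log_rho(z) + log_abs_det_jac(z)
    if np.log(rng.uniform()) < log_ratio:
        return z_new, True
    return z, False
```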
Fast mixing of Metropolized Hamiltonian Monte Carlo: Benefits of multi-step gradients
- Mathematics, Computer Science · J. Mach. Learn. Res.
- 2020
This work provides a non-asymptotic upper bound on the mixing time of the Metropolized HMC with explicit choices of stepsize and number of leapfrog steps, and provides a general framework for sharpening mixing time bounds for Markov chains initialized at a substantial distance from the target distribution over continuous spaces.
Adaptive Tuning for Metropolis Adjusted Langevin Trajectories
- Computer Science
- 2022
This work builds upon recent strategies for tuning the hyperparameters of RHMC which target a bound on the Effective Sample Size (ESS) and adapt it to MALT, thereby enabling the first user-friendly deployment of this algorithm.
Optimal scaling for the transient phase of Metropolis Hastings algorithms: The longtime behavior
- Mathematics
- 2014
We consider the Random Walk Metropolis algorithm on $\mathbb{R}^n$ with Gaussian proposals, and when the target probability measure is the $n$-fold product of a one-dimensional law. It is well-known (see…
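The stationary-phase baseline for this setting is easy to simulate; the sketch below runs Random Walk Metropolis with the classical $\ell/\sqrt{n}$ proposal scaling on an i.i.d. Gaussian product target (an illustrative choice). The cited work studies the transient, long-time behaviour, which this toy simulation does not address.

```python
import numpy as np

def rwm_product_target(n, n_iters, ell, rng):
    """Random Walk Metropolis on an n-fold product target (standard Gaussian marginals,
    an illustrative choice) with Gaussian proposals scaled as ell / sqrt(n)."""
    log_pi = lambda x: -0.5 * np.dot(x, x)   # product of n standard normal densities, up to a constant
    x = rng.standard_normal(n)               # start in stationarity (the target here is Gaussian)
    accepted = 0
    for _ in range(n_iters):
        y = x + (ell / np.sqrt(n)) * rng.standard_normal(n)
        if np.log(rng.uniform()) < log_pi(y) - log_pi(x):
            x, accepted = y, accepted + 1
    return x, accepted / n_iters

# e.g. rwm_product_target(n=100, n_iters=10_000, ell=2.38, rng=np.random.default_rng(0))
# gives an acceptance rate near the well-known 0.234 optimum for product targets.
```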
Sampling from multimodal distributions using tempered Hamiltonian transitions
- Computer Science
- 2021
This paper develops a Hamiltonian Monte Carlo method in which the constructed paths can travel across high potential-energy barriers, yielding globally mixing Markov chains that target high-dimensional, multimodal distributions, as demonstrated on mixtures of normals and a sensor network localization problem.
A splitting Hamiltonian Monte Carlo method for efficient sampling
- Computer Science · CSIAM Transactions on Applied Mathematics
- 2023
A splitting Hamiltonian Monte Carlo (SHMC) algorithm, which can be computationally efficient when combined with the random mini-batch strategy; it is proved that the error in the Hamiltonian induced by the random-batch approximation is $O(\sqrt{\Delta t})$ in the strong sense and $O(\Delta t)$ in the weak sense.
Variable length trajectory compressible hybrid Monte Carlo
- Computer Science
- 2016
A framework that further extends HMC to situations in which the dynamics is reversible but not necessarily Hamiltonian, enabling an effective application of variable-step-size integrators to HMC-type sampling algorithms based on reversible dynamics.
References
SHOWING 1-10 OF 54 REFERENCES
Optimal Scaling and Diffusion Limits for the Langevin Algorithm in High Dimensions
- Computer Science, Mathematics
- 2011
It is proved that, started in stationarity, a suitably interpolated and scaled version of the Markov chain corresponding to MALA converges to an infinite dimensional diffusion process.
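For reference, one MALA transition uses an Euler-Maruyama step of the Langevin diffusion as its proposal and corrects it with a Metropolis-Hastings test; the sketch below is the generic finite-dimensional algorithm with assumed callables, not the interpolated, infinite-dimensional construction analysed in the cited paper.

```python
import numpy as np

def mala_step(x, log_pi, grad_log_pi, step_size, rng):
    """One Metropolis-adjusted Langevin (MALA) transition: an Euler step of the Langevin
    diffusion as proposal, corrected by a Metropolis-Hastings test (illustrative sketch)."""
    def log_q(to, frm):
        # Gaussian proposal density q(to | frm) of the Euler-Maruyama step, up to a constant.
        mean = frm + 0.5 * step_size * grad_log_pi(frm)
        return -np.sum((to - mean) ** 2) / (2.0 * step_size)

    y = x + 0.5 * step_size * grad_log_pi(x) + np.sqrt(step_size) * rng.standard_normal(x.shape)
    log_ratio = log_pi(y) - log_pi(x) + log_q(x, y) - log_q(y, x)
    return (y, True) if np.log(rng.uniform()) < log_ratio else (x, False)
```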
Riemannian Manifold Hamiltonian Monte Carlo
- Computer Science
- 2009
The paper proposes a Riemannian Manifold Hamiltonian Monte Carlo sampler to resolve the shortcomings of existing Monte Carlo algorithms when sampling from target densities that may be high…
Riemann manifold Langevin and Hamiltonian Monte Carlo methods
- Computer Science · Journal of the Royal Statistical Society: Series B (Statistical Methodology)
- 2011
The methodology proposed automatically adapts to the local structure when simulating paths across this manifold, providing highly efficient convergence and exploration of the target density, and substantial improvements in the time‐normalized effective sample size are reported when compared with alternative sampling approaches.
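For orientation, in both Riemannian-manifold approaches above the Euclidean kinetic energy of standard HMC is replaced by one built from a position-dependent metric $G(\theta)$: up to an additive constant, $H(\theta, p) = -\log \pi(\theta) + \tfrac{1}{2}\log\det G(\theta) + \tfrac{1}{2} p^{\top} G(\theta)^{-1} p$, with momenta drawn as $p \mid \theta \sim N(0, G(\theta))$. The resulting Hamiltonian is non-separable, so the plain leapfrog scheme is replaced by a generalised (implicit) leapfrog integrator.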
A comparison of generalized hybrid Monte Carlo methods with and without momentum flip
- Physics · J. Comput. Phys.
- 2009
Shadow hybrid Monte Carlo: an efficient propagator in phase space of macromolecules
- Computer Science
- 2004
MCMC methods for diffusion bridges
- Computer Science
- 2008
The novel algorithmic idea of the paper is that proposed moves for the MCMC algorithm are determined by discretising the SPDEs in the time direction using an implicit scheme, parametrised by θ ∈ [0,1].
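The θ-scheme idea can be illustrated in finite dimensions: the Crank-Nicolson choice θ = 1/2 is closely related to what is now commonly written as the preconditioned Crank-Nicolson (pCN) proposal, for a target defined as a change of measure $\exp(-\Phi(u))$ with respect to a Gaussian reference. The sketch below is an illustration of that idea under assumed callables, not the paper's exact SPDE construction for diffusion bridges.

```python
import numpy as np

def pcn_step(u, phi, sample_reference, beta, rng):
    """pCN-form (theta = 1/2) proposal for a target exp(-Phi(u)) relative to a Gaussian
    reference measure. Finite-dimensional illustration of the theta-scheme idea, not the
    cited paper's exact SPDE construction."""
    xi = sample_reference(rng)                          # draw from the Gaussian reference N(0, C)
    v = np.sqrt(1.0 - beta**2) * u + beta * xi          # proposal preserves the reference measure
    log_ratio = phi(u) - phi(v)                         # accept with prob min(1, exp(Phi(u) - Phi(v)))
    return (v, True) if np.log(rng.uniform()) < log_ratio else (u, False)
```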
Optimal scalings for local Metropolis--Hastings chains on nonproduct targets in high dimensions
- Mathematics, Computer Science
- 2009
We investigate local MCMC algorithms, namely the random-walk Metropolis and the Langevin algorithms, and identify the optimal choice of the local step-size as a function of the dimension $n$ of the…