MCMC Using Hamiltonian Dynamics

@article{Neal2011MCMCUH,
  title={MCMC Using Hamiltonian Dynamics},
  author={Radford M. Neal},
  journal={arXiv: Computation},
  year={2011},
  pages={139-188}
}
Hamiltonian dynamics can be used to produce distant proposals for the Metropolis algorithm, thereby avoiding the slow exploration of the state space that results from the diffusive behaviour of simple random-walk proposals. Though originating in physics, Hamiltonian dynamics can be applied to most problems with continuous state spaces by simply introducing fictitious "momentum" variables. A key to its usefulness is that Hamiltonian dynamics preserves volume, and its trajectories can thus be… 
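The mechanism sketched in the abstract (auxiliary momentum variables, volume-preserving dynamics simulated numerically, and a Metropolis accept/reject on the trajectory endpoint) can be illustrated with a minimal sampler. This is a sketch under simplifying assumptions: an identity mass matrix, standard Gaussian momenta, the leapfrog integrator, and a toy Gaussian target; the names `leapfrog` and `hmc` are illustrative, not from the paper.

```python
import numpy as np

def leapfrog(q, p, grad_U, eps, L):
    """Simulate Hamiltonian dynamics for L leapfrog steps of size eps."""
    q, p = q.copy(), p.copy()
    p -= 0.5 * eps * grad_U(q)        # initial half step for momentum
    for _ in range(L - 1):
        q += eps * p                  # full step for position
        p -= eps * grad_U(q)          # full step for momentum
    q += eps * p
    p -= 0.5 * eps * grad_U(q)        # final half step for momentum
    return q, -p                      # negated momentum makes the proposal reversible

def hmc(U, grad_U, q0, eps=0.1, L=20, n_samples=2000, seed=0):
    """HMC: leapfrog trajectories used as distant Metropolis proposals."""
    rng = np.random.default_rng(seed)
    q = np.asarray(q0, dtype=float)
    samples = []
    for _ in range(n_samples):
        p = rng.standard_normal(q.shape)              # fresh momentum draw
        q_new, p_new = leapfrog(q, p, grad_U, eps, L)
        # Accept with probability min(1, exp(H(q, p) - H(q', p'))).
        h_old = U(q) + 0.5 * p @ p
        h_new = U(q_new) + 0.5 * p_new @ p_new
        if rng.random() < np.exp(min(0.0, h_old - h_new)):
            q = q_new
        samples.append(q)
    return np.array(samples)

# Toy target: standard 2-D Gaussian with potential U(q) = q.q / 2.
samples = hmc(lambda q: 0.5 * q @ q, lambda q: q, np.zeros(2))
```

Larger L gives more distant proposals, at the cost of more gradient evaluations per iteration.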


Stochastic Gradient Hamiltonian Monte Carlo
Introduces a variant that uses second-order Langevin dynamics with a friction term that counteracts the effects of the noisy gradient, maintaining the desired target distribution as the invariant distribution.
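A minimal sketch of the friction idea summarized in this entry, assuming the standard SGHMC parametrization (position moves along the momentum; the momentum follows a noisy gradient, is damped by a friction coefficient C, and receives injected noise of variance 2(C - B_hat)*eps, where B_hat estimates the gradient noise). The function name and toy target are illustrative, not from the paper.

```python
import numpy as np

def sghmc_step(theta, r, noisy_grad_U, eps, C, B_hat, rng):
    """One SGHMC update: friction C damps the momentum so that the
    injected noise plus gradient noise keeps the target invariant."""
    theta = theta + eps * r
    noise = rng.normal(0.0, np.sqrt(2.0 * (C - B_hat) * eps))
    r = r - eps * noisy_grad_U(theta) - eps * C * r + noise
    return theta, r

# Toy target N(0, 1): grad U(theta) = theta, plus synthetic mini-batch noise.
rng = np.random.default_rng(42)
noisy_grad = lambda th: th + rng.normal(0.0, 0.5)
theta, r, draws = 0.0, 0.0, []
for _ in range(60_000):
    theta, r = sghmc_step(theta, r, noisy_grad, eps=0.05, C=1.0, B_hat=0.0, rng=rng)
    draws.append(theta)
```

Setting B_hat = 0 leaves the friction alone to absorb the gradient noise, which is adequate here because the injected noise dominates at this step size.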
On Irreversible Metropolis Sampling Related to Langevin Dynamics
Shows that as the step size tends to 0, the HAMS proposal satisfies a class of stochastic differential equations that includes Langevin dynamics as a special case, and provides theoretical results for HAMS, including algebraic properties of the acceptance probability, the stationary variance and expected acceptance rate under a product Gaussian target distribution, and the convergence rate under a standard Gaussian target.
Metropolis Adjusted Langevin Trajectories: a robust alternative to Hamiltonian Monte Carlo
Presents the Langevin diffusion as a way to control sample autocorrelations by inducing randomness in Hamiltonian trajectories through a continuous refreshment of the velocities, and introduces a robust alternative to HMC built upon these dynamics, named Metropolis Adjusted Langevin Trajectories (MALT).
Connecting the Dots: Towards Continuous Time Hamiltonian Monte Carlo
Introduces continuous time Hamiltonian Monte Carlo as a powerful alternative to Markov chain Monte Carlo methods for continuous target distributions. The method is constructed in two steps: first …
A splitting Hamiltonian Monte Carlo method for efficient sampling
Proposes a splitting Hamiltonian Monte Carlo (SHMC) algorithm that can be computationally efficient when combined with a random mini-batch strategy, and proves that the error in the Hamiltonian induced by the random-batch approximation is O(√Δt) in the strong sense and O(Δt) in the weak sense.
Hamiltonian Monte Carlo with Energy Conserving Subsampling
This article shows that efficient subsampling HMC for the parameters is possible if both the dynamics and the acceptance probability are computed from the same data subsample in each complete HMC iteration.
Hamiltonian Dynamics with Non-Newtonian Momentum for Rapid Sampling
The proposed Energy Sampling Hamiltonian (ESH) dynamics have a simple form that can be solved with existing ODE solvers, but the authors derive a specialized solver that exhibits much better performance.
Consistency and Fluctuations For Stochastic Gradient Langevin Dynamics
Proves that, under verifiable assumptions, the SGLD algorithm is consistent, satisfies a central limit theorem (CLT), and has an asymptotic bias-variance decomposition characterized by an explicit functional of the step-size sequence (δ_m)_{m≥0}.
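The step-size sequence (δ_m)_{m≥0} in that result is the tuning object of SGLD itself: each update moves half a step along a noisy gradient of log π and injects Gaussian noise of variance δ_m, with no Metropolis correction. A minimal sketch with a decreasing sequence (function and variable names are illustrative):

```python
import numpy as np

def sgld(noisy_grad_log_pi, theta0, step_sizes, seed=0):
    """SGLD: theta <- theta + (delta_m / 2) * noisy_grad_log_pi(theta)
    + N(0, delta_m), iterated over a given step-size sequence."""
    rng = np.random.default_rng(seed)
    theta, out = float(theta0), []
    for delta in step_sizes:
        theta += 0.5 * delta * noisy_grad_log_pi(theta, rng)
        theta += rng.normal(0.0, np.sqrt(delta))   # injected Langevin noise
        out.append(theta)
    return np.array(out)

# Toy target N(0, 1): grad log pi(theta) = -theta, plus mini-batch noise.
noisy_grad = lambda th, rng: -th + rng.normal(0.0, 0.1)
deltas = 0.05 * (1.0 + np.arange(40_000) / 1_000.0) ** -0.33   # decreasing (delta_m)
samples = sgld(noisy_grad, 0.0, deltas)
```

The decreasing schedule trades off discretization bias (small δ_m) against mixing speed (large δ_m), which is exactly the trade-off the cited analysis quantifies.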
Quasi-Newton Hamiltonian Monte Carlo
A theoretical analysis guarantees that the dynamics leave the target distribution invariant, and the proposed quasi-Newton Hamiltonian Monte Carlo (QNHMC) algorithm traverses the parameter space more efficiently than standard HMC and produces a less correlated series of samples.
Hybrid Monte Carlo on Hilbert spaces
...

References

Showing 1-10 of 77 references
Riemannian Manifold Hamiltonian Monte Carlo
The paper proposes a Riemannian Manifold Hamiltonian Monte Carlo sampler to resolve the shortcomings of existing Monte Carlo algorithms when sampling from target densities that may be high …
Perfect sampling using bounding chains
In Monte Carlo simulation, samples are drawn from a distribution to estimate properties of the distribution that are too difficult to compute analytically. This has applications in numerous fields, …
The theory of hybrid stochastic algorithms
These lectures introduce the family of Hybrid Stochastic Algorithms for performing Monte Carlo calculations in Quantum Field Theory, and consider the Hybrid and Langevin algorithms from the viewpoint that they are approximate versions of the Hybrid Monte Carlo method.
Annealed importance sampling
It is shown how one can use the Markov chain transitions for such an annealing sequence to define an importance sampler, which can be seen as a generalization of a recently-proposed variant of sequential importance sampling.
Probabilistic Inference Using Markov Chain Monte Carlo Methods
The role of probabilistic inference in artificial intelligence is outlined, the theory of Markov chains is presented, and various Markov chain Monte Carlo algorithms are described, along with a number of supporting techniques.
Hamiltonian evolution for the hybrid Monte Carlo algorithm
Sampling from multimodal distributions using tempered transitions
A new Markov chain sampling method appropriate for distributions with isolated modes that uses a series of distributions that interpolate between the distribution of interest and a distribution for which sampling is easier, with the advantage that it does not require approximate values for the normalizing constants of these distributions.
Optimal scaling of discrete approximations to Langevin diffusions
Proves an asymptotic diffusion limit theorem and shows that, as a function of dimension n, the complexity of the algorithm is O(n^{1/3}), which compares favourably with the O(n) complexity of random-walk Metropolis algorithms.
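The discrete approximation analyzed there is the Langevin proposal: a drift step of size ε²/2 along grad log π plus N(0, ε²) noise, made exact by a Metropolis correction (MALA). A minimal sketch with illustrative names:

```python
import numpy as np

def mala(log_pi, grad_log_pi, x0, eps, n_samples, seed=1):
    """Metropolis-adjusted Langevin: an Euler step of the Langevin
    diffusion as the proposal, corrected by Metropolis accept/reject."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)

    def log_q(to, frm):  # log proposal density (up to a constant)
        mu = frm + 0.5 * eps**2 * grad_log_pi(frm)
        return -np.sum((to - mu) ** 2) / (2.0 * eps**2)

    out = []
    for _ in range(n_samples):
        prop = x + 0.5 * eps**2 * grad_log_pi(x) + eps * rng.standard_normal(x.shape)
        log_a = log_pi(prop) - log_pi(x) + log_q(x, prop) - log_q(prop, x)
        if rng.random() < np.exp(min(0.0, log_a)):
            x = prop
        out.append(x)
    return np.array(out)

# Toy target: standard 1-D Gaussian.
samples = mala(lambda x: -0.5 * np.sum(x**2), lambda x: -x, np.zeros(1), 0.9, 20_000)
```

The accept/reject step removes the discretization bias of the Euler proposal, so the chain targets π exactly for any ε.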
...