# Gradient-free Hamiltonian Monte Carlo with Efficient Kernel Exponential Families

@inproceedings{Strathmann2015GradientfreeHM, title={Gradient-free Hamiltonian Monte Carlo with Efficient Kernel Exponential Families}, author={Heiko Strathmann and D. Sejdinovic and Samuel Livingstone and Zolt{\'a}n Szab{\'o} and Arthur Gretton}, booktitle={NIPS}, year={2015} }

We propose Kernel Hamiltonian Monte Carlo (KMC), a gradient-free adaptive MCMC algorithm based on Hamiltonian Monte Carlo (HMC). On target densities where classical HMC is not an option due to intractable gradients, KMC adaptively learns the target's gradient structure by fitting an exponential family model in a Reproducing Kernel Hilbert Space. Computational costs are reduced by two novel efficient approximations to this gradient. While being asymptotically exact, KMC mimics HMC in terms of…
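The core idea — run HMC leapfrog dynamics with a learned surrogate gradient, while keeping the exact (unnormalized) density in the accept/reject step — can be sketched as follows. This is a simplified illustration, not the paper's method: for brevity it substitutes the analytic gradient of a Gaussian KDE fitted to past chain samples in place of the paper's kernel exponential family score estimator, and all function names and parameters are hypothetical.

```python
import numpy as np

def kde_log_grad(x, samples, h=0.5):
    # Gradient of the log of a Gaussian KDE fitted to past chain samples.
    # A toy stand-in for the kernel exponential family surrogate in KMC.
    diffs = samples - x                                   # (n, d)
    w = np.exp(-0.5 * np.sum(diffs ** 2, axis=1) / h ** 2)
    w /= w.sum()
    return (w[:, None] * diffs).sum(axis=0) / h ** 2

def surrogate_hmc_step(x, log_p, samples, eps=0.1, L=10, rng=None):
    # One HMC proposal whose leapfrog integrator uses ONLY the surrogate
    # gradient; the exact log_p enters solely through the Metropolis
    # accept/reject step, which is what keeps the chain asymptotically exact.
    rng = np.random.default_rng() if rng is None else rng
    p = rng.standard_normal(x.shape)
    x_new, p_new = x.astype(float).copy(), p.copy()
    p_new = p_new + 0.5 * eps * kde_log_grad(x_new, samples)
    for i in range(L):
        x_new = x_new + eps * p_new
        if i < L - 1:
            p_new = p_new + eps * kde_log_grad(x_new, samples)
    p_new = p_new + 0.5 * eps * kde_log_grad(x_new, samples)
    # Metropolis correction with the exact Hamiltonian H = -log_p + |p|^2/2.
    log_accept = log_p(x_new) - log_p(x) - 0.5 * (p_new @ p_new - p @ p)
    return x_new if np.log(rng.uniform()) < log_accept else x
```

Because the surrogate only shapes the proposal, any gradient error degrades acceptance rates rather than biasing the stationary distribution.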

## 68 Citations

### Modified Hamiltonian Monte Carlo for Bayesian inference

- Computer Science, Stat. Comput.
- 2020

It is shown that performance of HMC can be significantly improved by incorporating importance sampling and an irreversible part of the dynamics into a chain, and is called Mix & Match Hamiltonian Monte Carlo (MMHMC).

### Maximum Conditional Entropy Hamiltonian Monte Carlo Sampler

- Computer Science, SIAM J. Sci. Comput.
- 2021

A Kolmogorov-Sinai entropy (KSE) based design criterion is proposed to optimize algorithm parameters, which can avoid some potential issues in the often used jumping-distance based measures.

### Hamiltonian Monte Carlo Acceleration Using Neural Network Surrogate functions

- Computer Science
- 2015

An efficient and scalable computational technique for a state-of-the-art Markov chain Monte Carlo method, Hamiltonian Monte Carlo (HMC), that explores and exploits the regularity in the parameter space of the underlying probabilistic model to construct an effective approximation of the collective geometric and statistical properties of the whole observed data.

### Kernel Sequential Monte Carlo

- Computer Science, ECML/PKDD
- 2017

Kernel sequential Monte Carlo (KSMC), a framework for sampling from static target densities, is proposed, which combines the strengths of sequential Monte Carlo and kernel methods with the emulator's ability to represent targets that exhibit high degrees of nonlinearity.

### Maximizing conditional entropy of Hamiltonian Monte Carlo sampler

- Computer Science
- 2019

A conditional entropy based design criterion is proposed to optimize the integration time of Hamiltonian Monte Carlo sampler for near-Gaussian distributions and is able to derive the optimal integration time with respect to the conditional entropy criterion analytically.

### Slice Sampling on Hamiltonian Trajectories

- Mathematics, Physics, ICML
- 2016

Hamiltonian slice sampling is presented, which allows slice sampling to be carried out along Hamiltonian trajectories, or transformations thereof, and offers advantages over Hamiltonian Monte Carlo, in that it has fewer tunable hyperparameters and does not require gradient information.

### Probabilistic Path Hamiltonian Monte Carlo

- Mathematics, ICML
- 2017

Probabilistic Path HMC (PPHMC) is developed as a first step to sampling distributions on spaces with intricate combinatorial structure, and a surrogate function to ease the transition across a boundary on which the log-posterior has discontinuous derivatives can greatly improve efficiency.

### SpHMC: Spectral Hamiltonian Monte Carlo

- Computer Science, AAAI
- 2019

This work proposes a novel SGHMC sampler, Spectral Hamiltonian Monte Carlo (SpHMC), that produces high-dimensional sparse representations of given datasets through sparse sensing and SGHMC.

### Scalable Hamiltonian Monte Carlo via Surrogate Methods

- Computer Science
- 2016

A random network surrogate architecture is proposed which can effectively capture the collective properties of large data sets or complex models with scalability, flexibility and efficiency and an approximate inference framework that combines the advantages of both variational Bayes and Markov chain Monte Carlo methods is proposed.

## References

SHOWING 1-10 OF 34 REFERENCES

### Hamiltonian ABC

- Computer Science, UAI
- 2015

This work introduces Hamiltonian ABC (HABC), a set of likelihood-free algorithms that apply recent advances in scaling Bayesian learning using Hamiltonian Monte Carlo (HMC) and stochastic gradients, and finds that a small number of forward simulations can effectively approximate the ABC gradient.

### Stochastic Gradient Hamiltonian Monte Carlo

- Computer Science, ICML
- 2014

A variant is introduced that uses second-order Langevin dynamics with a friction term counteracting the effects of the noisy gradient, maintaining the desired target distribution as the invariant distribution.
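The friction-corrected dynamics described above can be sketched in a few lines. This is a hedged illustration of the SGHMC update rule, not a reproduction of the reference's implementation: the function name and parameters are hypothetical, and the gradient-noise estimate (often written B-hat) is taken to be zero for simplicity.

```python
import numpy as np

def sghmc(grad_log_p_stochastic, x0, eps=1e-2, C=1.0, n_steps=1000, rng=None):
    # Stochastic Gradient HMC sketch: second-order Langevin dynamics with a
    # friction term C*v that counteracts the noise injected by the stochastic
    # gradient. Injected noise has variance 2*C*eps (gradient-noise estimate
    # assumed zero here), so the target remains the invariant distribution.
    rng = np.random.default_rng() if rng is None else rng
    x = np.array(x0, dtype=float)
    v = np.zeros_like(x)
    out = []
    for _ in range(n_steps):
        x = x + eps * v
        noise = np.sqrt(2 * C * eps) * rng.standard_normal(x.shape)
        v = v + eps * grad_log_p_stochastic(x) - eps * C * v + noise
        out.append(x.copy())
    return np.array(out)
```

Without the `- eps * C * v` friction term, the stochastic-gradient noise would accumulate in the momentum and the sampler would drift away from the target.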

### Hamiltonian Monte Carlo Acceleration Using Neural Network Surrogate functions

- Computer Science
- 2015

An efficient and scalable computational technique for a state-of-the-art Markov chain Monte Carlo method, Hamiltonian Monte Carlo (HMC), that explores and exploits the regularity in the parameter space of the underlying probabilistic model to construct an effective approximation of the collective geometric and statistical properties of the whole observed data.

### Kernel Adaptive Metropolis-Hastings

- Computer Science, ICML
- 2014

Kernel Adaptive Metropolis-Hastings outperforms competing fixed and adaptive samplers on multivariate, highly nonlinear target distributions, arising in both real-world and synthetic examples.

### The Fundamental Incompatibility of Hamiltonian Monte Carlo and Data Subsampling

- Physics
- 2015

Leveraging the coherent exploration of Hamiltonian flow, Hamiltonian Monte Carlo produces computationally efficient Monte Carlo estimators, even with respect to complex and high-dimensional target…

### Optimizing The Integrator Step Size for Hamiltonian Monte Carlo

- Physics
- 2014

Hamiltonian Monte Carlo can provide powerful inference in complex statistical problems, but ultimately its performance is sensitive to various tuning parameters. In this paper we use the underlying…

### The pseudo-marginal approach for efficient Monte Carlo computations

- Computer Science
- 2009

A powerful and flexible MCMC algorithm for stochastic simulation that builds on a pseudo-marginal method, showing how algorithms which are approximations to an idealized marginal algorithm, can share the same marginal stationary distribution as the idealized method.
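The key mechanism described above — substituting an unbiased likelihood estimator into the Metropolis-Hastings ratio while recycling the current state's estimate — can be sketched as follows. This is an illustrative sketch with hypothetical names, not the reference's algorithm; a real application would plug in a non-negative unbiased likelihood estimator (e.g. importance sampling or a particle filter).

```python
import numpy as np

def pseudo_marginal_mh(log_lik_hat, log_prior, x0, n_steps=1000,
                       step=0.5, rng=None):
    # Pseudo-marginal Metropolis-Hastings: the intractable likelihood is
    # replaced by a noisy estimate. Crucially, the estimate attached to the
    # current state is RECYCLED across iterations, never re-drawn; this is
    # what preserves the exact marginal stationary distribution.
    rng = np.random.default_rng() if rng is None else rng
    x = np.array(x0, dtype=float)
    log_post_hat = log_lik_hat(x, rng) + log_prior(x)
    chain = []
    for _ in range(n_steps):
        x_prop = x + step * rng.standard_normal(x.shape)
        log_post_prop = log_lik_hat(x_prop, rng) + log_prior(x_prop)
        if np.log(rng.uniform()) < log_post_prop - log_post_hat:
            x, log_post_hat = x_prop, log_post_prop
        chain.append(x.copy())
    return np.array(chain)
```

Re-estimating the current state's likelihood at every iteration, by contrast, would generally change the stationary distribution.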

### Gaussian Processes to Speed up Hybrid Monte Carlo for Expensive Bayesian Integrals

- Computer Science
- 2003

This work proposes to use a Gaussian Process model of the (log of the) posterior for most of the computations required by HMC, allowing Bayesian treatment of models with posteriors that are computationally demanding, such as models involving computer simulation.

### A tutorial on adaptive MCMC

- Computer Science, Stat. Comput.
- 2008

This work reviews adaptation criteria and the useful framework of stochastic approximation, which allows one to systematically optimise commonly used criteria, and proposes a series of novel adaptive algorithms that prove robust and reliable in practice.

### Adaptive proposal distribution for random walk Metropolis algorithm

- Computer Science, Comput. Stat.
- 1999

Although the stationary distribution of the AP algorithm is slightly biased, it appears to provide an efficient tool for reasonably low-dimensional problems, such as those typically encountered in non-linear regression in the natural sciences.