Gradient-free Hamiltonian Monte Carlo with Efficient Kernel Exponential Families

@inproceedings{Strathmann2015GradientfreeHM,
  title={Gradient-free Hamiltonian Monte Carlo with Efficient Kernel Exponential Families},
  author={Heiko Strathmann and D. Sejdinovic and Samuel Livingstone and Zolt{\'a}n Szab{\'o} and Arthur Gretton},
  booktitle={NIPS},
  year={2015}
}
We propose Kernel Hamiltonian Monte Carlo (KMC), a gradient-free adaptive MCMC algorithm based on Hamiltonian Monte Carlo (HMC). On target densities where classical HMC is not an option due to intractable gradients, KMC adaptively learns the target's gradient structure by fitting an exponential family model in a Reproducing Kernel Hilbert Space. Computational costs are reduced by two novel efficient approximations to this gradient. While being asymptotically exact, KMC mimics HMC in terms of… 
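
To make the gradient-surrogate idea concrete, here is a minimal sketch in the spirit of the paper's finite-feature approximation, assuming random Fourier features for a Gaussian kernel and closed-form score matching; all names and constants are illustrative, not the authors' code.

```python
# Illustrative sketch (not the authors' implementation): fit an
# unnormalised log-density f(x) = theta @ phi(x) by score matching,
# with random Fourier features phi approximating a Gaussian kernel,
# then expose the estimated gradient of log p for use in leapfrog steps.
import numpy as np

def make_rff(dim, n_features=200, bandwidth=1.0, seed=0):
    rng = np.random.default_rng(seed)
    omega = rng.normal(scale=1.0 / bandwidth, size=(n_features, dim))
    phase = rng.uniform(0.0, 2 * np.pi, size=n_features)
    scale = np.sqrt(2.0 / n_features)

    def grad_phi(x):
        # Jacobian of phi(x) = scale * cos(omega @ x + phase), shape (m, dim)
        return -scale * np.sin(omega @ x + phase)[:, None] * omega

    def laplace_phi(x):
        # sum over dimensions of second derivatives, shape (m,)
        return -scale * np.cos(omega @ x + phase) * (omega ** 2).sum(axis=1)

    return grad_phi, laplace_phi

def fit_score(samples, grad_phi, laplace_phi, lam=1e-3):
    # Score matching is closed-form for exponential-family models:
    # minimise b @ theta + 0.5 * theta @ C @ theta, so (C + lam*I) theta = -b
    m = len(laplace_phi(samples[0]))
    C, b = np.zeros((m, m)), np.zeros(m)
    for x in samples:
        G = grad_phi(x)
        C += G @ G.T
        b += laplace_phi(x)
    theta = np.linalg.solve(C / len(samples) + lam * np.eye(m),
                            -b / len(samples))
    return lambda x: grad_phi(x).T @ theta   # estimated grad of log p(x)
```

The estimated gradient can then stand in for the exact one when simulating leapfrog proposals, with the accept/reject step still evaluated on the true unnormalised target; that correction is what keeps the chain asymptotically exact.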

Modified Hamiltonian Monte Carlo for Bayesian inference

It is shown that the performance of HMC can be significantly improved by incorporating importance sampling and an irreversible part of the dynamics into the chain; the resulting method is called Mix & Match Hamiltonian Monte Carlo (MMHMC).

Maximum Conditional Entropy Hamiltonian Monte Carlo Sampler

A Kolmogorov-Sinai entropy (KSE) based design criterion is proposed to optimize algorithm parameters, which can avoid some potential issues in the often used jumping-distance based measures.

Hamiltonian Monte Carlo Acceleration Using Neural Network Surrogate functions

An efficient and scalable computational technique is proposed for a state-of-the-art Markov chain Monte Carlo method, namely Hamiltonian Monte Carlo (HMC), exploring and exploiting the regularity in parameter space of the underlying probabilistic model to construct an effective approximation of the collective geometric and statistical properties of the whole observed data.

Kernel Sequential Monte Carlo

Kernel sequential Monte Carlo (KSMC), a framework for sampling from static target densities, is proposed, combining the strengths of sequential Monte Carlo, kernel methods, and the emulator's ability to represent targets that exhibit high degrees of nonlinearity.

Pseudo-Marginal Hamiltonian Monte Carlo

An original MCMC algorithm, termed pseudo-marginal HMC, is proposed; it approximates the HMC algorithm targeting the marginal posterior of the parameters and can significantly outperform both standard HMC and pseudo-marginal MH schemes.

Slice Sampling on Hamiltonian Trajectories

Hamiltonian slice sampling is presented, which allows slice sampling to be carried out along Hamiltonian trajectories, or transformations thereof, and offers advantages over Hamiltonian Monte Carlo, in that it has fewer tunable hyperparameters and does not require gradient information.
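
In the special case of a standard Gaussian prior with Gaussian momentum, the Hamiltonian trajectory is an exact ellipse and the idea reduces to elliptical slice sampling; a minimal sketch of that special case (names are mine, not the paper's) follows.

```python
import numpy as np

def elliptical_slice_step(x, log_lik, rng):
    """One slice-sampling update along the exact (elliptical) Hamiltonian
    trajectory of a standard Gaussian prior; target ∝ N(0, I) * exp(log_lik).
    Gradient-free, with no step-size hyperparameter to tune."""
    nu = rng.standard_normal(x.shape)              # auxiliary Gaussian draw
    log_y = log_lik(x) + np.log(rng.uniform())     # slice height
    theta = rng.uniform(0.0, 2 * np.pi)            # angle along the ellipse
    lo, hi = theta - 2 * np.pi, theta
    while True:
        x_prop = x * np.cos(theta) + nu * np.sin(theta)
        if log_lik(x_prop) > log_y:
            return x_prop
        lo, hi = (theta, hi) if theta < 0 else (lo, theta)  # shrink bracket
        theta = rng.uniform(lo, hi)
```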

SpHMC: Spectral Hamiltonian Monte Carlo

This work proposes a novel SGHMC sampler, namely Spectral Hamiltonian Monte Carlo (SpHMC), that produces high-dimensional sparse representations of given datasets through sparse sensing and SGHMC.

Scalable Hamiltonian Monte Carlo via Surrogate Methods

A random network surrogate architecture is proposed that can effectively capture the collective properties of large data sets or complex models with scalability, flexibility, and efficiency, together with an approximate inference framework that combines the advantages of both variational Bayes and Markov chain Monte Carlo methods.

Mix & Match Hamiltonian Monte Carlo

It is shown that the performance of HMC can be dramatically improved by incorporating importance sampling and an irreversible part of the dynamics into the chain; the resulting generalized HMC importance sampler is called Mix & Match Hamiltonian Monte Carlo.
...

References

Stochastic Gradient Hamiltonian Monte Carlo

A variant is introduced that uses second-order Langevin dynamics with a friction term that counteracts the effects of the noisy gradient, maintaining the desired target distribution as the invariant distribution.
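
A minimal sketch of that update, assuming a scalar friction coefficient and taking the paper's gradient-noise estimate to be zero (both simplifications of the general form):

```python
import numpy as np

def sghmc(grad_u_noisy, theta0, n_steps=10_000, eps=1e-3, friction=0.1,
          seed=0):
    """Second-order Langevin dynamics with friction: the -friction*eps*v
    term damps the momentum, and the injected noise is matched to it so
    the desired target remains the invariant distribution."""
    rng = np.random.default_rng(seed)
    theta = np.array(theta0, dtype=float)
    v = np.zeros_like(theta)
    samples = []
    for _ in range(n_steps):
        v = (v - eps * grad_u_noisy(theta)           # noisy gradient of U
               - friction * eps * v                  # friction term
               + rng.normal(scale=np.sqrt(2 * friction * eps),
                            size=theta.shape))       # matched noise
        theta = theta + v
        samples.append(theta.copy())
    return np.array(samples)
```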

Hamiltonian Monte Carlo Acceleration Using Neural Network Surrogate functions

An efficient and scalable computational technique is proposed for a state-of-the-art Markov chain Monte Carlo method, namely Hamiltonian Monte Carlo (HMC), exploring and exploiting the regularity in parameter space of the underlying probabilistic model to construct an effective approximation of the collective geometric and statistical properties of the whole observed data.

Kernel Adaptive Metropolis-Hastings

Kernel Adaptive Metropolis-Hastings outperforms competing fixed and adaptive samplers on multivariate, highly nonlinear target distributions arising in both real-world and synthetic examples.

The Fundamental Incompatibility of Hamiltonian Monte Carlo and Data Subsampling

Leveraging the coherent exploration of Hamiltonian flow, Hamiltonian Monte Carlo produces computationally efficient Monte Carlo estimators, even with respect to complex and high-dimensional target distributions.

Optimizing The Integrator Step Size for Hamiltonian Monte Carlo

Hamiltonian Monte Carlo can provide powerful inference in complex statistical problems, but ultimately its performance is sensitive to various tuning parameters. In this paper we use the underlying…

The pseudo-marginal approach for efficient Monte Carlo computations

A powerful and flexible MCMC algorithm for stochastic simulation that builds on a pseudo-marginal method, showing how algorithms that are approximations to an idealized marginal algorithm can share the same marginal stationary distribution as the idealized method.
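
The mechanism is simple enough to sketch: replace the exact likelihood in Metropolis-Hastings with any non-negative unbiased estimator, and recycle the current state's estimate across iterations (illustrative code, with my own names):

```python
import numpy as np

def pseudo_marginal_mh(lik_hat, log_prior, x0, n_iter=5000, step=0.5,
                       seed=0):
    """Pseudo-marginal Metropolis-Hastings: a non-negative, unbiased
    likelihood estimator lik_hat(x) stands in for the exact likelihood.
    Recycling the current state's estimate is what keeps the marginal
    stationary distribution exact."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    lhat = lik_hat(x)                      # noisy but unbiased estimate
    chain = []
    for _ in range(n_iter):
        x_prop = x + step * rng.standard_normal(x.shape)
        lhat_prop = lik_hat(x_prop)
        log_ratio = (np.log(lhat_prop) - np.log(lhat)
                     + log_prior(x_prop) - log_prior(x))
        if np.log(rng.uniform()) < log_ratio:
            x, lhat = x_prop, lhat_prop    # estimate travels with the state
        chain.append(x.copy())
    return np.array(chain)
```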

MCMC Using Hamiltonian Dynamics

Hamiltonian dynamics can be used to produce distant proposals for the Metropolis algorithm, thereby avoiding the slow exploration of the state space that results from the diffusive behaviour of simple random-walk proposals.
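
A minimal sketch of one such transition, assuming an identity mass matrix: leapfrog steps simulate the dynamics, and a Metropolis correction on the joint Hamiltonian removes the discretization bias.

```python
import numpy as np

def hmc_step(x, log_p, grad_log_p, eps=0.1, n_leapfrog=20, rng=None):
    """One HMC transition: resample momentum, integrate Hamiltonian
    dynamics with leapfrog, then accept/reject on the joint energy."""
    rng = rng or np.random.default_rng(0)
    p = rng.standard_normal(x.shape)                # fresh momentum
    x_new, p_new = x.copy(), p.copy()
    p_new = p_new + 0.5 * eps * grad_log_p(x_new)   # half momentum step
    for _ in range(n_leapfrog - 1):
        x_new = x_new + eps * p_new
        p_new = p_new + eps * grad_log_p(x_new)
    x_new = x_new + eps * p_new
    p_new = p_new + 0.5 * eps * grad_log_p(x_new)   # final half step
    # Metropolis correction: H(x, p) = -log p(x) + |p|^2 / 2
    log_accept = (log_p(x_new) - 0.5 * p_new @ p_new
                  - log_p(x) + 0.5 * p @ p)
    return x_new if np.log(rng.uniform()) < log_accept else x
```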

A tutorial on adaptive MCMC

This work reviews adaptation criteria and the useful framework of stochastic approximation, which allows one to systematically optimise commonly used criteria, and proposes a series of novel adaptive algorithms that prove robust and reliable in practice.
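
As one concrete instance of the stochastic-approximation framework, the proposal scale of a random-walk Metropolis sampler can be steered toward a target acceptance rate with a diminishing gain; a sketch under my own parameter choices:

```python
import numpy as np

def rwm_with_sa_adaptation(log_p, x0, n_iter=5000, target_accept=0.234,
                           seed=0):
    """Random-walk Metropolis with the log step size adapted by a
    Robbins-Monro update: the gain t**-0.6 decays, so adaptation
    vanishes asymptotically and ergodicity is preserved."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    log_step = 0.0
    chain = []
    for t in range(1, n_iter + 1):
        x_prop = x + np.exp(log_step) * rng.standard_normal(x.shape)
        accept_prob = min(1.0, np.exp(log_p(x_prop) - log_p(x)))
        if rng.uniform() < accept_prob:
            x = x_prop
        # nudge the scale toward the desired acceptance rate
        log_step += t ** -0.6 * (accept_prob - target_accept)
        chain.append(x.copy())
    return np.array(chain)
```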

Adaptive proposal distribution for random walk Metropolis algorithm

Although the stationary distribution of the AP algorithm is slightly biased, it appears to provide an efficient tool for, e.g., reasonably low dimensional problems, as typically encountered in non-linear regression problems in natural sciences.
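
A sketch of that adaptive-proposal idea, with the covariance re-estimated from a sliding window of recent states; the finite window is exactly what introduces the slight stationary-distribution bias mentioned above (names and constants are illustrative):

```python
import numpy as np

def adaptive_proposal_rwm(log_p, x0, n_iter=5000, window=200, scale=2.4,
                          seed=0):
    """Random-walk Metropolis whose Gaussian proposal covariance is
    re-estimated from a sliding window of recent chain states."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    d = x.size
    chain = [x.copy()]
    for _ in range(n_iter):
        recent = np.array(chain[-window:])
        if len(recent) > 2 * d:
            cov = np.cov(recent.T) + 1e-6 * np.eye(d)  # regularised estimate
        else:
            cov = np.eye(d)                            # burn-in fallback
        x_prop = rng.multivariate_normal(x, (scale ** 2 / d) * cov)
        if np.log(rng.uniform()) < log_p(x_prop) - log_p(x):
            x = x_prop
        chain.append(x.copy())
    return np.array(chain)
```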

Rates of convergence of the Hastings and Metropolis algorithms

Recent results in Markov chain theory are applied to Hastings and Metropolis algorithms with either independent or symmetric candidate distributions, and it is shown that geometric convergence essentially occurs if and only if $\pi$ has geometric tails.