
Roll-back Hamiltonian Monte Carlo

@article{Yi2017RollbackHM,
  title={Roll-back Hamiltonian Monte Carlo},
  author={Kexin Yi and Finale Doshi-Velez},
  journal={arXiv: Machine Learning},
  year={2017}
}
We propose a new framework for Hamiltonian Monte Carlo (HMC) on truncated probability distributions with smooth underlying density functions. Traditional HMC requires computing the gradient of the potential function associated with the target distribution, and therefore cannot exercise its full power on truncated distributions, whose densities lack continuity and differentiability at the truncation boundary. In our framework, we introduce a sharp sigmoid factor in the density function to approximate the probability drop at the…
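As a rough illustration of the smoothing idea in the abstract, the sketch below replaces the hard indicator of a 1-D standard normal truncated to x ≥ 0 with a sharp sigmoid factor, so the log-density becomes differentiable everywhere and ordinary gradient-based HMC applies. The 1-D target, the sharpness parameter k, and all function names are illustrative assumptions, and the paper's roll-back mechanism itself is not reproduced here.

```python
import numpy as np

# Minimal sketch (not the authors' code): the hard indicator 1[x >= 0] on a
# truncated density is approximated by sigmoid(k * x), giving a log-density
# that is smooth everywhere. Target and sharpness k are illustrative choices.

def log_sigmoid(z):
    # Numerically stable log(1 / (1 + exp(-z))).
    return -np.logaddexp(0.0, -z)

def log_density(x, k=50.0):
    # log N(x; 0, 1) up to a constant, plus the smooth boundary factor.
    return -0.5 * x**2 + log_sigmoid(k * x)

def grad_log_density(x, k=50.0):
    # Uses d/dz log sigmoid(z) = sigmoid(-z) = exp(log_sigmoid(-z)).
    return -x + k * np.exp(log_sigmoid(-k * x))

# Inside the allowed region the surrogate is nearly exact; just outside it
# the density drops steeply instead of being undefined.
print(log_density(1.0), log_density(-0.2))
```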

Citations

Modified Hamiltonian Monte Carlo for Bayesian inference
It is shown that the performance of HMC can be significantly improved by incorporating importance sampling and an irreversible part of the dynamics into the chain; the resulting method is called Mix & Match Hamiltonian Monte Carlo (MMHMC).
Truncated Log-concave Sampling with Reflective Hamiltonian Monte Carlo
We introduce Reflective Hamiltonian Monte Carlo (ReHMC), an HMC-based algorithm, to sample from a log-concave distribution restricted to a convex polytope. We prove that, starting from a warm start, …
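The core operation in reflective samplers of this kind is a specular bounce off the boundary. The sketch below uses my own notation for the polytope, {x : A x ≤ b}, rather than anything taken from the ReHMC paper, and simplifies the geometry: exact variants solve for the precise hit time with the facet and continue the trajectory, where this sketch merely bounces.

```python
import numpy as np

# Illustrative reflection step for a polytope {x : A @ x <= b}. When a
# position update would leave the polytope, the momentum is reflected
# about the violated facet's normal (row of A).

def reflect(p, a):
    # Specular reflection of momentum p off the hyperplane with normal a.
    return p - 2.0 * (a @ p) / (a @ a) * a

def position_update(x, p, eps, A, b):
    x_new = x + eps * p
    slack = A @ x_new - b
    if np.any(slack > 0):
        i = int(np.argmax(slack))   # most-violated facet (a simplification:
        p = reflect(p, A[i])        # exact samplers find the first hit time
        x_new = x                   # and restart the trajectory from there)
    return x_new, p
```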
On Using Hamiltonian Monte Carlo Sampling for Reinforcement Learning Problems in High-dimension
A framework, called Hamiltonian Q-Learning, is introduced that demonstrates, both theoretically and empirically, that Q values can be learned from a dataset generated by HMC samples of actions, rewards, and state transitions, broadening the scope of RL algorithms for real-world applications.
Hamiltonian Q-Learning: Leveraging Importance-sampling for Data Efficient RL
Hamiltonian Q-Learning is introduced, a data-efficient modification of the Q-learning approach, which adopts an importance-sampling-based technique for computing the Q function and exploits the latent low-rank structure of the dynamic system.
PoRB-Nets: Poisson Process Radial Basis Function Networks
A novel prior over radial basis function networks (RBFNs) is presented that allows for independent specification of functional amplitude variance and lengthscale (i.e., smoothness), where the inverse lengthscale corresponds to the concentration of radial basis functions.
LF-PPL: A Low-Level First Order Probabilistic Programming Language for Non-Differentiable Models
A new Low-level, First-order Probabilistic Programming Language (LF-PPL), suited for models containing a mix of continuous, discrete, and/or piecewise-continuous variables, is developed, backed by a mathematical formalism that ensures that any model expressed in this language has a density with measure-zero discontinuities, maintaining the validity of the inference engine.
Markov Chain Monte Carlo Methods for Estimating Systemic Risk Allocations
In this paper, we propose a novel framework for estimating systemic risk measures and risk allocations based on Markov Chain Monte Carlo (MCMC) methods. We consider a class of allocations whose jth …

References

Showing 1–10 of 18 references
Exact Hamiltonian Monte Carlo for Truncated Multivariate Gaussians
We present a Hamiltonian Monte Carlo algorithm to sample from multivariate Gaussian distributions in which the target space is constrained by linear and quadratic inequalities or products thereof.
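What makes "exact" HMC possible for Gaussian targets is that the Hamiltonian dynamics have a closed form: for the potential U(x) = ||x||²/2 the flow is a rotation in phase space, so trajectories, and their wall-hit times, can be computed analytically instead of with a numerical integrator. A minimal sketch, with illustrative names:

```python
import numpy as np

# For a standard Gaussian potential U(x) = ||x||^2 / 2, Hamilton's equations
# (dx/dt = p, dp/dt = -x) solve in closed form as a rotation of (x, p)
# through angle t; no leapfrog discretization is needed. Hit times with a
# linear wall x_j = c then reduce to solving x_j cos t + p_j sin t = c.

def exact_gaussian_flow(x, p, t):
    return x * np.cos(t) + p * np.sin(t), p * np.cos(t) - x * np.sin(t)
```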
Spherical Hamiltonian Monte Carlo for Constrained Target Distributions
A novel Markov Chain Monte Carlo (MCMC) method is proposed that provides a general and computationally efficient framework for handling boundary conditions in statistical models with constrained probability distributions, including the truncated Gaussian, Bayesian Lasso, Bayesian bridge regression, and a copula model for identifying synchrony among multiple neurons.
MCMC Using Hamiltonian Dynamics
Hamiltonian dynamics can be used to produce distant proposals for the Metropolis algorithm, thereby avoiding the slow exploration of the state space that results from the diffusive behaviour of simple random-walk proposals.
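For reference, here is a textbook leapfrog/Metropolis HMC transition in the spirit of this chapter. The step size eps, path length n_steps, and the assumption that log_density and grad_log_density are supplied (for instance, the smoothed surrogate sketched earlier) are illustrative choices, not anything prescribed by the source.

```python
import numpy as np

# Textbook HMC transition: leapfrog integration of Hamiltonian dynamics,
# then a Metropolis accept/reject that keeps the exact target invariant.

def hmc_step(x, log_density, grad_log_density, eps=0.05, n_steps=30,
             rng=np.random.default_rng()):
    p0 = rng.normal(size=np.shape(x))                  # resample momentum
    x1, p1 = x, p0 + 0.5 * eps * grad_log_density(x)   # half momentum step
    for i in range(n_steps):
        x1 = x1 + eps * p1                              # full position step
        scale = eps if i < n_steps - 1 else 0.5 * eps   # final half step
        p1 = p1 + scale * grad_log_density(x1)
    log_accept = (log_density(x1) - 0.5 * np.sum(p1 ** 2)) \
               - (log_density(x) - 0.5 * np.sum(p0 ** 2))
    return x1 if np.log(rng.uniform()) < log_accept else x
```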
Auxiliary-variable Exact Hamiltonian Monte Carlo Samplers for Binary Distributions
We present a new approach to sample from generic binary distributions, based on an exact Hamiltonian Monte Carlo algorithm applied to a piecewise continuous augmentation of the binary distribution of interest.
The No-U-turn sampler: adaptively setting path lengths in Hamiltonian Monte Carlo
The No-U-Turn Sampler (NUTS) is introduced, an extension to HMC that eliminates the need to set the number of steps L, along with a method for adapting the step-size parameter ε on the fly based on primal-dual averaging.
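The stopping rule that gives the sampler its name is easy to state: trajectory doubling halts once the two ends of the simulated path begin to approach each other. A sketch of that check, assuming vector-valued positions and momenta (function and variable names are mine):

```python
import numpy as np

# The "no-U-turn" condition from the NUTS paper: stop growing the trajectory
# when either end starts moving back toward the other, i.e. when the
# projection of an endpoint momentum onto (x_plus - x_minus) turns negative.

def u_turn(x_minus, x_plus, p_minus, p_plus):
    dx = x_plus - x_minus
    return (dx @ p_minus) < 0 or (dx @ p_plus) < 0
```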
Introduction to Monte Carlo Methods
Monte Carlo methods play an important role in scientific computation, especially when problems have a vast phase space. In this lecture an introduction to the Monte Carlo method is given. Concepts …
A Conceptual Introduction to Hamiltonian Monte Carlo
This review provides a comprehensive conceptual account of the theoretical foundations of Hamiltonian Monte Carlo, focusing on developing a principled intuition behind the method and its optimal implementations rather than exhaustive rigor.
Bayesian Non-negative Matrix Factorization
An iterated conditional modes algorithm is presented that rivals existing state-of-the-art NMF algorithms on an image feature extraction problem and discusses how the Gibbs sampler can be used for model order selection by estimating the marginal likelihood.
Handbook of Markov Chain Monte Carlo
A Markov chain Monte Carlo based analysis of a multilevel model for functional MRI data is studied, along with applications in environmental epidemiology, educational research, and fisheries science.
Stan: A Probabilistic Programming Language
Stan is a probabilistic programming language for specifying statistical models that provides full Bayesian inference for continuous-variable models through Markov chain Monte Carlo methods such as the No-U-Turn sampler and an adaptive form of Hamiltonian Monte Carlo sampling.
…