Corpus ID: 236924591

The No-U-Turn Sampler as a Proposal Distribution in a Sequential Monte Carlo Sampler with a Near-Optimal L-Kernel

@inproceedings{Devlin2021TheNS,
  title={The No-U-Turn Sampler as a Proposal Distribution in a Sequential Monte Carlo Sampler with a Near-Optimal L-Kernel},
  author={L. J. Devlin and Paul R. Horridge and Peter L. Green and Simon Maskell},
  year={2021}
}
Markov Chain Monte Carlo (MCMC) is a powerful method for drawing samples from non-standard probability distributions and is utilised across many fields and disciplines. Methods such as the Metropolis-Adjusted Langevin Algorithm (MALA) and Hamiltonian Monte Carlo (HMC), which use gradient information to explore the target distribution, are popular variants of MCMC. The Sequential Monte Carlo (SMC) sampler is an alternative sampling method which, unlike MCMC, can readily utilise parallel computing… 
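The SMC sampler contrasted with MCMC in the abstract can be illustrated with a minimal tempering sketch. Everything below — the Gaussian prior and target, the temperature ladder, and the random-walk Metropolis move used to rejuvenate particles — is an illustrative assumption, not the paper's NUTS proposal or near-optimal L-kernel:

```python
import numpy as np

rng = np.random.default_rng(0)

def log_prior(x):   # N(0, 3^2), the initial distribution
    return -0.5 * x**2 / 9.0

def log_target(x):  # N(2, 1), the target distribution
    return -0.5 * (x - 2.0)**2

N = 2000
betas = np.linspace(0.0, 1.0, 21)     # tempering ladder: pi_b ∝ prior^(1-b) * target^b
x = rng.normal(0.0, 3.0, N)           # particles drawn from the prior
logw = np.zeros(N)

for b_prev, b in zip(betas[:-1], betas[1:]):
    # incremental importance weight for moving pi_{b_prev} -> pi_b
    logw += (b - b_prev) * (log_target(x) - log_prior(x))

    # resample (multinomial) when the effective sample size drops
    w = np.exp(logw - logw.max())
    w /= w.sum()
    if 1.0 / np.sum(w**2) < N / 2:
        x = rng.choice(x, size=N, p=w)
        logw = np.zeros(N)

    # one random-walk Metropolis move per particle, targeting pi_b;
    # these N moves are independent, which is the parallelism the abstract mentions
    def log_pi(z):
        return (1 - b) * log_prior(z) + b * log_target(z)
    prop = x + rng.normal(0.0, 0.5, N)
    accept = np.log(rng.uniform(size=N)) < log_pi(prop) - log_pi(x)
    x = np.where(accept, prop, x)

w = np.exp(logw - logw.max())
w /= w.sum()
mean_est = np.sum(w * x)
print(mean_est)  # should be close to the target mean of 2
```

The paper's contribution is to replace the simple proposal here with NUTS moves and to pair them with a near-optimal L-kernel in the weight update; this sketch only shows the generic SMC scaffolding those pieces plug into.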
1 Citation


Robust Dynamic Multi-Modal Data Fusion: A Model Uncertainty Perspective

  • Bin Liu
  • Computer Science
    IEEE Signal Processing Letters
  • 2021
This letter is concerned with multi-modal data fusion (MMDF) under unexpected modality failures in nonlinear non-Gaussian dynamic processes. An efficient framework to tackle this problem is proposed.

References

Showing 1–10 of 18 references

The No-U-turn sampler: adaptively setting path lengths in Hamiltonian Monte Carlo

TLDR: The No-U-Turn Sampler (NUTS) is introduced as an extension to HMC that eliminates the need to set the number of steps L, together with a method for adapting the step size ε on the fly based on primal-dual averaging.

Efficient Sequential Monte-Carlo Samplers for Bayesian Inference

TLDR: An automatic and adaptive strategy that selects the sequence of distributions within the SMC sampler that minimizes the asymptotic variance of the estimator of the posterior normalization constant is proposed.

Sequential Monte-Carlo sampler for Bayesian inference in complex systems

TLDR: This thesis proposes an automatic and adaptive strategy that selects the sequence of distributions within the SMC sampler that approximately minimizes the asymptotic variance of the estimator of the posterior normalization constant.

Improving SMC sampler estimate by recycling all past simulated particles

TLDR: This paper proposes a recycling scheme of all past simulated particles in the SMC sampler in order to reduce the variance of the final estimator and demonstrates how the proposed approach outperforms the classical strategy in two challenging models.

Handbook of Markov Chain Monte Carlo

TLDR: A Markov chain Monte Carlo based analysis of a multilevel model for functional MRI data and its applications in environmental epidemiology, educational research, and fisheries science are studied.

Sequential Monte Carlo samplers

TLDR: A methodology to sample sequentially from a sequence of probability distributions that are defined on a common space, each distribution being known up to a normalizing constant, is proposed.

MCMC Using Hamiltonian Dynamics

Hamiltonian dynamics can be used to produce distant proposals for the Metropolis algorithm, thereby avoiding the slow exploration of the state space that results from the diffusive behaviour of simple random-walk proposals.
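The mechanism this reference describes can be sketched in a few lines: simulate Hamiltonian dynamics with the leapfrog integrator, then accept or reject with a Metropolis step. The standard-normal target, step size, and trajectory length below are assumptions chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Target: standard normal, so potential energy U(q) = q^2/2 and grad U(q) = q
def U(q):
    return 0.5 * q**2

def grad_U(q):
    return q

def hmc_step(q, eps=0.2, L=20):
    """One HMC transition: leapfrog trajectory + Metropolis accept/reject."""
    p0 = rng.normal()                      # resample momentum
    q_new = q
    p = p0 - 0.5 * eps * grad_U(q_new)     # initial half step for momentum
    for i in range(L):
        q_new = q_new + eps * p            # full step for position
        if i < L - 1:
            p = p - eps * grad_U(q_new)    # full step for momentum
    p = p - 0.5 * eps * grad_U(q_new)      # final half step for momentum

    # accept with probability exp(H_old - H_new); H = U(q) + p^2/2
    h_old = U(q) + 0.5 * p0**2
    h_new = U(q_new) + 0.5 * p**2
    return q_new if np.log(rng.uniform()) < h_old - h_new else q

q, samples = 0.0, []
for _ in range(2000):
    q = hmc_step(q)
    samples.append(q)
samples = np.array(samples)
```

Because the trajectory moves a long distance before the accept/reject decision, successive samples decorrelate far faster than under a random-walk proposal; the fixed L here is exactly the tuning burden NUTS removes.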

Markov Chain Monte Carlo Methods for Bayesian Data Analysis in Astronomy

TLDR: The basics of Bayesian theory are explained and how to set up data analysis problems within this framework is discussed, and an overview of various Monte Carlo based methods for performing Bayesian data analysis is provided.

Regression Shrinkage and Selection via the Lasso

TLDR: A new method for estimation in linear models called the lasso, which minimizes the residual sum of squares subject to the sum of the absolute values of the coefficients being less than a constant, is proposed.

Langevin Diffusions and Metropolis-Hastings Algorithms

We consider a class of Langevin diffusions with state-dependent volatility. The volatility of the diffusion is chosen so as to make the stationary distribution of the diffusion with respect to its…