Adaptive independent sticky MCMC algorithms

@article{Martino2018AdaptiveIS,
  title={Adaptive independent sticky MCMC algorithms},
  author={Luca Martino and Roberto Casarin and Fabrizio Leisen and David Luengo},
  journal={EURASIP Journal on Advances in Signal Processing},
  year={2018},
  volume={2018},
  pages={1-28}
}
Monte Carlo methods have become essential tools to solve complex Bayesian inference problems in different fields, such as computational statistics, machine learning, and statistical signal processing. In this work, we introduce a novel class of adaptive Monte Carlo methods, called adaptive independent sticky Markov Chain Monte Carlo (MCMC) algorithms, to sample efficiently from any bounded target probability density function (pdf). The new class of algorithms employs adaptive non-parametric …
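
For intuition only, the following sketch shows the basic pattern this class of methods builds on: an independence Metropolis-Hastings sampler whose piecewise-constant proposal is occasionally refined with new support points so that it progressively matches ("sticks to") a bounded one-dimensional target. The target, the initial grid, the interval heights, and the 5% update probability are illustrative assumptions, not the paper's construction, and the sketch ignores the conditions on the adaptation that the paper imposes to guarantee convergence.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical bounded, unnormalized 1-D target restricted to [a, b] (bimodal example).
a, b = -4.0, 4.0
def target(x):
    return np.exp(-0.5 * (x - 1.5) ** 2) + 0.6 * np.exp(-0.5 * ((x + 1.5) / 0.7) ** 2)

def build_proposal(nodes):
    # Piecewise-constant proposal whose level on each interval comes from the
    # target values at the interval's endpoints.
    heights = np.maximum(target(nodes[:-1]), target(nodes[1:]))
    widths = np.diff(nodes)
    areas = heights * widths
    return heights, widths, areas / areas.sum(), areas.sum()

def proposal_pdf(x, nodes, heights, Z):
    i = np.clip(np.searchsorted(nodes, x) - 1, 0, len(heights) - 1)
    return heights[i] / Z

nodes = np.linspace(a, b, 6)          # initial coarse grid of support points
x, chain = 0.0, []
for it in range(5000):
    heights, widths, probs, Z = build_proposal(nodes)
    i = rng.choice(len(probs), p=probs)               # pick an interval ...
    xp = nodes[i] + widths[i] * rng.random()          # ... and sample uniformly in it
    # Independence Metropolis-Hastings acceptance test.
    num = target(xp) * proposal_pdf(x, nodes, heights, Z)
    den = target(x) * proposal_pdf(xp, nodes, heights, Z)
    if rng.random() < min(1.0, num / den):
        x = xp
    chain.append(x)
    # "Sticky" update: occasionally promote the proposed point to a support node,
    # so the proposal progressively matches the shape of the target.
    if rng.random() < 0.05 and len(nodes) < 200:
        nodes = np.sort(np.append(nodes, xp))

print("estimated mean of the target on [a, b]:", np.mean(chain[1000:]))
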
The Recycling Gibbs sampler for efficient learning
This work shows that auxiliary samples can be recycled within the Gibbs estimators, improving their efficiency with no extra cost, and gives empirical evidence of the performance gains in a toy example, in the inference of Gaussian process hyperparameters, and in learning dependence graphs through regression.
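
As a rough illustration of the recycling idea (not the authors' implementation), the sketch below runs a Gibbs sampler on a toy bivariate Gaussian, draws several auxiliary samples at each conditional step, and keeps all of them in the estimator while only the last one advances the chain; the target, the number of auxiliary draws, and the test function are arbitrary choices.

import numpy as np

rng = np.random.default_rng(1)

# Toy bivariate Gaussian target with correlation rho; both full conditionals are Gaussian.
rho = 0.8
def sample_x1_given_x2(x2, n):        # draws from N(rho * x2, 1 - rho**2)
    return rho * x2 + np.sqrt(1.0 - rho**2) * rng.standard_normal(n)
def sample_x2_given_x1(x1, n):        # draws from N(rho * x1, 1 - rho**2)
    return rho * x1 + np.sqrt(1.0 - rho**2) * rng.standard_normal(n)

T = 5                                  # auxiliary draws per conditional step
x1, x2 = 0.0, 0.0
recycled, standard = [], []            # "recycled" keeps every auxiliary draw
for sweep in range(4000):
    d1 = sample_x1_given_x2(x2, T)     # T draws from p(x1 | x2)
    recycled.extend((v, x2) for v in d1)
    x1 = d1[-1]                        # only the last draw advances the chain
    d2 = sample_x2_given_x1(x1, T)     # T draws from p(x2 | x1)
    recycled.extend((x1, v) for v in d2)
    x2 = d2[-1]
    standard.append((x1, x2))

f = lambda pairs: np.mean([u**2 + v**2 for u, v in pairs])   # example test function
print("recycled estimate of E[x1^2 + x2^2]:", f(recycled))
print("standard estimate of E[x1^2 + x2^2]:", f(standard))
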
Adaptive Incremental Mixture Markov Chain Monte Carlo
  • F. Maire, N. Friel, A. Mira, A. Raftery
  • Journal of Computational and Graphical Statistics
  • 2019
It is proved that there exists a stochastic process that can be made arbitrarily close to AIMM and that converges to the correct target distribution, and the algorithm is shown to perform well in practice in a variety of challenging situations, including high-dimensional and multimodal target distributions.
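
A heavily simplified sketch of the incremental-mixture idea: an independence sampler whose Gaussian-mixture proposal gains a new component whenever a proposed point is badly under-covered (large target-to-proposal ratio). The one-dimensional target, the equal component weights, the fixed component width, the threshold kappa, and the cap on the number of components are all illustrative assumptions; the actual AIMM algorithm and its convergence safeguards differ.

import numpy as np

rng = np.random.default_rng(2)

# Illustrative bimodal, unnormalized 1-D target.
def target(x):
    return np.exp(-0.5 * ((x - 3.0) / 0.6) ** 2) + np.exp(-0.5 * ((x + 3.0) / 0.6) ** 2)

def normal_pdf(x, m, s):
    return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

# Proposal: equally weighted Gaussian mixture, grown incrementally.
means, sigma = [0.0], 1.0
def q_pdf(x):
    return np.mean([normal_pdf(x, m, sigma) for m in means])
def q_sample():
    return rng.choice(means) + sigma * rng.standard_normal()

x, chain = 0.0, []
kappa = 5.0                  # assumed threshold that triggers a new component
for it in range(20000):
    xp = q_sample()
    # Independence MH acceptance with the current mixture proposal.
    ratio = (target(xp) * q_pdf(x)) / (target(x) * q_pdf(xp))
    if rng.random() < min(1.0, ratio):
        x = xp
    chain.append(x)
    # Incremental step: if the proposal badly under-covers the proposed point,
    # add a new mixture component centred on it.
    if target(xp) / q_pdf(xp) > kappa and len(means) < 50:
        means.append(xp)

print("mixture components grown:", len(means))
print("fraction of samples in the right-hand mode:", np.mean(np.asarray(chain[5000:]) > 0.0))
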
Iterative Construction of Gaussian Process Surrogate Models for Bayesian Inference
The algorithm aims to mitigate some of the hurdles faced by traditional Markov chain Monte Carlo samplers by constructing proposal probability densities that are both easy to sample from and a better approximation to the target density than a simple Gaussian proposal distribution would be.
A survey of Monte Carlo methods for parameter estimation
A thorough review of Monte Carlo (MC) methods for the estimation of static parameters in signal processing applications is performed, describing many of the most relevant MCMC and importance sampling (IS) algorithms and their combined use.
An adaptive multiple-try Metropolis algorithm
Markov chain Monte Carlo (MCMC) methods, specifically samplers based on random walks, often have difficulty handling target distributions with complex geometry such as multi-modality. We propose an …
MCMC adaptatifs à essais multiples [Adaptive multiple-try MCMC]
This memoir aims at introducing adaptation within the Multiple-Try Metropolis (MTM) algorithms, which are a special case of the Markov chain Monte Carlo (MCMC) methods. The MCMC methods, along with …
Adaptive Rejection Sampling Methods
This chapter is devoted to describing the class of adaptive rejection sampling (ARS) schemes, which are very efficient samplers that update the proposal density whenever a generated sample is rejected in the rejection sampling (RS) test.
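
The sketch below illustrates the core ARS mechanism of refining the proposal after every rejection, under a deliberately simple assumption: the unnormalized target is non-increasing on a bounded interval, so its values at the left endpoints of the intervals give a valid piecewise-constant envelope. Classical ARS instead builds the envelope from tangents or secants of a log-concave log-density; this is only a toy variant of the same idea.

import numpy as np

rng = np.random.default_rng(3)

# Assumption for this sketch: an unnormalized target that is non-increasing on [0, L],
# so its value at an interval's left endpoint is a valid envelope over that interval.
L = 6.0
def target(x):
    return np.exp(-1.3 * x)

nodes = np.array([0.0, L])            # support points defining the piecewise envelope
samples, proposals = [], 0
while len(samples) < 5000:
    lefts, widths = nodes[:-1], np.diff(nodes)
    heights = target(lefts)            # envelope level on each interval
    areas = heights * widths
    i = rng.choice(len(areas), p=areas / areas.sum())
    x = lefts[i] + widths[i] * rng.random()       # draw a candidate from the envelope
    proposals += 1
    if rng.random() * heights[i] <= target(x):    # standard rejection test
        samples.append(x)
    else:
        # Adaptive step: a rejection reveals a gap between envelope and target,
        # so the rejected point becomes a new node and the envelope tightens.
        nodes = np.sort(np.append(nodes, x))

print("acceptance rate:", len(samples) / proposals)
print("number of support points used:", len(nodes))
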
A new adaptive approach of the Metropolis-Hastings algorithm applied to structural damage identification using time domain data
A new adaptive MH algorithm (P-AMH) is proposed, based on Bayesian inference, a powerful approach that has been widely used for the formulation of inverse problems in a statistical framework, and it is shown that both adaptive algorithms outperform the conventional MH.
Adaptive rejection sampling with fixed number of nodes
This work proposes a novel ARS scheme, called Cheap Adaptive Rejection Sampling (CARS), where the computational effort for drawing from the proposal remains constant, decided in advance by the user.
Marginal likelihood computation for model selection and hypothesis testing: an extensive review
This article provides a comprehensive study of the state-of-the-art of marginal likelihood computation for model selection and hypothesis testing, highlighting limitations, benefits, connections and differences among the different techniques.

References

Showing 1–10 of 113 references.
Sticky proposal densities for adaptive MCMC methods
This paper focuses on adaptive Markov chain Monte Carlo (MCMC) algorithms, introducing a novel class of adaptive proposal functions that progressively “stick” to the target, thus being able to generate virtually independent samples after a few iterations.
Independent Doubly Adaptive Rejection Metropolis Sampling Within Gibbs Sampling
An alternative adaptive MCMC algorithm (IA2RMS) is proposed that overcomes an important drawback of the Adaptive Rejection Metropolis Sampling technique, speeding up the convergence of the chain to the target, allowing us to simplify the construction of the sequence of proposals, and thus reducing the computational cost of the entire algorithm.
Generalized rejection sampling schemes and applications in signal processing
The proposed GARS method yields a sequence of proposal densities that converges towards the target pdf, enabling very efficient sampling of a broad class of probability distributions, possibly with multiple modes and non-standard forms.
Metropolis–Hastings algorithms with adaptive proposals
Two novel algorithms based on Metropolis–Hastings-within-Gibbs sampling using mixtures of triangular and trapezoidal densities are presented as improved versions of the all-purpose adaptive rejection Metropolis sampling (ARMS) algorithm for sampling from non-log-concave univariate densities.
A fast universal self-tuned sampler within Gibbs sampling
This work presents a simple, self-tuned and extremely efficient MCMC algorithm which produces virtually independent samples from univariate target densities. The newly proposed approach is named FUSS (Fast Universal Self-Tuned Sampler), as it can be used to sample from any bounded univariate distribution, and also from any bounded multivariate distribution, either directly or by embedding it within a Gibbs sampler.
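
The "within Gibbs" usage mentioned above follows a generic pattern, sketched below with a plain univariate Metropolis-Hastings kernel standing in for the specialised univariate sampler; the two-dimensional target and all kernel settings are arbitrary choices, and the sketch is not FUSS itself.

import numpy as np

rng = np.random.default_rng(4)

# Illustrative unnormalized 2-D log-target whose conditionals are awkward to sample directly.
def log_target(x1, x2):
    return -0.5 * (x1**2 + x2**2 + 0.9 * (x1 * x2) ** 2)

def mh_1d(log_cond, x, n_steps=10, step=0.8):
    # Generic univariate random-walk MH kernel standing in for a specialised 1-D sampler.
    for _ in range(n_steps):
        xp = x + step * rng.standard_normal()
        if np.log(rng.random()) < log_cond(xp) - log_cond(x):
            x = xp
    return x

x1, x2, chain = 0.0, 0.0, []
for sweep in range(3000):
    x1 = mh_1d(lambda v: log_target(v, x2), x1)    # update x1 | x2
    x2 = mh_1d(lambda v: log_target(x1, v), x2)    # update x2 | x1
    chain.append((x1, x2))

print("estimate of E[x1 * x2]:", np.mean([u * v for u, v in chain[500:]]))
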
Annealing Markov chain Monte Carlo with applications to ancestral inference
Markov chain Monte Carlo (MCMC; the Metropolis-Hastings algorithm) has been used for many statistical problems, including Bayesian inference, likelihood inference, and tests of significance. …
Controlled MCMC for Optimal Sampling
In this paper we develop an original and general framework for automatically optimizing the statistical properties of Markov chain Monte Carlo (MCMC) samples, which are typically used to evaluate …
Adaptive independent Metropolis–Hastings
We propose an adaptive independent Metropolis–Hastings algorithm with the ability to learn from all previous proposals in the chain except the current location. It is an extension of the independent …
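
A minimal sketch of an adaptive independence sampler in this spirit: a Gaussian proposal is periodically refit to the past states of the chain (the current location excluded) and then frozen after a burn-in phase. The target, the refit schedule, and the freezing rule are illustrative assumptions; the cited paper gives the precise conditions under which adaptation of this kind leaves the target distribution intact.

import numpy as np

rng = np.random.default_rng(5)

# Illustrative unnormalized 1-D target: a two-component Gaussian mixture.
def target(x):
    return 0.7 * np.exp(-0.5 * x ** 2) + 0.3 * np.exp(-0.5 * ((x - 4.0) / 0.5) ** 2)

def normal_pdf(x, m, s):
    return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

m, s = 0.0, 5.0                      # deliberately broad initial proposal
x, chain = 0.0, []
for it in range(20000):
    xp = m + s * rng.standard_normal()
    # Independence MH: the proposal N(m, s^2) ignores the current state.
    ratio = (target(xp) * normal_pdf(x, m, s)) / (target(x) * normal_pdf(xp, m, s))
    if rng.random() < min(1.0, ratio):
        x = xp
    chain.append(x)
    # Periodically refit the proposal to past states (the current location excluded),
    # and freeze it after a burn-in phase so the adaptation eventually stops.
    if it % 500 == 499 and it < 10000:
        past = np.asarray(chain[:-1])
        m, s = past.mean(), past.std() + 0.1

print("adapted proposal: mean %.2f, sd %.2f" % (m, s))
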
Population-Based Reversible Jump Markov Chain Monte Carlo
An extension of population-based Markov chain Monte Carlo to the transdimensional case is presented, and a result proving the uniform ergodicity of these population algorithms is used to demonstrate the superiority, in terms of convergence rate, of a population transition kernel over a reversible jump sampler for a Bayesian variable selection problem.
Fast Gibbs sampling for high-dimensional Bayesian inversion
Solving ill-posed inverse problems by Bayesian inference has recently attracted considerable attention. Compared to deterministic approaches, the probabilistic representation of the solution by the …