# Adaptive independent sticky MCMC algorithms

```bibtex
@article{Martino2018AdaptiveIS,
  title   = {Adaptive independent sticky MCMC algorithms},
  author  = {Luca Martino and Roberto Casarin and Fabrizio Leisen and David Luengo},
  journal = {EURASIP Journal on Advances in Signal Processing},
  year    = {2018},
  volume  = {2018},
  pages   = {1--28}
}
```

Monte Carlo methods have become essential tools to solve complex Bayesian inference problems in different fields, such as computational statistics, machine learning, and statistical signal processing. In this work, we introduce a novel class of adaptive Monte Carlo methods, called adaptive independent sticky Markov Chain Monte Carlo (MCMC) algorithms, to sample efficiently from any bounded target probability density function (pdf). The new class of algorithms employs adaptive non-parametric…
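The abstract above describes independent MCMC samplers whose non-parametric proposal "sticks" to the target as the chain runs. A minimal sketch of the idea, not the paper's exact construction: an independent Metropolis–Hastings sampler with a piecewise-constant proposal over a set of support nodes, where each rejected point is added as a new node. The interpolation rule (left-node height) and the node-update rule here are simplified assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def target(x):
    # Bounded 1D target (unnormalized): a two-mode Gaussian mixture.
    return 0.6 * np.exp(-0.5 * (x + 2.0) ** 2) + 0.4 * np.exp(-0.5 * (x - 2.0) ** 2)

def heights(nodes):
    # Piecewise-constant proposal: each interval's height is the target
    # value at its left node (one simple interpolation choice).
    return target(nodes[:-1])

def proposal_pdf(x, nodes):
    h = heights(nodes)
    norm = np.sum(h * np.diff(nodes))
    i = np.clip(np.searchsorted(nodes, x, side="right") - 1, 0, len(h) - 1)
    return h[i] / norm

def sample_proposal(nodes):
    h = heights(nodes)
    w = h * np.diff(nodes)
    i = rng.choice(len(w), p=w / w.sum())
    return rng.uniform(nodes[i], nodes[i + 1])

nodes = np.linspace(-6.0, 6.0, 8)   # initial coarse support points
x = 0.0
chain = []
for _ in range(5000):
    z = sample_proposal(nodes)
    # Independent MH acceptance ratio (the proposal does not depend on x).
    num = target(z) * proposal_pdf(x, nodes)
    den = target(x) * proposal_pdf(z, nodes)
    if rng.uniform() < num / den:
        x = z
    else:
        # "Sticky" update: refine the proposal at the rejected point,
        # so it progressively matches the target there.
        nodes = np.sort(np.append(nodes, z))
    chain.append(x)

print(round(float(np.mean(chain)), 2))
```

As the node set grows, the proposal approaches the target and rejections become rare, which is what lets these samplers produce virtually independent draws after a short adaptation phase.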


#### 17 Citations

The Recycling Gibbs sampler for efficient learning

- Mathematics, Computer Science
- Digit. Signal Process.
- 2018

This work shows that auxiliary samples can be recycled within the Gibbs estimators, improving their efficiency with no extra cost, and gives empirical evidence of performance in a toy example, inference of Gaussian processes hyperparameters, and learning dependence graphs through regression.

Adaptive Incremental Mixture Markov Chain Monte Carlo

- Mathematics, Medicine
- Journal of Computational and Graphical Statistics
- 2019

Theoretically, it is proved that there exists a stochastic process that can be made arbitrarily close to AIMM and that converges to the correct target distribution, and it is illustrated that it performs well in practice in a variety of challenging situations, including high-dimensional and multimodal target distributions.

Iterative Construction of Gaussian Process Surrogate Models for Bayesian Inference

- Computer Science, Mathematics
- ArXiv
- 2019

The algorithm aims to mitigate some of the hurdles faced by traditional Markov chain Monte Carlo samplers by constructing proposal probability densities that are both easy to sample from and a better approximation to the target density than a simple Gaussian proposal distribution would be.

A survey of Monte Carlo methods for parameter estimation

- Mathematics, Computer Science
- EURASIP J. Adv. Signal Process.
- 2020

A thorough review of MC methods for the estimation of static parameters in signal processing applications is performed, describing many of the most relevant MCMC and IS algorithms, and their combined use.

An adaptive multiple-try Metropolis algorithm

- 2021

Markov chain Monte Carlo (MCMC) methods, specifically samplers based on random walks, often have difficulty handling target distributions with complex geometry such as multi-modality. We propose an…

MCMC adaptatifs à essais multiples (Adaptive multiple-try MCMC)

- 2019

This memoir aims at introducing adaptation within the Multiple-Try Metropolis (MTM) algorithms which are a special case of the Markov chain Monte Carlo (MCMC) methods. The MCMC methods, along with…

Adaptive Rejection Sampling Methods

- Computer Science
- 2018

This chapter is devoted to describing the class of the adaptive rejection sampling (ARS) schemes, which are very efficient samplers that update the proposal density whenever a generated sample is rejected in the RS test.
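The ARS mechanism described above (tighten the envelope on every rejection) can be sketched for a unimodal target with a known mode; the piecewise-constant envelope and the known-mode assumption are illustrative simplifications, not the classic Gilks–Wild log-concave construction.

```python
import numpy as np

rng = np.random.default_rng(1)

MODE = 0.3  # known mode of the unimodal target (assumption for this sketch)

def target(x):
    # Bounded unimodal target on [0, 1], unnormalized.
    return np.exp(-8.0 * (x - MODE) ** 2)

def envelope_heights(nodes):
    a, b = nodes[:-1], nodes[1:]
    # Max of a unimodal function on [a, b] is at an endpoint,
    # or at the mode when the mode lies inside the interval.
    h = np.maximum(target(a), target(b))
    inside = (a <= MODE) & (MODE <= b)
    return np.where(inside, target(MODE), h)

nodes = np.array([0.0, 0.5, 1.0])
accepted, proposals = [], 0
while len(accepted) < 2000:
    h = envelope_heights(nodes)
    w = h * np.diff(nodes)
    i = rng.choice(len(w), p=w / w.sum())
    x = rng.uniform(nodes[i], nodes[i + 1])
    proposals += 1
    if rng.uniform() < target(x) / h[i]:
        accepted.append(x)          # standard rejection-sampling test
    else:
        # Adaptive step: a rejection reveals a loose region of the
        # envelope, so add the rejected point as a new node there.
        nodes = np.sort(np.append(nodes, x))

print(len(accepted), proposals)
```

Unlike MCMC, every accepted draw here is an exact independent sample from the target; adaptation only raises the acceptance rate over time.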

A new adaptive approach of the Metropolis-Hastings algorithm applied to structural damage identification using time domain data

- Computer Science
- 2020

A new adaptive MH algorithm (P-AMH) is proposed based on Bayesian inference, a powerful approach that has been widely used for the formulation of inverse problems in a statistical framework, and the results show that both adaptive algorithms outperformed the conventional MH.

Adaptive rejection sampling with fixed number of nodes

- Computer Science, Mathematics
- Commun. Stat. Simul. Comput.
- 2019

This work proposes a novel ARS scheme, called Cheap Adaptive Rejection Sampling (CARS), where the computational effort for drawing from the proposal remains constant and is decided in advance by the user.

Marginal likelihood computation for model selection and hypothesis testing: an extensive review

- Computer Science, Mathematics
- ArXiv
- 2020

This article provides a comprehensive study of the state-of-the-art of marginal likelihood computation for model selection and hypothesis testing, highlighting limitations, benefits, connections and differences among the different techniques.

#### References

Showing 1–10 of 113 references

Sticky proposal densities for adaptive MCMC methods

- Computer Science
- 2016 IEEE Statistical Signal Processing Workshop (SSP)
- 2016

This paper focuses on adaptive Markov chain Monte Carlo (MCMC) algorithms, introducing a novel class of adaptive proposal functions that progressively "stick" to the target, thus being able to generate virtually independent samples after a few iterations.

Independent Doubly Adaptive Rejection Metropolis Sampling Within Gibbs Sampling

- Computer Science
- IEEE Transactions on Signal Processing
- 2015

An alternative adaptive MCMC algorithm (IA2RMS) is proposed that overcomes an important drawback of the Adaptive Rejection Metropolis Sampling technique, speeding up the convergence of the chain to the target, allowing us to simplify the construction of the sequence of proposals, and thus reducing the computational cost of the entire algorithm.

Generalized rejection sampling schemes and applications in signal processing

- Mathematics, Computer Science
- Signal Process.
- 2010

The proposed GARS method yields a sequence of proposal densities that converge towards the target pdf and enable a very efficient sampling of a broad class of probability distributions, possibly with multiple modes and non-standard forms.

Metropolis–Hastings algorithms with adaptive proposals

- Computer Science, Mathematics
- Stat. Comput.
- 2008

Two novel algorithms based on Metropolis–Hastings-within-Gibbs sampling using mixtures of triangular and trapezoidal densities are presented as improved versions of the all-purpose adaptive rejection Metropolis sampling (ARMS) algorithm to sample from non-logconcave univariate densities.

A fast universal self-tuned sampler within Gibbs sampling

- Computer Science, Mathematics
- Digit. Signal Process.
- 2015

This work presents a simple, self-tuned and extremely efficient MCMC algorithm which produces virtually independent samples from these univariate target densities. The newly proposed approach is named FUSS (Fast Universal Self-Tuned Sampler), as it can be used to sample from any bounded univariate distribution and also from any bounded multivariate distribution, either directly or by embedding it within a Gibbs sampler.

Annealing Markov chain Monte Carlo with applications to ancestral inference

- Mathematics
- 1995

Markov chain Monte Carlo (MCMC; the Metropolis–Hastings algorithm) has been used for many statistical problems, including Bayesian inference, likelihood inference, and tests of significance.…

Controlled MCMC for Optimal Sampling

- Mathematics
- 2001

In this paper we develop an original and general framework for automatically optimizing the statistical properties of Markov chain Monte Carlo (MCMC) samples, which are typically used to evaluate…

Adaptive independent Metropolis–Hastings

- Mathematics
- 2009

We propose an adaptive independent Metropolis–Hastings algorithm with the ability to learn from all previous proposals in the chain except the current location. It is an extension of the independent…
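The blurb above describes an independent MH sampler whose fixed-form proposal is refit to the chain's history. A minimal sketch under simplifying assumptions (a Gaussian proposal refit to all past states, including the current one, with an ad hoc variance inflation), not the cited paper's exact scheme:

```python
import numpy as np

rng = np.random.default_rng(2)

def log_target(x):
    # Unnormalized log-density: Gaussian target centered at 2 (illustrative).
    return -0.5 * (x - 2.0) ** 2

mu, sigma = 0.0, 5.0   # start with a deliberately wide Gaussian proposal
x = 0.0
history, chain = [], []
for t in range(10000):
    z = rng.normal(mu, sigma)
    # Independent MH log-ratio with proposal q(.) = N(mu, sigma^2).
    log_q = lambda v: -0.5 * ((v - mu) / sigma) ** 2
    log_alpha = (log_target(z) - log_target(x)) + (log_q(x) - log_q(z))
    if np.log(rng.uniform()) < log_alpha:
        x = z
    history.append(x)
    chain.append(x)
    # Adaptation (illustrative): every 500 steps, refit the proposal
    # moments from the accumulated chain history.
    if (t + 1) % 500 == 0:
        mu = float(np.mean(history))
        sigma = float(np.std(history)) + 0.5  # inflate to keep tails safe

print(round(float(np.mean(chain)), 1))
```

The variance inflation is one crude guard against the proposal's tails becoming lighter than the target's, which would break the ergodicity of an independent sampler; the cited work handles adaptation with proper theoretical guarantees.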

Population-Based Reversible Jump Markov Chain Monte Carlo

- Computer Science, Mathematics
- 2007

An extension of population-based Markov chain Monte Carlo to the transdimensional case is presented, and a result proving the uniform ergodicity of these population algorithms is used to demonstrate the superiority, in terms of convergence rate, of a population transition kernel over a reversible jump sampler for a Bayesian variable selection problem.

Fast Gibbs sampling for high-dimensional Bayesian inversion

- Mathematics
- 2016

Solving ill-posed inverse problems by Bayesian inference has recently attracted considerable attention. Compared to deterministic approaches, the probabilistic representation of the solution by the…