Comparison of asymptotic variances of inhomogeneous Markov chains with application to Markov chain Monte Carlo methods
@article{Maire2013ComparisonOA, title={Comparison of asymptotic variances of inhomogeneous Markov chains with application to Markov chain Monte Carlo methods}, author={Florian Maire and Randal Douc and Jimmy Olsson}, journal={arXiv: Methodology}, year={2013} }
In this paper, we study the asymptotic variance of sample path averages for inhomogeneous Markov chains that evolve alternatingly according to two different $\pi$-reversible Markov transition kernels $P$ and $Q$. More specifically, our main result allows us to compare directly the asymptotic variances of two inhomogeneous Markov chains associated with different kernels $P_i$ and $Q_i$, $i\in\{0,1\}$, as soon as the kernels of each pair $(P_0,P_1)$ and $(Q_0,Q_1)$ can be ordered in the sense of…
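To fix notation for readers skimming this page, the display below recalls the asymptotic variance being compared and the classical Peskun (off-diagonal) ordering of $\pi$-reversible kernels. The abstract is truncated above, so treating this as the ordering intended by the authors is an assumption, not a statement of the paper's exact hypothesis.

```latex
% Hedged background sketch; the exact ordering used in the paper is truncated
% in the abstract above, so the off-diagonal (Peskun) ordering below is an
% assumption. For a \pi-stationary chain (X_k) and a square-integrable f,
\[
  \sigma^2(f) \;=\; \lim_{n \to \infty} \frac{1}{n}\,
  \operatorname{Var}\!\Bigl(\sum_{k=1}^{n} f(X_k)\Bigr),
\]
% and for two \pi-reversible kernels the classical Peskun ordering reads
\[
  P_1 \succeq P_0
  \quad\Longleftrightarrow\quad
  P_1\bigl(x, A \setminus \{x\}\bigr) \;\ge\; P_0\bigl(x, A \setminus \{x\}\bigr)
  \quad \text{for all } x \text{ and all measurable } A,
\]
% under which, in the homogeneous reversible case, Peskun's and Tierney's
% results give \sigma^2_{P_1}(f) \le \sigma^2_{P_0}(f).
```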
30 Citations
On the use of Markov chain Monte Carlo methods for the sampling of mixture models: a statistical perspective
- Computer ScienceStatistics and Computing
- 2014
A novel algorithm is proposed, the Frozen Carlin & Chib sampler, which is computationally less demanding than any Metropolised Carlin & Chib-type algorithm, though at the cost of some increase in asymptotic variance.
Importance sampling correction versus standard averages of reversible MCMCs in terms of the asymptotic variance
- Mathematics
- 2020
Stability of noisy Metropolis–Hastings
- Mathematics
- 2015
Pseudo-marginal Markov chain Monte Carlo methods for sampling from intractable distributions have gained recent interest and have been theoretically studied in considerable depth. Their main appeal…
Bayesian learning of weakly structural Markov graph laws using sequential Monte Carlo methods
- Computer Science, MathematicsElectronic Journal of Statistics
- 2019
It is shown that the problem of graph estimation can be recast into a sequential setting by proposing a recursive Feynman-Kac model that generates a flow of junction tree distributions over a space of increasing dimensions.
Stability of noisy Metropolis–Hastings
- MathematicsStat. Comput.
- 2016
A further characterisation of the noisy algorithm, initially conceptualised as Monte Carlo within Metropolis, is provided, with a focus on fundamental stability properties such as positive recurrence and geometric ergodicity.
On Markov chain Monte Carlo for sparse and filamentary distributions
- Computer Science, Mathematics
- 2018
This paper finds that, for a specific class of target distributions referred to as sparse and filamentary, which exhibit strong correlations between some variables and/or concentrate their probability mass on low-dimensional linear subspaces or on thin curved manifolds, a locally informed strategy converges substantially faster and yields smaller asymptotic variances than an equivalent random-scan algorithm.
Peskun-Tierney ordering for Markov chain and process Monte Carlo: beyond the reversible scenario
- Mathematics
- 2019
Historically, time-reversibility of the transitions or processes underpinning Markov chain Monte Carlo methods (MCMC) has played a key rôle in their development, while the self-adjointness of…
Establishing some order amongst exact approximations of MCMCs
- Computer Science, Mathematics
- 2014
A general framework is developed which allows one to compare, or order, performance measures of two implementations of Markov chain Monte Carlo algorithms, and an ordering with respect to the mean acceptance probability, the first autocorrelation coefficient, the asymptotic variance and the right spectral gap is established.
A Hybrid Scan Gibbs Sampler for Bayesian Models with Latent Variables
- Computer ScienceStatistical Science
- 2021
It is shown that, under weak regularity conditions, adding sandwich steps to the hybrid scan (HS) Gibbs sampler always results in a theoretically superior algorithm.
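The general pattern behind a sandwich step, inserting an extra move on the latent variable between the two conditional draws of a data-augmentation (Gibbs) sampler, can be illustrated on a toy Gaussian model; the model and the single Metropolis move on the latent variable below are illustrative assumptions and not the hybrid scan sampler of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy target: theta ~ N(0, 1), latent z | theta ~ N(theta, 1).
# Conditionals: z | theta ~ N(theta, 1), theta | z ~ N(z / 2, 1 / 2);
# the marginal of z is N(0, 2).

def sandwich_gibbs(n_iter=10_000, use_sandwich=True):
    theta, thetas = 0.0, np.empty(n_iter)
    for i in range(n_iter):
        # Step 1: draw the latent variable given the parameter.
        z = theta + rng.normal()
        # Sandwich step: any extra move that leaves the *marginal* of z
        # invariant; here a single random-walk Metropolis step targeting N(0, 2).
        if use_sandwich:
            z_prop = z + rng.normal()
            if np.log(rng.uniform()) < (z**2 - z_prop**2) / 4.0:
                z = z_prop
        # Step 2: draw the parameter given the (possibly moved) latent variable.
        theta = z / 2.0 + np.sqrt(0.5) * rng.normal()
        thetas[i] = theta
    return thetas

draws = sandwich_gibbs()
```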
References
SHOWING 1-10 OF 21 REFERENCES
Convergence properties of pseudo-marginal Markov chain Monte Carlo algorithms
- Mathematics
- 2015
We study convergence properties of pseudo-marginal Markov chain Monte Carlo algorithms (Andrieu and Roberts [Ann. Statist. 37 (2009) 697-725]). We find that the asymptotic variance of the…
General state space Markov chains and MCMC algorithms
- Mathematics, Computer Science
- 2004
This paper surveys various results about Markov chains on general (non-countable) state spaces. It begins with an introduction to Markov chain Monte Carlo (MCMC) algorithms, which provide the…
Monte Carlo Sampling Methods Using Markov Chains and Their Applications
- Mathematics
- 1970
SUMMARY A generalization of the sampling method introduced by Metropolis et al. (1953) is presented along with an exposition of the relevant theory, techniques of application and methods and…
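Since several of the entries on this page build on the Metropolis–Hastings rule described here, a minimal sketch may help; the Gaussian random-walk proposal and the standard-normal target are illustrative choices, not taken from the paper.

```python
import numpy as np

def metropolis_hastings(log_target, x0, n_iter=10_000, step=1.0, rng=None):
    """Random-walk Metropolis-Hastings: a minimal illustrative sketch.

    log_target: log of an unnormalised target density pi. The symmetric
    Gaussian proposal makes the Hastings ratio reduce to pi(y) / pi(x).
    """
    rng = np.random.default_rng() if rng is None else rng
    x, logp_x = x0, log_target(x0)
    samples = np.empty(n_iter)
    for i in range(n_iter):
        y = x + step * rng.normal()                  # symmetric proposal
        logp_y = log_target(y)
        if np.log(rng.uniform()) < logp_y - logp_x:  # accept with prob min(1, pi(y)/pi(x))
            x, logp_x = y, logp_y
        samples[i] = x
    return samples

# Illustrative target: standard normal.
draws = metropolis_hastings(lambda x: -0.5 * x**2, x0=0.0)
```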
Optimum Monte-Carlo sampling using Markov chains
- Mathematics
- 1973
SUMMARY The sampling method proposed by Metropolis et al. (1953) requires the simulation of a Markov chain with a specified π as its stationary distribution. Hastings (1970) outlined a general…
The Multiple-Try Method and Local Optimization in Metropolis Sampling
- Computer Science
- 2000
A new Metropolis-like transition rule, the multiple-try Metropolis, for Markov chain Monte Carlo (MCMC) simulations is described, and a novel method for incorporating local optimization steps into an MCMC sampler in continuous state space is proposed.
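A minimal sketch of the multiple-try idea follows, using a symmetric Gaussian proposal and the standard special-case weights w(y, x) = π(y); the paper's general weight functions and local-optimization steps are not reproduced here.

```python
import numpy as np

def multiple_try_metropolis(log_target, x0, n_iter=5_000, k=5, step=1.0, rng=None):
    """Multiple-try Metropolis with a symmetric Gaussian proposal and the
    special-case weights w(y, x) = pi(y); an illustrative sketch only."""
    rng = np.random.default_rng() if rng is None else rng
    x = x0
    samples = np.empty(n_iter)
    for i in range(n_iter):
        # Draw k candidates from the symmetric proposal around x.
        ys = x + step * rng.normal(size=k)
        wy = np.exp(log_target(ys))
        # Select one candidate with probability proportional to its weight.
        y = rng.choice(ys, p=wy / wy.sum())
        # Reference points: k - 1 draws around y, plus the current state x.
        xs = np.append(y + step * rng.normal(size=k - 1), x)
        wx = np.exp(log_target(xs))
        # Generalised acceptance ratio.
        if rng.uniform() < min(1.0, wy.sum() / wx.sum()):
            x = y
        samples[i] = x
    return samples

# Illustrative target: standard normal (vectorised log-density).
draws = multiple_try_metropolis(lambda x: -0.5 * np.asarray(x)**2, x0=0.0)
```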
Coupled MCMC with a randomized acceptance probability
- Computer Science
- 2012
A class of MCMC algorithms characterized by a pseudo-marginal transition probability kernel is given, and it is shown that on average O(m) of the samples realized by a simulation approximating a randomized chain of length n are exactly the same as those of a coupled (exact) randomized chain.
A generalization of the Multiple-try Metropolis algorithm for Bayesian estimation and model selection
- Computer ScienceAISTATS
- 2010
This work proposes a generalization of the Multiple-try Metropolis algorithm, based on drawing several proposals at each step and randomly choosing one of them on the basis of weights that may be arbitrarily chosen, and introduces a method, based on weights depending on a quadratic approximation of the posterior distribution, for Bayesian estimation.
The pseudo-marginal approach for efficient Monte Carlo computations
- Computer Science
- 2009
A powerful and flexible MCMC algorithm for stochastic simulation is presented, building on a pseudo-marginal method and showing how algorithms which are approximations to an idealized marginal algorithm can share the same marginal stationary distribution as the idealized method.
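A minimal sketch of the pseudo-marginal idea: the intractable target density is replaced by a non-negative unbiased estimate, and the estimate attached to the current state is reused across iterations. The noisy-density estimator and the Gaussian example below are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def pseudo_marginal_mh(unbiased_estimate, x0, n_iter=5_000, step=1.0, rng=None):
    """Pseudo-marginal random-walk Metropolis: the acceptance ratio uses
    non-negative unbiased estimates of the target density, and the estimate
    for the current state is recycled until the next acceptance."""
    rng = np.random.default_rng() if rng is None else rng
    x, z_x = x0, unbiased_estimate(x0, rng)   # estimate at the current state
    samples = np.empty(n_iter)
    for i in range(n_iter):
        y = x + step * rng.normal()           # symmetric proposal
        z_y = unbiased_estimate(y, rng)       # fresh estimate at the proposal
        if rng.uniform() * z_x < z_y:         # accept with prob min(1, z_y / z_x)
            x, z_x = y, z_y
        samples[i] = x
    return samples

# Illustrative noisy but unbiased estimate of a N(0, 1) density
# (true density multiplied by positive, mean-one noise).
def noisy_density(x, rng, m=10):
    noise = rng.exponential(size=m).mean()
    return np.exp(-0.5 * x**2) * noise

draws = pseudo_marginal_mh(noisy_density, x0=0.0)
```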
Constructing summary statistics for approximate Bayesian computation: semi‐automatic approximate Bayesian computation
- Computer Science
- 2012
This work shows how to construct appropriate summary statistics for ABC in a semi-automatic manner, and that the optimal summary statistics are the posterior means of the parameters.
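The key idea, using an estimated posterior mean of the parameter as the summary statistic, can be sketched as below; the linear-regression fit, the toy normal model and the rejection-ABC step are illustrative assumptions rather than the exact procedure of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: theta ~ N(0, 1) prior, data y = (y_1, ..., y_5) iid N(theta, 1).
def simulate(theta, n=5):
    return theta + rng.normal(size=n)

# 1) Pilot simulations from the prior and the model.
thetas = rng.normal(size=2_000)
pilot_data = np.stack([simulate(t) for t in thetas])

# 2) Regress theta on features of the data; the fitted values estimate
#    E[theta | y] and serve as a one-dimensional summary statistic.
features = np.column_stack([np.ones(len(thetas)), pilot_data])  # intercept + raw data
beta, *_ = np.linalg.lstsq(features, thetas, rcond=None)
def summary(y):
    return np.concatenate(([1.0], y)) @ beta

# 3) Rejection ABC on the learned summary for an "observed" data set.
y_obs = simulate(1.5)
s_obs = summary(y_obs)
candidates = rng.normal(size=20_000)
distances = np.array([abs(summary(simulate(t)) - s_obs) for t in candidates])
accepted = candidates[distances < np.quantile(distances, 0.01)]
print(accepted.mean())   # crude ABC estimate of the posterior mean of theta
```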
Estimation of population growth or decline in genetically monitored populations.
- MathematicsGenetics
- 2003
A new general method is introduced that samples independent genealogical histories using importance sampling (IS) and then samples other parameters with Markov chain Monte Carlo (MCMC); it is concluded that these have an approximately equivalent effect.