Inference from Iterative Simulation Using Multiple Sequences

@article{GelmanRubin1992,
  title={Inference from Iterative Simulation Using Multiple Sequences},
  author={Andrew Gelman and Donald B. Rubin},
  journal={Statistical Science},
  year={1992},
}
The Gibbs sampler, the algorithm of Metropolis, and similar iterative simulation methods are potentially very helpful for summarizing multivariate distributions. Used naively, however, iterative simulation can give misleading answers. Our methods are simple and generally applicable to the output of any iterative simulation; they are designed for researchers primarily interested in the science underlying the data and models they are analyzing, rather than for researchers interested in the…
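The multiple-sequence diagnostic described in this abstract is commonly summarized by the potential scale reduction factor. A minimal sketch (illustrative only, and omitting the paper's refinements such as degrees-of-freedom corrections), assuming m parallel chains of n draws each for a scalar quantity:

```python
import numpy as np

def gelman_rubin_rhat(chains):
    """Potential scale reduction factor for a scalar quantity.

    chains: array of shape (m, n) -- m parallel chains, n draws each.
    Compares between-chain and within-chain variance; values near 1
    suggest the chains have mixed over a common target distribution.
    """
    chains = np.asarray(chains, dtype=float)
    m, n = chains.shape
    chain_means = chains.mean(axis=1)
    B = n * chain_means.var(ddof=1)           # between-chain variance
    W = chains.var(axis=1, ddof=1).mean()     # mean within-chain variance
    var_plus = (n - 1) / n * W + B / n        # pooled posterior variance estimate
    return np.sqrt(var_plus / W)

# Example: four chains drawn from the same target should give R-hat near 1
rng = np.random.default_rng(0)
chains = rng.normal(size=(4, 1000))
print(gelman_rubin_rhat(chains))  # close to 1 for well-mixed chains
```

Chains stuck in different regions inflate the between-chain term B, pushing the ratio well above 1, which is what flags the "misleading answers" of naive single-sequence use.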

A computational procedure for estimation of the mixing time of the random-scan Metropolis algorithm

The work presented here gives a computational method of approximately verifying a drift condition and a minorization condition specifically for the symmetric random-scan Metropolis algorithm.

Annealing Markov chain Monte Carlo with applications to ancestral inference

This work proposes MCMC methods distantly related to simulated annealing, which simulate realizations from a sequence of distributions, allowing the distribution being simulated to vary randomly over time.

A simulation approach to convergence rates for Markov chain Monte Carlo algorithms

This paper proposes the use of auxiliary simulations to estimate the numerical values needed in this theorem and makes it possible to compute quantitative convergence bounds for models for which the requisite analytical computations would be prohibitively difficult or impossible.

Bayesian inference for pairwise interacting point processes

Bayesian methods for obtaining inferences in pairwise interacting point processes are proposed, along with the use of importance sampling techniques within Markov chain Monte Carlo (MCMC).

Advanced Bayesian Computation

A full Bayesian analysis is performed on a biological data set from Gelfand et al. (1990) and the key concepts and the computational tools discussed in this chapter are demonstrated in this section.

An Introduction to Bayesian Inference via Variational Approximations

This paper demonstrates how variational approximations can be used to facilitate the application of Bayesian models to political science data, including models to describe legislative voting blocs and statistical models for political texts.

Approximate Bayesian Inference with the Weighted Likelihood Bootstrap

We introduce the weighted likelihood bootstrap (WLB) as a way to simulate approximately from a posterior distribution. This method is often easy to implement, requiring only an algorithm for…

Discovering Inductive Bias with Gibbs Priors: A Diagnostic Tool for Approximate Bayesian Inference

A new approach to diagnosing approximate inference: the approximation mismatch is attributed to a change in the inductive bias by treating the approximations as exact and reverse-engineering the corresponding prior, reframing the problem in terms of incompatible conditional distributions.

Posterior explorations of Markov chains in Bayesian analysis of discrete finite mixture models

  • P. Saama, 1999
This paper examines the performance of an L distance convergence diagnostic which assesses the convergence of the joint density of a Gibbs Sampler algorithm in a discrete finite mixture model and shows that this diagnostic has advantages over many existing convergence diagnostics in terms of consistency, applicability, computational expense, and interpretability.

How Many Iterations in the Gibbs Sampler?

When the Gibbs sampler is used to estimate posterior distributions (Gelfand and Smith, 1990), the question of how many iterations are required is central to its implementation. When…

Monte Carlo Sampling Methods Using Markov Chains and Their Applications

A generalization of the sampling method introduced by Metropolis et al. (1953) is presented, along with an exposition of the relevant theory, techniques of application, and methods and…

Spatial Statistics and Bayesian Computation

The early development of MCMC in Bayesian inference is traced, some recent computational progress in statistical physics based on the introduction of auxiliary variables is reviewed, and its current and future relevance in Bayesian applications is discussed.

Sampling-Based Approaches to Calculating Marginal Densities

Stochastic substitution, the Gibbs sampler, and the sampling-importance-resampling algorithm can be viewed as three alternative sampling- (or Monte Carlo-) based approaches to the…

Exploring Posterior Distributions Using Markov Chains

Several Markov chain-based methods are available for sampling from a posterior distribution. Two important examples are the Gibbs sampler and the Metropolis algorithm. In addition, several…

Metropolis Methods, Gaussian Proposals and Antithetic Variables

We investigate various aspects of a class of dynamic Monte Carlo methods that generalises the Metropolis algorithm and includes the Gibbs sampler as a special case. These can be used to estimate…
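The Metropolis algorithm with a Gaussian proposal, the base case of the class this paper generalises, can be sketched as follows (a minimal illustration, not the paper's generalisation; the function name and signature are hypothetical):

```python
import math
import random

def random_walk_metropolis(log_density, x0, step_sd, n_iter, seed=0):
    """Random-walk Metropolis with a symmetric Gaussian proposal.

    Because the Gaussian proposal is symmetric, the acceptance probability
    reduces to min(1, pi(x') / pi(x)), computed here on the log scale.
    """
    rng = random.Random(seed)
    x = x0
    logp = log_density(x)
    samples = []
    for _ in range(n_iter):
        prop = x + rng.gauss(0.0, step_sd)      # Gaussian proposal
        logp_prop = log_density(prop)
        if math.log(rng.random()) < logp_prop - logp:
            x, logp = prop, logp_prop           # accept the move
        samples.append(x)                        # rejected moves repeat x
    return samples

# Example: sample a standard normal target, pi(x) proportional to exp(-x^2/2)
draws = random_walk_metropolis(lambda x: -0.5 * x * x, 0.0, 2.5, 10000)
```

The Gibbs sampler arises as the special case in which each coordinate's proposal is its exact full conditional, so every move is accepted.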

Evaluating the accuracy of sampling-based approaches to the calculation of posterior moments

Methods for spectral analysis are used to evaluate numerical accuracy formally and construct diagnostics for convergence in the normal linear model with informative priors, and in the Tobit-censored regression model.

Accurate Approximations for Posterior Moments and Marginal Densities

These approximations to the posterior means and variances of positive functions of a real or vector-valued parameter, and to the marginal posterior densities of arbitrary parameters can also be used to compute approximate predictive densities.

Using EM to Obtain Asymptotic Variance-Covariance Matrices: The SEM Algorithm

This article defines and illustrates a procedure that obtains numerically stable asymptotic variance-covariance matrices using only the code for computing the complete-data variance-covariance matrix, the code for the expectation-maximization (EM) algorithm, and code for standard matrix operations.

Illustration of Bayesian Inference in Normal Data Models Using Gibbs Sampling

The use of the Gibbs sampler as a method for calculating Bayesian marginal posterior and predictive densities is reviewed and illustrated with a range of normal data models, including…
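For normal data models the full conditionals are available in closed form, which is what makes the Gibbs sampler attractive there. A minimal sketch for the simplest such case, a standard bivariate normal with correlation rho (illustrative only, not one of the cited paper's models):

```python
import math
import random

def gibbs_bivariate_normal(rho, n_iter=5000, seed=0):
    """Gibbs sampler for a standard bivariate normal with correlation rho.

    Each full conditional is itself normal: x | y ~ N(rho * y, 1 - rho^2)
    and symmetrically for y | x, so we alternate exact draws from the two
    conditionals -- no accept/reject step is needed.
    """
    rng = random.Random(seed)
    sd = math.sqrt(1.0 - rho * rho)
    x = y = 0.0
    draws = []
    for _ in range(n_iter):
        x = rng.gauss(rho * y, sd)   # draw x from its full conditional
        y = rng.gauss(rho * x, sd)   # draw y from its full conditional
        draws.append((x, y))
    return draws

# The empirical marginals and correlation of the draws approximate the target
pairs = gibbs_bivariate_normal(0.8, n_iter=10000)
```

The same alternation extends to the hierarchical normal models reviewed in the paper, where each parameter's conditional given the rest is again normal or inverse-gamma.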