Sequential Monte Carlo samplers

@article{Moral2002SequentialMC,
  title={Sequential Monte Carlo samplers},
  author={Pierre Del Moral and Arnaud Doucet},
  journal={Journal of the Royal Statistical Society: Series B (Statistical Methodology)},
  year={2002},
  volume={68}
}
Summary.  We propose a methodology to sample sequentially from a sequence of probability distributions that are defined on a common space, each distribution being known up to a normalizing constant. These probability distributions are approximated by a cloud of weighted random samples which are propagated over time by using sequential Monte Carlo methods. This methodology allows us to derive simple algorithms to make parallel Markov chain Monte Carlo algorithms interact to perform global… 
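The scheme described in the abstract — a cloud of weighted samples propagated through a sequence of unnormalized distributions, with resampling and MCMC moves — can be sketched as follows. This is a minimal one-dimensional illustration, assuming a linear tempering path from a standard-normal reference to the target; the target, function names, and parameters are all invented for the example:

```python
import numpy as np

def log_gauss_mix(x):
    # Log-density of the example target: 0.5*N(-3, 0.5^2) + 0.5*N(3, 0.5^2) (normalized).
    a = -0.5 * ((x + 3.0) / 0.5) ** 2
    b = -0.5 * ((x - 3.0) / 0.5) ** 2
    return np.logaddexp(a, b) + np.log(0.5 / (0.5 * np.sqrt(2.0 * np.pi)))

def smc_sampler(log_target, n=2000, n_temps=30, seed=0):
    """Minimal SMC sampler along pi_b ∝ N(0,1)^(1-b) * exp(log_target)^b."""
    rng = np.random.default_rng(seed)
    log_ref = lambda y: -0.5 * y ** 2       # unnormalized log N(0,1); Z_ref = sqrt(2*pi)
    x = rng.standard_normal(n)              # exact draws from the b = 0 reference
    log_z = 0.0                             # accumulates an estimate of log(Z_target / Z_ref)
    betas = np.linspace(0.0, 1.0, n_temps)
    for b_prev, b in zip(betas[:-1], betas[1:]):
        # Incremental importance weights for moving pi_{b_prev} -> pi_b.
        logw = (b - b_prev) * (log_target(x) - log_ref(x))
        m = logw.max()
        log_z += m + np.log(np.mean(np.exp(logw - m)))
        # Multinomial resampling back to equal weights.
        w = np.exp(logw - m)
        x = x[rng.choice(n, size=n, p=w / w.sum())]
        # One random-walk Metropolis move leaving pi_b invariant.
        log_pi = lambda y, b=b: (1.0 - b) * log_ref(y) + b * log_target(y)
        prop = x + rng.normal(scale=1.0, size=n)
        accept = np.log(rng.uniform(size=n)) < log_pi(prop) - log_pi(x)
        x = np.where(accept, prop, x)
    return x, log_z

samples, log_z = smc_sampler(log_gauss_mix)
# Since the mixture is normalized and the reference N(0,1) is not,
# log_z estimates -log(sqrt(2*pi)) ≈ -0.92.
```

Note how the normalizing-constant ratio falls out of the running weight averages — the distributions only need to be known up to normalization, exactly as in the abstract.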

Sequential Monte Carlo with transformations

This paper uses sequential Monte Carlo samplers to perform Bayesian inference sequentially on a sequence of posteriors over spaces of different dimensions, yielding an extremely flexible and general algorithm for Bayesian model comparison that is suitable for applications where the acceptance rate of reversible jump Markov chain Monte Carlo is low.

Sequentially interacting Markov chain Monte Carlo methods

The proposed Sequentially Interacting Markov Chain Monte Carlo scheme works by generating interacting non-Markovian sequences which behave asymptotically like independent Metropolis-Hastings Markov chains with the desired limiting distributions.

An Invitation to Sequential Monte Carlo Samplers

This article describes sequential Monte Carlo samplers and their possible implementations, arguing that they remain under-used in statistics, despite their ability to perform sequential inference and to leverage parallel processing resources among other potential benefits.

Sequential Monte Carlo for Graphical Models

A new framework for using sequential Monte Carlo (SMC) algorithms for inference in probabilistic graphical models (PGMs) is proposed; a key merit of the SMC sampler is that it provides an unbiased estimate of the model's partition function.

Sequential Markov Chain Monte Carlo

A sequential Markov chain Monte Carlo (SMCMC) algorithm is proposed to sample from a sequence of probability distributions, corresponding to posterior distributions at different times in online applications; it has advantages over sequential Monte Carlo in avoiding particle-degeneracy issues.

Convergence of Monte Carlo distribution estimates from rival samplers

This article addresses how to optimally divide sampling effort between the samplers of the different distributions, and proposes a new Monte Carlo divergence error criterion based on Jensen–Shannon divergence.

A sequential Monte Carlo approach to computing tail probabilities in stochastic models

It is shown how resampling weights can be chosen to yield logarithmically efficient Monte Carlo estimates of large deviation probabilities for multidimensional Markov random walks.

Finite Sample Complexity of Sequential Monte Carlo Estimators

Bounds on the complexity of sequential Monte Carlo approximations are obtained for a variety of target distributions, including finite spaces, product measures, and log-concave distributions such as Bayesian logistic regression.

On sequential Monte Carlo, partial rejection control and approximate Bayesian computation

It is proved that the new sampler can reduce the variance of the incremental importance weights when compared with standard sequential Monte Carlo samplers, and a central limit theorem is provided.
...

References

Showing 1–10 of 49 references

On sequential Monte Carlo sampling methods for Bayesian filtering

An overview of methods for sequential simulation from posterior distributions for discrete-time dynamic models that are typically nonlinear and non-Gaussian, showing how to incorporate local linearisation methods similar to those previously employed in the deterministic filtering literature.

Reversible jump Markov chain Monte Carlo computation and Bayesian model determination

Markov chain Monte Carlo methods for Bayesian computation have until recently been restricted to problems where the joint distribution of all variables has a density with respect to some fixed…

Trans-dimensional Markov chain Monte Carlo

Summary. In the context of sample-based computation of Bayesian posterior distributions in complex stochastic systems, this chapter discusses some of the uses for a Markov chain with a prescribed…

Annealing Markov chain Monte Carlo with applications to ancestral inference

This work proposes MCMC methods distantly related to simulated annealing, which simulate realizations from a sequence of distributions, allowing the distribution being simulated to vary randomly over time.

Sequential Monte Carlo methods for dynamic systems

A general framework for using Monte Carlo methods in dynamic systems and a general use of Rao-Blackwellization is proposed to improve performance and to compare different Monte Carlo procedures.
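The dynamic-systems framework summarized above is commonly instantiated as the bootstrap particle filter. A minimal sketch for a hypothetical scalar model — random-walk state with Gaussian observation noise; the model and all names are chosen purely for illustration:

```python
import numpy as np

def bootstrap_filter(ys, n=1000, q=1.0, r=1.0, seed=0):
    """Bootstrap particle filter for x_t = x_{t-1} + N(0, q), y_t = x_t + N(0, r)."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n)                                    # particles for the known start x_0 = 0
    means = []
    for y in ys:
        x = x + rng.normal(scale=np.sqrt(q), size=n)   # propagate through the dynamics
        logw = -0.5 * (y - x) ** 2 / r                 # Gaussian likelihood weights
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(np.sum(w * x))                    # filtering-mean estimate
        x = x[rng.choice(n, size=n, p=w)]              # multinomial resampling
    return np.array(means)
```

For this linear-Gaussian model the filtering means should track the hidden state with error comparable to the exact Kalman filter's steady-state posterior spread.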

Annealed importance sampling

It is shown how one can use the Markov chain transitions for such an annealing sequence to define an importance sampler, which can be seen as a generalization of a recently-proposed variant of sequential importance sampling.
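As this summary notes, annealed importance sampling builds an importance sampler from the annealing sequence's Markov transitions; unlike an SMC sampler, each particle keeps its own weight and there is no resampling. A minimal sketch under the same hypothetical setup as above (standard-normal reference, linear tempering; names invented):

```python
import numpy as np

def ais_log_weights(log_target, n=4000, n_temps=60, seed=1):
    """Annealed importance sampling from N(0,1) toward exp(log_target).

    Returns per-particle log weights; mean(exp(logw)) is an unbiased
    estimate of Z_target / sqrt(2*pi), the ratio of normalizers.
    """
    rng = np.random.default_rng(seed)
    log_ref = lambda y: -0.5 * y ** 2       # unnormalized log N(0,1)
    x = rng.standard_normal(n)
    logw = np.zeros(n)
    betas = np.linspace(0.0, 1.0, n_temps)
    for b_prev, b in zip(betas[:-1], betas[1:]):
        # Weight update: no resampling, weights accumulate per particle.
        logw += (b - b_prev) * (log_target(x) - log_ref(x))
        # One Metropolis move leaving the current tempered density invariant.
        log_pi = lambda y, b=b: (1.0 - b) * log_ref(y) + b * log_target(y)
        prop = x + rng.normal(scale=1.0, size=n)
        accept = np.log(rng.uniform(size=n)) < log_pi(prop) - log_pi(x)
        x = np.where(accept, prop, x)
    return logw

# Shifted-Gaussian target exp(-0.5*(y-1)^2): its normalizer equals the
# reference's, so the estimated log normalizing-constant ratio is near 0.
logw = ais_log_weights(lambda y: -0.5 * (y - 1.0) ** 2)
m = logw.max()
log_ratio = m + np.log(np.mean(np.exp(logw - m)))
```

Replacing the weight accumulation with per-step resampling recovers an SMC sampler, which is the sense in which the paper above generalizes this construction.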

Following a moving target—Monte Carlo inference for dynamic Bayesian models

This work proposes a new particle-filter technique for tracking moving target distributions, one that does not suffer from progressive degeneration as the target sequence evolves.

Sequential importance sampling for nonparametric Bayes models: The next generation

There are two generations of Gibbs sampling methods for semiparametric models involving the Dirichlet process. The first generation suffered from a severe drawback: the locations of the clusters, or…

Central limit theorem for sequential Monte Carlo methods and its application to Bayesian inference

The term sequential Monte Carlo methods or, equivalently, particle filters, refers to a general class of iterative algorithms that performs Monte Carlo approximations of a given sequence of…

Population Monte Carlo

The population Monte Carlo principle is described: iterated generations of importance samples, with importance functions depending on the previously generated sample. The scheme can be iterated like MCMC algorithms while being more robust to dependence and starting values.