Elements of Sequential Monte Carlo

@article{Naesseth2019ElementsOS,
  title   = {Elements of Sequential Monte Carlo},
  author  = {Christian A. Naesseth and Fredrik Lindsten and Thomas B. Sch{\"o}n},
  journal = {arXiv preprint arXiv:1903.04797},
  year    = {2019}
}
A core problem in statistics and probabilistic machine learning is to compute probability distributions and expectations. This is the fundamental problem of Bayesian statistics and machine learning, which frames all inference as expectations with respect to the posterior distribution. The key challenge is to approximate these intractable expectations. In this tutorial, we review sequential Monte Carlo (SMC), a random-sampling-based class of methods for approximate inference. First, we explain…
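To make the SMC idea in the abstract concrete, here is a minimal sketch of the bootstrap particle filter, the canonical SMC algorithm the tutorial builds from. The scalar linear-Gaussian model, parameter values, and function name are illustrative assumptions, not taken from the paper.

```python
# Minimal bootstrap particle filter for an assumed linear-Gaussian
# state-space model: x_t = a*x_{t-1} + v_t,  y_t = x_t + e_t.
import numpy as np

def bootstrap_pf(y, n_particles=500, a=0.9, q=1.0, r=0.5, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 1.0, n_particles)   # initial particles from the prior
    log_z = 0.0                             # log of the SMC evidence estimate
    means = np.empty(len(y))
    for t, y_t in enumerate(y):
        # Propagate each particle through the transition density.
        x = a * x + rng.normal(0.0, np.sqrt(q), n_particles)
        # Weight particles by the observation likelihood N(y_t | x_t, r).
        logw = -0.5 * (y_t - x) ** 2 / r - 0.5 * np.log(2 * np.pi * r)
        m = logw.max()
        w = np.exp(logw - m)
        log_z += m + np.log(w.mean())       # accumulate log p(y_t | y_{1:t-1})
        w /= w.sum()
        means[t] = np.sum(w * x)            # filtering mean E[x_t | y_{1:t}]
        # Multinomial resampling: duplicate likely particles, drop unlikely ones.
        x = rng.choice(x, size=n_particles, p=w)
    return means, log_z
```

Run on data simulated from the same model, the filtering means track the Kalman filter closely, and exp(log_z) is the standard SMC estimator of the marginal likelihood (unbiased for the evidence itself, though not for its logarithm).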
Citations

A Sequential Marginal Likelihood Approximation Using Stochastic Gradients
Existing algorithms like nested sampling and annealed importance sampling can produce accurate estimates of the marginal likelihood of a model, but tend to scale poorly to large data sets. …
Stochastic Gradient Annealed Importance Sampling for Efficient Online Marginal Likelihood Estimation
The resulting stochastic gradient annealed importance sampling (SGAIS) technique enables the marginal likelihood of a number of models to be estimated considerably faster than with traditional approaches, with no noticeable loss of accuracy.
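For context on the two entries above, plain annealed importance sampling, the baseline that SGAIS accelerates with stochastic gradients, fits in a short function. The stochastic-gradient element that is the papers' contribution is not reproduced here, and the Metropolis step size and temperature schedule are illustrative assumptions.

```python
# Plain annealed importance sampling (AIS): anneal from the prior (beta = 0)
# to the posterior (beta = 1), accumulating importance weights along the way.
import numpy as np

def ais_log_evidence(log_prior, log_like, sample_prior,
                     n_samples=200, n_temps=50, step=0.5, seed=0):
    rng = np.random.default_rng(seed)
    betas = np.linspace(0.0, 1.0, n_temps + 1)
    x = sample_prior(rng, n_samples)          # one chain per sample, vectorized
    logw = np.zeros(n_samples)
    for b_prev, b in zip(betas[:-1], betas[1:]):
        logw += (b - b_prev) * log_like(x)    # importance-weight update
        # One random-walk Metropolis move targeting p(x) * L(x)**b.
        prop = x + step * rng.normal(size=n_samples)
        log_acc = (log_prior(prop) + b * log_like(prop)
                   - log_prior(x) - b * log_like(x))
        accept = np.log(rng.uniform(size=n_samples)) < log_acc
        x = np.where(accept, prop, x)
    m = logw.max()
    return m + np.log(np.mean(np.exp(logw - m)))   # log-mean-exp of the weights
```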
Variational Combinatorial Sequential Monte Carlo Methods for Bayesian Phylogenetic Inference
Introduces Variational Combinatorial Sequential Monte Carlo (VCSMC), a powerful framework that establishes variational sequential search to learn distributions over intricate combinatorial structures, and uses it to define a second objective, VNCSMC, which yields tighter lower bounds than VCSMC.
Integrals over Gaussians under Linear Domain Constraints
Presents an efficient black-box algorithm that exploits geometry to estimate integrals over a small, truncated Gaussian volume, and to simulate from it, using the Holmes-Diaconis-Ross (HDR) method combined with an analytic version of elliptical slice sampling (ESS).
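The analytic ESS variant and HDR construction of that paper are not reproduced here, but standard elliptical slice sampling (Murray et al., 2010), which it builds on, is short enough to sketch; a zero-mean Gaussian prior with a precomputed Cholesky factor is the usual assumption.

```python
# One transition of elliptical slice sampling for a target proportional to
# N(f; 0, Sigma) * L(f). The theta = 0 point (the current state) always lies
# on the slice, so the bracket-shrinking loop terminates.
import numpy as np

def elliptical_slice_step(f, log_like, chol_sigma, rng):
    nu = chol_sigma @ rng.normal(size=f.shape)    # auxiliary draw from the prior
    log_y = log_like(f) + np.log(rng.uniform())   # slice threshold
    theta = rng.uniform(0.0, 2.0 * np.pi)
    lo, hi = theta - 2.0 * np.pi, theta
    while True:
        prop = f * np.cos(theta) + nu * np.sin(theta)  # point on the ellipse
        if log_like(prop) > log_y:
            return prop                                # on the slice: accept
        # Off the slice: shrink the bracket towards theta = 0 and retry.
        if theta < 0.0:
            lo = theta
        else:
            hi = theta
        theta = rng.uniform(lo, hi)
```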
Correctness of Sequential Monte Carlo Inference for Probabilistic Programming Languages
Gives a correctness proof for SMC methods in the context of an expressive PPL calculus representative of popular PPLs such as WebPPL, Anglican, and Birch, and extends an untyped PPL lambda calculus and its operational semantics to include explicit resample terms.
Nested Variational Inference
Develops NVI, a family of methods that learn proposals for nested importance samplers by minimizing a forward or reverse KL divergence at each level of nesting; optimizing nested objectives is observed to improve sample quality in terms of log average weight and effective sample size.
Variational Objectives for Markovian Dynamics with Backward Simulation
Introduces Particle Smoothing Variational Objectives (SVO), a novel backward-simulation technique and variational objective constructed from a smoothed approximate posterior, which consistently outperforms filtered objectives when given fewer Monte Carlo samples.
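Backward simulation, the ingredient SVO adds on top of filtering, draws a smoothed trajectory by reweighting stored forward-filter particles with the transition density while walking backwards in time. The sketch below assumes the same scalar linear-Gaussian transition as the particle filter above and is illustrative only.

```python
# Sample one smoothed trajectory by backward simulation (FFBSi) from stored
# forward particle-filter output: particles and log-weights of shape (T, N).
import numpy as np

def _normalize(logw):
    w = np.exp(logw - logw.max())
    return w / w.sum()

def backward_simulate(particles, logw, a=0.9, q=1.0, seed=0):
    rng = np.random.default_rng(seed)
    T, N = particles.shape
    traj = np.empty(T)
    j = rng.choice(N, p=_normalize(logw[-1]))      # start from the final marginal
    traj[-1] = particles[-1, j]
    for t in range(T - 2, -1, -1):
        # Reweight time-t particles by the transition density f(x_{t+1} | x_t).
        log_trans = -0.5 * (traj[t + 1] - a * particles[t]) ** 2 / q
        j = rng.choice(N, p=_normalize(logw[t] + log_trans))
        traj[t] = particles[t, j]
    return traj
```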
Universal probabilistic programming offers a powerful approach to statistical phylogenetics
Develops automated generation of sequential Monte Carlo algorithms for PPL descriptions of arbitrary biological diversification (birth-death) models, and shows that few hurdles remain before these techniques can be effectively applied to the full range of phylogenetic models.
Ensemble Kalman Variational Objectives: Nonlinear Latent Trajectory Inference with A Hybrid of Variational Inference and Ensemble Kalman Filter
Demonstrates that EnKOs outperform SMC-based methods in predictive ability on three benchmark nonlinear dynamical systems and can identify the latent dynamics with fewer particles because of their rich particle diversity.
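For reference, the analysis step of the stochastic ensemble Kalman filter that EnKOs build on is sketched below; a linear observation operator H and Gaussian observation noise with covariance R are the standard EnKF assumptions, and this is not the EnKO objective itself.

```python
# Stochastic EnKF analysis step with perturbed observations.
import numpy as np

def enkf_update(ensemble, y, H, R, rng):
    # ensemble: (n_members, dim_state); y: (dim_obs,); H: (dim_obs, dim_state).
    n = ensemble.shape[0]
    anomalies = ensemble - ensemble.mean(axis=0)
    C = anomalies.T @ anomalies / (n - 1)        # sample state covariance
    S = H @ C @ H.T + R                          # innovation covariance
    K = np.linalg.solve(S, H @ C).T              # Kalman gain C H^T S^{-1}
    # Perturbing the observations keeps the analysis ensemble spread correct.
    y_pert = y + rng.multivariate_normal(np.zeros(len(y)), R, size=n)
    return ensemble + (y_pert - ensemble @ H.T) @ K.T
```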
Exploring Probability Measures with Markov Processes
Develops a transparent characterisation of how one can construct a PDMP (within the class of trajectorially-reversible processes) which admits the desired invariant measure, and offers actionable recommendations on how these processes should be designed in practice.

References

Showing 1–10 of 123 references.
Variational Sequential Monte Carlo
The VSMC family is a variational family that can approximate the posterior arbitrarily well while still allowing efficient optimization of its parameters; its utility is demonstrated on state-space models, stochastic volatility models for financial data, and deep Markov models of brain neural circuits.
Variational Inference: A Review for Statisticians
Reviews variational inference (VI), a method from machine learning that approximates probability densities through optimization, and derives a variant that uses stochastic optimization to scale up to massive data.
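As a concrete instance of the optimization view taken in that review, the reparameterized Monte Carlo estimate of the ELBO for a one-dimensional Gaussian approximation q(z) = N(mu, sigma^2) takes only a few lines; the setup and names are illustrative, not from the paper.

```python
# Monte Carlo ELBO estimate E_q[log p(x, z) - log q(z)] using the
# reparameterization z = mu + sigma * eps with eps ~ N(0, 1).
import numpy as np

def elbo_estimate(log_joint, mu, sigma, n_samples=1000, seed=0):
    rng = np.random.default_rng(seed)
    eps = rng.normal(size=n_samples)
    z = mu + sigma * eps                        # reparameterized draws from q
    log_q = (-0.5 * ((z - mu) / sigma) ** 2
             - np.log(sigma) - 0.5 * np.log(2.0 * np.pi))
    return np.mean(log_joint(z) - log_q)
```

Stochastic-gradient VI ascends this objective in (mu, sigma), typically evaluating log_joint on minibatches to scale to large data sets.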
Convergence of Sequential Monte Carlo-Based Sampling Methods
Originally designed for state-space models, Sequential Monte Carlo (SMC) methods are now routinely applied in the context of general-purpose Bayesian inference. Traditional analyses of SMC algorithms …
Controlled sequential Monte Carlo
Sequential Monte Carlo methods, also known as particle methods, are a popular set of techniques for approximating high-dimensional probability distributions and their normalizing constants. These …
Graphical model inference: Sequential Monte Carlo meets deterministic approximations
Suggests an efficient sequential Monte Carlo (SMC) algorithm for PGMs that can leverage the output of deterministic inference methods and can be viewed as a post-correction of the biases associated with those methods.
Inference Networks for Sequential Monte Carlo in Graphical Models
Presents a procedure for constructing and learning a structured neural network that represents an inverse factorization of the graphical model, resulting in a conditional density estimator that takes as input particular values of the observed random variables and returns an approximation to the distribution of the latent variables.
Measuring the reliability of MCMC inference with bidirectional Monte Carlo
Extends the recently introduced bidirectional Monte Carlo technique to evaluate MCMC-based posterior inference algorithms, and presents Bounding Divergences with REverse Annealing (BREAD), a protocol for validating the relevance of simulated-data experiments to real datasets, integrated into two probabilistic programming languages: WebPPL and Stan.
Divide-and-Conquer With Sequential Monte Carlo
Proposes a novel class of sequential Monte Carlo algorithms for inference in probabilistic graphical models that adopts a divide-and-conquer approach based on an auxiliary tree-structured decomposition of the model of interest, turning the overall inferential task into a collection of recursively solved subproblems.
Inference for dynamic and latent variable models via iterated, perturbed Bayes maps
Develops a new theoretical framework for iterated filtering; an algorithm supported by this theory displays substantial numerical improvement on the computational challenge of inferring parameters of a partially observed Markov process.
Sequential Monte Carlo as Approximate Sampling: bounds, adaptive resampling via $\infty$-ESS, and an application to Particle Gibbs
Sequential Monte Carlo (SMC) algorithms were originally designed for estimating intractable conditional expectations within state-space models, but are now routinely used to generate approximate …
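The conventional effective sample size that the ∞-ESS of this reference generalizes is a simple function of the normalized weights, and it drives the usual adaptive resampling rule; a minimal sketch:

```python
# Effective sample size from log-weights, and the standard adaptive
# resampling rule: resample when ESS drops below a fraction of N.
import numpy as np

def effective_sample_size(logw):
    w = np.exp(logw - logw.max())
    w /= w.sum()
    return 1.0 / np.sum(w ** 2)   # N for uniform weights, 1 when degenerate

def should_resample(logw, frac=0.5):
    return effective_sample_size(logw) < frac * len(logw)
```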