# Elements of Sequential Monte Carlo

@article{Naesseth2019ElementsOS,
title={Elements of Sequential Monte Carlo},
author={C. A. Naesseth and F. Lindsten and T. B. Sch{\"o}n},
journal={ArXiv},
year={2019},
volume={abs/1903.04797}
}
• Published 2019
• Mathematics, Computer Science
• ArXiv
A core problem in statistics and probabilistic machine learning is to compute probability distributions and expectations. This is the fundamental problem of Bayesian statistics and machine learning, which frames all inference as expectations with respect to the posterior distribution. The key challenge is to approximate these intractable expectations. In this tutorial, we review sequential Monte Carlo (SMC), a random-sampling-based class of methods for approximate inference. First, we explain…
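The abstract's core idea, approximating intractable posterior expectations by random sampling, can be illustrated with a bootstrap particle filter, the simplest SMC algorithm. The scalar linear-Gaussian state-space model and all parameters below are an illustrative choice for this sketch, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear-Gaussian state-space model (illustrative only):
#   x_t = 0.9 * x_{t-1} + N(0, 1),   y_t = x_t + N(0, 0.25)
T, N = 50, 500                       # time steps, number of particles
xs = np.zeros(T)
for t in range(1, T):
    xs[t] = 0.9 * xs[t - 1] + rng.normal(scale=1.0)
ys = xs + rng.normal(scale=0.5, size=T)

# Bootstrap particle filter: propagate, reweight by likelihood, resample.
particles = rng.normal(size=N)
log_Z = 0.0                          # running log marginal likelihood estimate
means = np.zeros(T)                  # filtering-mean estimates E[x_t | y_{1:t}]
for t in range(T):
    if t > 0:
        particles = 0.9 * particles + rng.normal(scale=1.0, size=N)
    logw = -0.5 * ((ys[t] - particles) / 0.5) ** 2   # Gaussian log-lik (up to const)
    m = logw.max()
    w = np.exp(logw - m)
    log_Z += m + np.log(w.mean())
    w /= w.sum()
    means[t] = np.dot(w, particles)                  # weighted posterior mean
    particles = particles[rng.choice(N, size=N, p=w)]  # multinomial resampling

print(means[-1], log_Z)
```

The weighted particle average at each step is exactly the kind of expectation-with-respect-to-the-posterior approximation the abstract describes.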
## A Sequential Marginal Likelihood Approximation Using Stochastic Gradients
• Mathematics
• 2019
Existing algorithms like nested sampling and annealed importance sampling are able to produce accurate estimates of the marginal likelihood of a model, but tend to scale poorly to large data sets.
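Annealed importance sampling, mentioned above, can be sketched in a few lines: samples move from a tractable prior toward an unnormalized target through a sequence of intermediate distributions, accumulating importance weights whose average estimates the normalizing constant. The toy one-dimensional target below, with Z = √π known in closed form, is a hypothetical example for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy AIS: estimate Z = ∫ exp(-x²) dx = √π, annealing from a N(0,1) prior
# to the unnormalized target exp(-x²) along a geometric path.
def log_prior(x):
    return -0.5 * x**2 - 0.5 * np.log(2 * np.pi)

def log_target(x):                       # unnormalized target density
    return -x**2

betas = np.linspace(0.0, 1.0, 51)        # annealing schedule
N = 2000
x = rng.normal(size=N)                   # exact samples from the prior
logw = np.zeros(N)
for b_prev, b in zip(betas[:-1], betas[1:]):
    # incremental importance weight for moving one temperature step
    logw += (b - b_prev) * (log_target(x) - log_prior(x))

    # one Metropolis-Hastings move targeting the intermediate distribution
    def log_f(z):
        return (1 - b) * log_prior(z) + b * log_target(z)
    prop = x + rng.normal(scale=0.5, size=N)
    accept = np.log(rng.uniform(size=N)) < log_f(prop) - log_f(x)
    x = np.where(accept, prop, x)

# Self-normalized estimate of Z via a stable log-sum-exp average
Z_hat = np.exp(logw.max()) * np.mean(np.exp(logw - logw.max()))
print(Z_hat, np.sqrt(np.pi))             # estimate vs. true value ≈ 1.7725
```

The SGAIS paper below replaces the full-data likelihood in these weight increments with stochastic-gradient-style minibatch estimates to make the scheme scale to large data sets.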
## Stochastic Gradient Annealed Importance Sampling for Efficient Online Marginal Likelihood Estimation †
• Mathematics, Computer Science
• Entropy
• 2019
The resulting stochastic gradient annealed importance sampling (SGAIS) technique enables us to estimate the marginal likelihood of a number of models considerably faster than traditional approaches, with no noticeable loss of accuracy.
## Variational Combinatorial Sequential Monte Carlo Methods for Bayesian Phylogenetic Inference
• Computer Science, Mathematics
• ArXiv
• 2021
Variational Combinatorial Sequential Monte Carlo (VCSMC), a framework that uses variational sequential search to learn distributions over intricate combinatorial structures, is introduced; it is also used to define a second objective, VNCSMC, which yields tighter lower bounds than VCSMC.
## Integrals over Gaussians under Linear Domain Constraints
• Computer Science, Mathematics
• AISTATS
• 2020
An efficient black-box algorithm that exploits geometry to estimate integrals over a small, truncated Gaussian volume, and to simulate from it, using the Holmes-Diaconis-Ross (HDR) method combined with an analytic version of elliptical slice sampling (ESS).
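Elliptical slice sampling with an indicator "likelihood" gives a compact way to sample a Gaussian restricted to a linear constraint, the basic ingredient the paper above builds on. The half-space constraint and all parameters below are a simplified stand-in for illustration, not the paper's analytic ESS:

```python
import numpy as np

rng = np.random.default_rng(2)

# ESS with an indicator likelihood: draws from a standard 2-D Gaussian
# truncated to the half-space x[0] > 0 (a toy linear domain constraint).
def in_domain(x):
    return x[0] > 0.0

def ess_step(x, rng):
    nu = rng.normal(size=x.shape)        # auxiliary draw from the Gaussian prior
    theta = rng.uniform(0, 2 * np.pi)
    lo, hi = theta - 2 * np.pi, theta
    while True:
        prop = x * np.cos(theta) + nu * np.sin(theta)
        if in_domain(prop):              # indicator likelihood: accept iff inside
            return prop
        # shrink the angle bracket toward 0, where prop == x (always valid)
        if theta < 0:
            lo = theta
        else:
            hi = theta
        theta = rng.uniform(lo, hi)

x = np.array([1.0, 0.0])                 # start inside the constraint
samples = []
for _ in range(5000):
    x = ess_step(x, rng)
    samples.append(x.copy())
samples = np.array(samples[500:])        # drop burn-in
print(samples[:, 0].mean())              # should approach E[x₀ | x₀ > 0] = √(2/π)
```

Every point on the ellipse through the current state and the auxiliary draw is a valid Gaussian sample, so the shrinking bracket only needs to find an angle satisfying the constraint; the paper's contribution is doing this intersection analytically rather than by rejection.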
## Correctness of Sequential Monte Carlo Inference for Probabilistic Programming Languages
• Computer Science
• ESOP
• 2021
A correctness proof for SMC methods is given in the context of an expressive PPL calculus, representative of popular PPLs such as WebPPL, Anglican, and Birch; an untyped PPL lambda calculus and operational semantics are also extended to include explicit resample terms.
## Nested Variational Inference
• Computer Science, Mathematics
• ArXiv
• 2021
NVI is developed, a family of methods that learn proposals for nested importance samplers by minimizing a forward or reverse KL divergence at each level of nesting; optimizing nested objectives is observed to improve sample quality in terms of log average weight and effective sample size.
## Variational Objectives for Markovian Dynamics with Backward Simulation
• Computer Science
• ECAI
• 2020
Particle Smoothing Variational Objectives (SVO) is introduced, a novel backward-simulation technique and variational objective constructed from a smoothed approximate posterior that consistently outperforms filtered objectives when given fewer Monte Carlo samples.
## Universal probabilistic programming offers a powerful approach to statistical phylogenetics
This work develops automated generation of sequential Monte Carlo algorithms for PPL descriptions of arbitrary biological diversification (birth-death) models, and shows that few hurdles remain before these techniques can be effectively applied to the full range of phylogenetic models.
## Ensemble Kalman Variational Objectives: Nonlinear Latent Trajectory Inference with A Hybrid of Variational Inference and Ensemble Kalman Filter
• Computer Science, Mathematics
• ArXiv
• 2020
It is demonstrated that EnKOs outperform SMC-based methods in predictive ability on three benchmark nonlinear dynamical systems tasks, and that they can identify the latent dynamics with fewer particles because of their rich particle diversity.
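The ensemble Kalman filter underlying EnKOs replaces the exact Kalman gain with one computed from ensemble statistics. The scalar, perturbed-observation analysis step below is a generic textbook sketch with made-up numbers, not the paper's EnKO objective:

```python
import numpy as np

rng = np.random.default_rng(3)

# One stochastic EnKF analysis step for a scalar state observed directly
# (H = identity). All numbers are hypothetical, chosen for illustration.
N = 1000
ens = rng.normal(loc=0.0, scale=2.0, size=N)   # forecast ensemble ~ N(0, 4)
R = 1.0                                        # observation noise variance
y = 1.5                                        # observed value

P = ens.var(ddof=1)                            # sample forecast variance
K = P / (P + R)                                # ensemble Kalman gain
perturbed = y + rng.normal(scale=np.sqrt(R), size=N)   # perturbed observations
analysis = ens + K * (perturbed - ens)         # updated (analysis) ensemble

# For comparison, the exact Gaussian posterior here has
# mean K*y = 1.2 and variance (1-K)*P = 0.8 (with true P = 4).
print(analysis.mean(), analysis.var(ddof=1))
```

Because every ensemble member is moved toward the observation rather than reweighted, the update keeps all members distinct; this is the "rich particle diversity" the abstract credits for working with fewer particles than SMC.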
## Exploring Probability Measures with Markov Processes
A transparent characterisation of how one can construct a PDMP (within the class of trajectorially-reversible processes) that admits the desired invariant measure is developed, and actionable recommendations on how these processes should be designed in practice are offered.