Path storage in the particle filter

@article{Jacob2015PathSI,
  title={Path storage in the particle filter},
  author={Pierre E. Jacob and Lawrence M. Murray and Sylvain Rubenthaler},
  journal={Statistics and Computing},
  year={2015},
  volume={25},
  pages={487-496}
}
This article considers the problem of storing the paths generated by a particle filter and, more generally, by a sequential Monte Carlo algorithm. It provides a theoretical result bounding the expected memory cost by T + C N log N, where T is the time horizon, N is the number of particles and C is a constant, as well as an efficient algorithm to realise this bound. The theoretical result and the algorithm are illustrated with numerical experiments.
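The paper's own algorithm is not reproduced here, but the core idea of keeping only the surviving paths in a pruned ancestry tree can be sketched in a few lines. The class below is a minimal illustration written for this summary, not the authors' implementation: the names `AncestryTree`, `insert` and `path` are invented, and the structure simply stores one node per retained state with a parent pointer and a count of surviving children, releasing any branch that loses all its descendants after a resampling step. Under the usual resampling assumptions this is what keeps the expected number of stored nodes close to T + C N log N rather than T N.

```python
class AncestryTree:
    """Reference-counted ancestry tree for particle paths (illustrative sketch)."""

    def __init__(self):
        self.parent = {}   # node id -> parent node id (None for roots)
        self.state = {}    # node id -> stored particle state
        self.nchild = {}   # node id -> number of surviving children
        self.leaves = []   # node ids of the current generation
        self._next = 0     # fresh node id counter

    def _release(self, node):
        # Prune a dead branch: walk towards the root while nodes have
        # no surviving children, freeing their storage as we go.
        while node is not None and self.nchild[node] == 0:
            parent = self.parent.pop(node)
            del self.state[node]
            del self.nchild[node]
            if parent is not None:
                self.nchild[parent] -= 1
            node = parent

    def insert(self, states, ancestors=None):
        """Append one generation of particle states.

        ancestors[i] is the index (into the previous generation) of the
        ancestor of particle i, as produced by resampling; it is omitted
        for the initial generation.
        """
        old_leaves, new_leaves = self.leaves, []
        for i, x in enumerate(states):
            node, self._next = self._next, self._next + 1
            parent = None if ancestors is None else old_leaves[ancestors[i]]
            self.parent[node] = parent
            self.state[node] = x
            self.nchild[node] = 0
            if parent is not None:
                self.nchild[parent] += 1
            new_leaves.append(node)
        # Previous-generation particles that attracted no offspring are dead.
        for node in old_leaves:
            if self.nchild[node] == 0:
                self._release(node)
        self.leaves = new_leaves

    def path(self, i):
        """Trace the stored path of current particle i back to its root."""
        node, out = self.leaves[i], []
        while node is not None:
            out.append(self.state[node])
            node = self.parent[node]
        return out[::-1]
```

In a filter run over T steps one would call `insert(x_t, ancestors=a_t)` once per step; the memory held by the tree then tracks the coalesced trunk of length roughly T plus the recent, not-yet-coalesced portion of the genealogy, whose expected size is the quantity the article bounds.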
Particle Gibbs with refreshed backward simulation
TLDR
This paper shows how a simple modification of the backward-simulation scheme used within particle Gibbs, referred to as refreshed backward simulation, can further improve the mixing of the Markov chain by sampling new state values simultaneously with the corresponding ancestor indices.
A Projection-Based Rao-Blackwellized Particle Filter to Estimate Parameters in Conditionally Conjugate State-Space Models
  • Milan Papez, 2018 IEEE Statistical Signal Processing Workshop (SSP)
TLDR
This paper proposes a simple and efficient method for online estimation of static parameters within the Rao-Blackwellized particle filtering framework; the method is experimentally shown to suffer less from the well-known particle path degeneracy problem.
Particle Filters and Data Assimilation
TLDR
The challenges posed by models with high-dimensional states, joint estimation of parameters and the state, and inference for the history of the state process are discussed, including methods based on the particle filter and the ensemble Kalman filter.
Importance Densities for Particle Filtering Using Iterated Conditional Expectations
TLDR
Simulation results show that the proposed method, which is based on generalized statistical linear regression and posterior linearization using conditional expectations, outperforms the compared methods in terms of effective sample size and provides a better local approximation of the optimal importance density.
Particle Gibbs with Ancestor Sampling for Probabilistic Programs
TLDR
A formalism is developed to adapt ancestor resampling, a technique that mitigates particle degeneracy, to the probabilistic programming setting, and empirical results are presented that demonstrate nontrivial performance gains.
Coupling of Particle Filters
Particle filters provide Monte Carlo approximations of intractable quantities such as point-wise evaluations of the likelihood in state space models. In many scenarios, the interest lies in the ...
Particle-Based Adaptive-Lag Online Marginal Smoothing in General State-Space Models
We present a novel algorithm, an adaptive-lag smoother, approximating efficiently, in an online fashion, sequences of expectations under the marginal smoothing distributions in general state-space models.
Numerically stable online estimation of variance in particle filters
This paper discusses variance estimation in sequential Monte Carlo methods, alternatively termed particle filters. The variance estimator that we propose is a natural modification of that suggested ...
On particle Gibbs sampling
TLDR
This paper presents a coupling construction between two particle Gibbs updates from different starting points and shows that the coupling probability may be made arbitrarily close to one by increasing the number of particles, and extends particle Gibbs to work with lower variance resampling schemes.
Particle-based online estimation of tangent filters with application to parameter estimation in nonlinear state-space models
TLDR
A novel algorithm for efficient online estimation of the filter derivatives in general hidden Markov models is presented, together with a central limit theorem whose asymptotic variance can be shown to be uniformly bounded in time.
...

References

Showing 1-10 of 41 references
A Tutorial on Particle Filtering and Smoothing: Fifteen years later
TLDR
A complete, up-to-date survey of particle filtering methods as of 2008, including basic and advanced particle methods for filtering as well as smoothing.
Improved particle filter for nonlinear problems
The Kalman filter provides an effective solution to the linear Gaussian filtering problem. However, where there is nonlinearity, either in the model specification or the observation process, other ...
Particle approximations of the score and observed information matrix in state space models with application to parameter estimation
TLDR
This work presents two particle algorithms to compute the score vector and observed information matrix recursively in nonlinear non-Gaussian state space models and shows how both methods can be used to perform batch and recursive parameter estimation.
On the Particle Gibbs Sampler
The particle Gibbs sampler is a Markov chain Monte Carlo (MCMC) algorithm which operates on the extended space of the auxiliary variables generated by an interacting particle system. In particular, ...
Rethinking resampling in the particle filter on graphics processing units
TLDR
This work presents and compares a number of resampling algorithms, including rarely used alternatives based on Metropolis and rejection sampling, and finds that these alternative approaches run significantly faster on the GPU than more common approaches such as the multinomial, stratified and systematic resamplers.
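For context on the "more common approaches" mentioned in this entry, a generic systematic resampler can be written in a few lines. The sketch below is a standard textbook version assuming normalized weights; it is not code from the cited paper.

```python
import numpy as np

def systematic_resample(weights, rng=None):
    """Systematic resampling: one uniform draw shifts N evenly spaced positions.

    weights must be non-negative and sum to one; returns N ancestor indices.
    """
    rng = rng or np.random.default_rng()
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n
    cumulative = np.cumsum(weights)
    cumulative[-1] = 1.0  # guard against floating-point round-off
    return np.searchsorted(cumulative, positions)
```

Each index is returned a number of times roughly proportional to its weight, with lower variance than multinomial resampling; the scheme relies on a cumulative sum over all weights, which is one reason the per-particle Metropolis and rejection schemes compared above can be faster on a GPU.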
SMC2: an efficient algorithm for sequential analysis of state space models
TLDR
The SMC2 algorithm is proposed: a sequential Monte Carlo algorithm defined in the θ-dimension which propagates and resamples many particle filters in the x-dimension; the paper explores the applicability of the algorithm in both sequential and non-sequential settings and considers its various degrees of freedom.
Ancestor Sampling for Particle Gibbs
TLDR
This work presents a novel method in the family of particle MCMC methods, referred to as particle Gibbs with ancestor sampling (PG-AS), and develops a truncation strategy for the models considered that is applicable in principle to any backward-simulation-based method but is particularly well suited to the PG-AS framework.
Novel approach to nonlinear/non-Gaussian Bayesian state estimation
TLDR
An algorithm, the bootstrap filter, is proposed for implementing recursive Bayesian filters; the required density of the state is represented as a set of random samples, which are updated and propagated by the algorithm.
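Since several entries above refer back to it, here is a minimal generic bootstrap filter loop. It is an illustrative sketch only: the helper callables `sample_initial`, `sample_transition` and `log_likelihood` are hypothetical stand-ins for a user's model, and the code is not taken from the cited paper.

```python
import numpy as np

def bootstrap_filter(observations, sample_initial, sample_transition,
                     log_likelihood, n_particles=1000, rng=None):
    """Generic bootstrap particle filter (sketch).

    sample_initial(rng, n)     -> array of n initial states
    sample_transition(rng, x)  -> array of propagated states, same shape as x
    log_likelihood(y, x)       -> array of log observation densities
    Returns weighted particles at the final time and an estimate of log p(y_{1:T}).
    """
    rng = rng or np.random.default_rng()
    x = sample_initial(rng, n_particles)
    w = np.full(n_particles, 1.0 / n_particles)
    log_evidence = 0.0
    for t, y in enumerate(observations):
        if t > 0:
            # multinomial resampling followed by propagation through the dynamics
            ancestors = rng.choice(n_particles, size=n_particles, p=w)
            x = sample_transition(rng, x[ancestors])
        logw = log_likelihood(y, x)
        shift = logw.max()
        unnorm = np.exp(logw - shift)
        log_evidence += shift + np.log(unnorm.mean())  # log p(y_t | y_{1:t-1}) estimate
        w = unnorm / unnorm.sum()
    return x, w, log_evidence
```

Retaining the full paths x_{1:T} for every particle in such a loop is exactly where the storage question of the main article arises: the naive approach stores T times N states, whereas the pruned-tree sketch given earlier keeps only the surviving genealogy.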
Long-term stability of sequential Monte Carlo methods under verifiable conditions
This paper discusses particle filtering in general hidden Markov models (HMMs) and presents novel theoretical results on the long-term stability of bootstrap-type particle filters. More specifically, ...
Stability properties of some particle filters
Under multiplicative drift and other regularity conditions, it is established that the asymptotic variance associated with a particle filter approximation of the prediction filter is bounded ...
...