The Neural Particle Filter

@inproceedings{Kutschireiter2015TheNP,
  title={The Neural Particle Filter},
  author={Anna Kutschireiter and Simone Carlo Surace and Henning Sprekeler and Jean-Pascal Pfister},
  year={2015}
}
The robust estimation of dynamically changing features, such as the position of prey, is one of the hallmarks of perception. On an abstract, algorithmic level, nonlinear Bayesian filtering, i.e. the estimation of temporally changing signals based on the history of observations, provides a mathematical framework for dynamic perception in real time. Since the general, nonlinear filtering problem is analytically intractable, particle filters are considered among the most powerful approaches to… 
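The particle-filter framework the abstract refers to can be illustrated with a minimal bootstrap particle filter. The 1-D linear-Gaussian toy model below (transition gain 0.9, noise scales 0.5 and 0.3, particle count `N`) is a hypothetical example for illustration, not a model from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy model (illustrative only):
#   hidden state:  x_t = 0.9 * x_{t-1} + process noise
#   observation:   y_t = x_t + observation noise
N, T = 500, 50  # particles, time steps

# Simulate a ground-truth trajectory and noisy observations.
x_true = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    x_true[t] = 0.9 * x_true[t - 1] + rng.normal(scale=0.5)
    y[t] = x_true[t] + rng.normal(scale=0.3)

# Bootstrap particle filter: propagate, weight, resample.
particles = rng.normal(size=N)
estimates = np.zeros(T)
for t in range(T):
    # Propagate each particle through the transition model.
    particles = 0.9 * particles + rng.normal(scale=0.5, size=N)
    # Weight particles by the observation likelihood, then normalize.
    w = np.exp(-0.5 * ((y[t] - particles) / 0.3) ** 2)
    w /= w.sum()
    estimates[t] = np.sum(w * particles)  # posterior-mean estimate
    # Multinomial resampling to combat weight degeneracy.
    particles = particles[rng.choice(N, size=N, p=w)]
```

In higher-dimensional state spaces this weight-and-resample scheme degrades (the curse of dimensionality discussed in the Daum & Huang reference below), which motivates alternatives such as the neural particle filter proposed here.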
Neural Kalman Filtering
TLDR
It is shown that a gradient-descent approximation to the Kalman filter requires only local computations with variance-weighted prediction errors, and that the dynamics model can be learned under the same scheme with a learning rule that corresponds directly to Hebbian plasticity.
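The gradient-descent idea in this summary can be sketched for a scalar linear-Gaussian model: instead of the closed-form Kalman update, the posterior mean is reached by iterating local updates driven by variance-weighted prediction errors. The model constants and the `gd_update` helper are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

# Hypothetical scalar model (illustrative only):
A, C = 0.9, 1.0            # transition and observation gains
var_x, var_y = 0.25, 0.09  # process and observation noise variances

def gd_update(mu_prev, y, steps=200, lr=0.05):
    """One filtering step by gradient descent on the two
    variance-weighted prediction errors, in place of the
    closed-form Kalman update."""
    mu = A * mu_prev  # start at the prior prediction
    for _ in range(steps):
        eps_y = (y - C * mu) / var_y        # sensory prediction error
        eps_x = (mu - A * mu_prev) / var_x  # dynamics prediction error
        mu += lr * (C * eps_y - eps_x)      # local, error-driven update
    return mu
```

At convergence the fixed point is the precision-weighted combination of prior prediction and observation, i.e. the same posterior mean the Kalman update would give for this prior variance.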
Accelerated Physical Emulation of Bayesian Inference in Spiking Neural Networks
TLDR
This work presents a spiking network model that performs Bayesian inference through sampling on the BrainScaleS neuromorphic platform, where it is used for generative and discriminative computations on visual data and implicitly demonstrates its robustness to various substrate-specific distortive effects.
A neurally plausible model for online recognition and postdiction
TLDR
A general framework for neural probabilistic inference in dynamic models based on the distributed distributional code (DDC) representation of uncertainty is proposed, naturally extending the underlying encoding to incorporate implicit probabilistic beliefs about both present and past.

References

Showing 1-10 of 45 references
A Neural Implementation of the Kalman Filter
TLDR
This paper focuses on the Bayesian filtering of stochastic time series and introduces a novel neural network, derived from a line attractor architecture, whose dynamics map directly onto those of the Kalman filter in the limit of small prediction error.
Implementing a Bayes Filter in a Neural Circuit: The Case of Unknown Stimulus Dynamics
TLDR
This note presents a method for training a theoretical neural circuit to approximately implement a Bayes filter when the stimulus dynamics are unknown; the circuit is trained by stochastic gradient descent on the negative log-likelihood of the neural network parameters, using a novel approximation of the gradient.
Fast Sampling-Based Inference in Balanced Neuronal Networks
TLDR
This work shows analytically and through simulations that the symmetry of the synaptic weight matrix implied by LS yields critically slow mixing when the posterior is high-dimensional, and constructs and inspect networks that are optimally fast, and hence orders of magnitude faster than LS, while being far more biologically plausible.
Bayesian Inference and Online Learning in Poisson Neuronal Networks
TLDR
This work shows how a two-layer recurrent network of Poisson neurons can perform both approximate Bayesian inference and learning for any hidden Markov model, and demonstrates how the network can learn the likelihood model, as well as the transition probabilities underlying the dynamics, using a Hebbian learning rule.
Optimal Sensorimotor Integration in Recurrent Cortical Networks: A Neural Implementation of Kalman Filters
TLDR
It is proposed that the neural implementation of this Kalman filter involves recurrent basis function networks with attractor dynamics, a kind of architecture that can be readily mapped onto cortical circuits.
Ensembles of Spiking Neurons with Noise Support Optimal Probabilistic Inference in a Dynamically Changing Environment
TLDR
The viability of this new approach towards neural coding and computation, which makes use of the inherent parallelism of generic neural circuits, is demonstrated by showing that this model can explain experimentally observed firing activity of cortical neurons for a variety of tasks that require rapid temporal integration of sensory information.
Exact Inferences in a Neural Implementation of a Hidden Markov Model
From first principles, we derive a quadratic nonlinear, first-order dynamical system capable of performing exact Bayes-Markov inferences for a wide class of biologically plausible stimulus-dependent…
A Tutorial on Particle Filtering and Smoothing: Fifteen years later
TLDR
A complete survey of particle filtering methods as of 2008, covering basic and advanced particle methods for filtering as well as smoothing.
Curse of dimensionality and particle filters
  • F. Daum, J. Huang
  • Computer Science
    2003 IEEE Aerospace Conference Proceedings (Cat. No.03TH8652)
  • 2003
TLDR
A simple back-of-the-envelope formula is derived that explains why a carefully designed PF should mitigate the curse of dimensionality for certain filtering problems, but the PF does not avoid the curse of dimensionality in general.
Recursive neural filters and dynamical range transformers
  • J. Lo, Lei Yu
  • Computer Science
    Proceedings of the IEEE
  • 2004
TLDR
This paper reviews and presents new results on recursive neural filters, introducing filters based on a multilayer perceptron with output feedback, and proposes dynamical range reducers and extenders for applications where the range of the measurement or signal process expands over time or is too large for the recurrent neural network to handle at the required filtering resolution.