# The Neural Particle Filter

```bibtex
@inproceedings{Kutschireiter2015TheNP,
  title  = {The Neural Particle Filter},
  author = {Anna Kutschireiter and Simone Carlo Surace and Henning Sprekeler and Jean-Pascal Pfister},
  year   = {2015}
}
```

The robust estimation of dynamically changing features, such as the position of prey, is one of the hallmarks of perception. On an abstract, algorithmic level, nonlinear Bayesian filtering, i.e. the estimation of temporally changing signals based on the history of observations, provides a mathematical framework for dynamic perception in real time. Since the general, nonlinear filtering problem is analytically intractable, particle filters are considered among the most powerful approaches to…
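To make the sampling idea behind particle filtering concrete, here is a minimal bootstrap (sequential importance resampling) particle filter for a toy one-dimensional nonlinear state-space model. The model, parameters, and function names are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_particle_filter(observations, n_particles=1000,
                              process_std=0.5, obs_std=0.3):
    """Bootstrap particle filter for the illustrative model
        x_t = 0.9 * x_{t-1} + process noise
        y_t = sin(x_t)      + observation noise.
    Returns the posterior-mean estimate of x_t at each step."""
    particles = rng.normal(0.0, 1.0, n_particles)  # samples from the prior
    estimates = []
    for y in observations:
        # 1. Propagate each particle through the (nonlinear) dynamics.
        particles = 0.9 * particles + rng.normal(0, process_std, n_particles)
        # 2. Weight particles by the likelihood of the observation.
        weights = np.exp(-0.5 * ((y - np.sin(particles)) / obs_std) ** 2)
        weights /= weights.sum()
        # 3. Posterior-mean estimate, then multinomial resampling.
        estimates.append(np.dot(weights, particles))
        particles = rng.choice(particles, size=n_particles, p=weights)
    return np.array(estimates)

# Simulate a short trajectory and filter it.
true_x, ys = [], []
x = 0.0
for _ in range(50):
    x = 0.9 * x + rng.normal(0, 0.5)
    true_x.append(x)
    ys.append(np.sin(x) + rng.normal(0, 0.3))

est = bootstrap_particle_filter(ys)
```

The resampling step (3) is what distinguishes the bootstrap filter from plain importance sampling: it concentrates particles in high-likelihood regions, at the cost of the weight-degeneracy and dimensionality issues discussed in the references below.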

## 3 Citations

Neural Kalman Filtering

- Computer Science, ArXiv
- 2021

It is shown that a gradient-descent approximation to the Kalman filter requires only local computations with variance weighted prediction errors, and that it is possible under the same scheme to adaptively learn the dynamics model with a learning rule that corresponds directly to Hebbian plasticity.
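A scalar sketch of that idea, under assumed toy values: instead of the closed-form Kalman gain, the estimate is updated by gradient descent on the quadratic filtering objective, so each step moves the estimate along variance-weighted prediction errors. Both function names and parameters are hypothetical:

```python
def kalman_update_exact(x_pred, p_pred, y, r):
    """Closed-form scalar Kalman measurement update."""
    k = p_pred / (p_pred + r)          # Kalman gain
    return x_pred + k * (y - x_pred)

def kalman_update_gd(x_pred, p_pred, y, r, lr=0.1, steps=200):
    """Gradient descent on the same quadratic objective
        J(x) = (x - x_pred)^2 / (2 p_pred) + (y - x)^2 / (2 r).
    Each step is a sum of variance-weighted prediction errors,
    i.e. the 'local computation' the citation describes."""
    x = x_pred
    for _ in range(steps):
        grad = (x - x_pred) / p_pred - (y - x) / r
        x -= lr * grad
    return x

x_exact = kalman_update_exact(0.0, 1.0, 2.0, 0.5)  # ≈ 1.3333
x_gd = kalman_update_gd(0.0, 1.0, 2.0, 0.5)        # converges to the same value
```

With these values the gradient map is a contraction, so the iterate converges to the exact posterior mean; the cited work's contribution is showing such updates can be computed with local, biologically plausible operations.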

Accelerated Physical Emulation of Bayesian Inference in Spiking Neural Networks

- Computer Science, Front. Neurosci.
- 2019

This work presents a spiking network model that performs Bayesian inference through sampling on the BrainScaleS neuromorphic platform, where it is used for generative and discriminative computations on visual data and implicitly demonstrates its robustness to various substrate-specific distortive effects.

A neurally plausible model for online recognition and postdiction

- Biology, bioRxiv
- 2019

A general framework for neural probabilistic inference in dynamic models based on the distributed distributional code (DDC) representation of uncertainty is proposed, naturally extending the underlying encoding to incorporate implicit probabilistic beliefs about both the present and the past.

## References

Showing 1–10 of 45 references.

A Neural Implementation of the Kalman Filter

- Computer Science, NIPS
- 2009

This paper focuses on the Bayesian filtering of stochastic time series and introduces a novel neural network, derived from a line attractor architecture, whose dynamics map directly onto those of the Kalman filter in the limit of small prediction error.

Implementing a Bayes Filter in a Neural Circuit: The Case of Unknown Stimulus Dynamics

- Computer Science, Neural Computation
- 2017

This note presents a method for training a theoretical neural circuit to approximately implement a Bayes filter when the stimulus dynamics are unknown. The method performs stochastic gradient descent on the negative log-likelihood of the neural network parameters, using a novel approximation of the gradient.

Fast Sampling-Based Inference in Balanced Neuronal Networks

- Computer Science, NIPS
- 2014

This work shows analytically and through simulations that the symmetry of the synaptic weight matrix implied by LS yields critically slow mixing when the posterior is high-dimensional, and constructs and inspects networks that are optimally fast, and hence orders of magnitude faster than LS, while being far more biologically plausible.

Bayesian Inference and Online Learning in Poisson Neuronal Networks

- Computer Science, Biology, Neural Computation
- 2016

This work shows how a two-layer recurrent network of Poisson neurons can perform both approximate Bayesian inference and learning for any hidden Markov model, and demonstrates how the network can learn the likelihood model, as well as the transition probabilities underlying the dynamics, using a Hebbian learning rule.

Optimal Sensorimotor Integration in Recurrent Cortical Networks: A Neural Implementation of Kalman Filters

- Biology, Computer Science, The Journal of Neuroscience
- 2007

It is proposed that the neural implementation of this Kalman filter involves recurrent basis function networks with attractor dynamics, a kind of architecture that can be readily mapped onto cortical circuits.

Ensembles of Spiking Neurons with Noise Support Optimal Probabilistic Inference in a Dynamically Changing Environment

- Computer Science, PLoS Comput. Biol.
- 2014

The viability of this new approach towards neural coding and computation, which makes use of the inherent parallelism of generic neural circuits, is demonstrated by showing that this model can explain experimentally observed firing activity of cortical neurons for a variety of tasks that require rapid temporal integration of sensory information.

Exact Inferences in a Neural Implementation of a Hidden Markov Model

- Mathematics, Neural Computation
- 2007

From first principles, we derive a quadratic nonlinear, first-order dynamical system capable of performing exact Bayes-Markov inferences for a wide class of biologically plausible stimulus-dependent…

A Tutorial on Particle Filtering and Smoothing: Fifteen years later

- Computer Science
- 2008

A complete, up-to-date survey of particle filtering methods as of 2008, including basic and advanced particle methods for filtering as well as smoothing.

Curse of dimensionality and particle filters

- Computer Science, 2003 IEEE Aerospace Conference Proceedings (Cat. No.03TH8652)
- 2003

A simple back-of-the-envelope formula is derived that explains why a carefully designed PF should mitigate the curse of dimensionality for certain filtering problems, but also why the PF does not avoid the curse of dimensionality in general.

Recursive neural filters and dynamical range transformers

- Computer Science, Proceedings of the IEEE
- 2004

This paper reviews and presents new results on recursive neural filters, introducing filters based on a multilayer perceptron with output feedback, and proposes dynamical range reducers and extenders for applications where the range of the measurement or signal process expands over time or is too large for the recurrent neural network to handle at the required filtering resolution.