The Discriminative Kalman Filter for Bayesian Filtering with Nonlinear and Nongaussian Observation Models

  • Michael C. Burkhart, David M. Brandman, Brian Franco, Leigh R. Hochberg, Matthew T. Harrison
  • Neural Computation
The Kalman filter provides a simple and efficient algorithm to compute the posterior distribution for state-space models in which both the latent state and measurement models are linear and Gaussian. Extensions to the Kalman filter, including the extended and unscented Kalman filters, incorporate linearizations for models where the observation model p(observation|state) is nonlinear. We argue that in many cases a model for p(state|observation) proves both easier to learn and more accurate for…
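Concretely, the DKF keeps the linear-Gaussian prediction step but replaces the generative measurement update with a learned Gaussian approximation p(x_t|z_t) ≈ N(f(z_t), Q(z_t)). A minimal NumPy sketch of one recursion step, assuming a state model x_t ~ N(A x_{t-1}, Γ) with stationary marginal N(0, S); the function name `dkf_step` and its interface are illustrative, not the authors' code:

```python
import numpy as np

def dkf_step(m, P, f_z, Q_z, A, Gamma, S):
    """One Discriminative Kalman Filter step (sketch).

    State model: x_t ~ N(A x_{t-1}, Gamma), stationary marginal N(0, S).
    A discriminative regression supplies p(x_t | z_t) ~= N(f_z, Q_z).
    """
    # Predict under the linear-Gaussian state model.
    m_pred = A @ m
    P_pred = A @ P @ A.T + Gamma
    # Combine the prediction with the discriminative Gaussian, dividing
    # out the stationary marginal N(0, S) (Bayes' rule in precision form).
    # Note: the paper treats the case where this precision is not
    # positive definite; this sketch assumes it is.
    P_inv = np.linalg.inv(P_pred) + np.linalg.inv(Q_z) - np.linalg.inv(S)
    P_new = np.linalg.inv(P_inv)
    m_new = P_new @ (np.linalg.inv(P_pred) @ m_pred + np.linalg.inv(Q_z) @ f_z)
    return m_new, P_new
```

In practice f(z) and Q(z) would come from a regression model (e.g., a neural network or Gaussian process) trained on paired state/observation data, which is the discriminative learning step the abstract describes.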
Discriminative Bayesian Filtering Lends Momentum to the Stochastic Newton Method for Minimizing Log-Convex Functions
This work establishes matrix-based conditions under which the effect of older observations diminishes over time, in a manner analogous to Polyak’s heavy ball momentum, and develops a novel optimization algorithm that considers the entire history of gradients and Hessians when forming an update.
Bayesian Decoder Models with a Discriminative Observation Process
This work builds a discriminative model to estimate state processes as a function of current and previous observations of neural activity and demonstrates how a dynamical auto-encoder can be built using the direct decoder model; here, the underlying state process links the high-dimensional neural activity to the behavioral readout.
Direct Discriminative Decoder Models for Analysis of High-Dimensional Dynamical Neural Data
A novel, scalable latent process model that can directly estimate cognitive process dynamics without requiring precise receptive field models of individual neurons or brain nodes is proposed, and an extension of these methods, called the discriminative-generative decoder (DGD), is introduced.
Deep Discriminative Direct Decoders for High-dimensional Time-series Analysis
It is proposed how DNN parameters, along with an optimal history term, can be simultaneously estimated as part of the DDD model, and the D4 decoding performance is demonstrated in both simulation and (relatively) high-dimensional neural data.
Learning active tactile perception through belief-space control
This work proposes a method for autonomously learning active tactile perception policies by learning a generative world model that leverages a differentiable Bayesian filtering algorithm and by designing an information-gathering model predictive controller.


A Discriminative Approach to Bayesian Filtering with Applications to Human Neural Decoding
It is argued there are many cases where the distribution of state given measurement is better approximated as Gaussian, especially when the dimensionality of the measurements far exceeds that of the states and the Bernstein–von Mises theorem applies.
Sigma-point Kalman filters for probabilistic inference in dynamic state-space models
This work has consistently shown that there are large performance benefits to be gained by applying Sigma-Point Kalman filters to areas where EKFs have been used as the de facto standard in the past, as well as in new areas where the use of the EKF is impossible.
The unscented Kalman filter for nonlinear estimation
  • E. Wan, R. Van Der Merwe
  • Mathematics
    Proceedings of the IEEE 2000 Adaptive Systems for Signal Processing, Communications, and Control Symposium (Cat. No.00EX373)
  • 2000
This paper points out the flaws in using the extended Kalman filter (EKF) and introduces an improvement, the unscented Kalman filter (UKF), proposed by Julier and Uhlmann (1997).
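The core of the UKF is the unscented transform: a small deterministic set of sigma points carries the mean and covariance through the nonlinearity instead of linearizing it. A hedged NumPy sketch (the function name and parameter defaults are illustrative, not taken from the paper):

```python
import numpy as np

def unscented_transform(m, P, f, alpha=1e-3, beta=2.0, kappa=0.0):
    """Propagate N(m, P) through a nonlinearity f via 2n+1 sigma points (sketch)."""
    n = len(m)
    lam = alpha ** 2 * (n + kappa) - n
    L = np.linalg.cholesky((n + lam) * P)       # columns set the sigma-point spread
    sigmas = np.vstack([m, m + L.T, m - L.T])   # mean point plus symmetric pairs
    Wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    Wc = Wm.copy()
    Wm[0] = lam / (n + lam)
    Wc[0] = Wm[0] + (1.0 - alpha ** 2 + beta)
    Y = np.array([f(s) for s in sigmas])        # push each point through f
    mean = Wm @ Y
    cov = (Y - mean).T @ np.diag(Wc) @ (Y - mean)
    return mean, cov
```

A quick sanity check: for a linear (e.g., identity) f, the transform recovers the input mean and covariance exactly.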
The Kalman Laplace filter: A new deterministic algorithm for nonlinear Bayesian filtering
A new recursive algorithm for nonlinear Bayesian filtering, in which the prediction step is performed as in the extended Kalman filter and the update step uses the Laplace method for integral approximation; the resulting method is called the Kalman Laplace filter (KLF).
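The Laplace-style update amounts to finding the posterior mode and using the inverse Hessian of the negative log posterior as the Gaussian covariance. A generic sketch of that idea (not the KLF authors' code; `laplace_update` is an illustrative name, and the caller supplies the gradient and Hessian):

```python
import numpy as np

def laplace_update(neg_log_post_grad_hess, x0, n_iter=20):
    """Gaussian (Laplace) approximation to a posterior (sketch).

    Newton-iterates to the mode of the posterior, then takes the inverse
    Hessian of the negative log posterior there as the covariance.
    """
    x = x0
    for _ in range(n_iter):
        g, H = neg_log_post_grad_hess(x)
        x = x - np.linalg.solve(H, g)   # Newton step toward the mode
    _, H = neg_log_post_grad_hess(x)
    return x, np.linalg.inv(H)
```

For an exactly Gaussian posterior the Newton iteration converges in one step and the approximation is exact, which makes a convenient correctness check.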
New extension of the Kalman filter to nonlinear systems
It is argued that the ease of implementation and more accurate estimation features of the new filter recommend its use over the EKF in virtually all applications.
Approximate Methods for State-Space Models
A nonlinear filter for nonlinear/non-Gaussian state-space models, which uses Laplace's method, an asymptotic series expansion, to approximate the state’s conditional mean and variance, together with a Gaussian conditional distribution, and is stable over time.
A tutorial on particle filters for online nonlinear/non-Gaussian Bayesian tracking
Both optimal and suboptimal Bayesian algorithms for nonlinear/non-Gaussian tracking problems are reviewed, with a focus on particle filters.
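The bootstrap filter covered in such tutorials propagates samples through the state transition, reweights them by the observation likelihood, and resamples to combat weight degeneracy. A minimal one-step sketch (names and interface illustrative):

```python
import numpy as np

def bootstrap_pf_step(particles, weights, transition, loglik, rng):
    """One bootstrap particle filter step (sketch): propagate, reweight, resample."""
    particles = transition(particles, rng)        # sample x_t ~ p(x_t | x_{t-1})
    logw = np.log(weights) + loglik(particles)    # weight by p(z_t | x_t)
    logw -= logw.max()                            # stabilize before exponentiating
    weights = np.exp(logw)
    weights /= weights.sum()
    # Multinomial resampling: duplicate high-weight particles, drop low-weight ones.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))
```

With a Gaussian random-walk transition and Gaussian likelihood this reduces to a noisy approximation of the Kalman update, which is a common way to validate a particle filter implementation.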
Bayesian Filtering and Smoothing
  • S. Särkkä
  • Computer Science
    Institute of Mathematical Statistics textbooks
  • 2013
This compact, informal introduction for graduate students and advanced undergraduates presents current state-of-the-art filtering and smoothing methods in a unified Bayesian framework, covering what non-linear Kalman filters and particle filters are, how they are related, and their relative advantages and disadvantages.
Adaptive sampling with the ensemble transform Kalman filter
The ETKF technique is used by the National Centers for Environmental Prediction in the Winter Storm Reconnaissance missions of 1999 and 2000 to determine where aircraft should deploy dropwindsondes in order to improve 24–72 h forecasts over the continental United States.
Discrete-Time Nonlinear Filtering Algorithms Using Gauss–Hermite Quadrature
A new version of the quadrature Kalman filter (QKF) is developed theoretically and tested experimentally, exhibiting a significant improvement over other nonlinear filtering approaches, namely the basic bootstrap (particle) filters and Gaussian-sum extended Kalman filters, in solving nonlinear non-Gaussian filtering problems.
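The QKF replaces the Gaussian moment integrals with Gauss–Hermite quadrature rules. In one dimension, the underlying approximation E[f(x)] ≈ Σᵢ wᵢ f(μ + σ ξᵢ) for x ~ N(μ, σ²) can be sketched as follows (illustrative, not the paper's implementation):

```python
import numpy as np

def gh_expectation(f, mean, var, n_points=10):
    """E[f(x)] for x ~ N(mean, var) via Gauss-Hermite quadrature (1-D sketch).

    f must accept a NumPy array of evaluation points.
    """
    # hermegauss gives nodes/weights for the weight function exp(-x^2 / 2);
    # dividing by sqrt(2*pi) normalizes it to the standard normal density.
    nodes, weights = np.polynomial.hermite_e.hermegauss(n_points)
    x = mean + np.sqrt(var) * nodes
    return (weights @ f(x)) / np.sqrt(2 * np.pi)
```

The rule is exact for polynomials up to degree 2·n_points − 1, e.g., it recovers E[x²] = 1 exactly for a standard normal.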