On-Line Learning of Linear Dynamical Systems: Exponential Forgetting in Kalman Filters

@article{Kozdoba2019OnLineLO,
  title={On-Line Learning of Linear Dynamical Systems: Exponential Forgetting in Kalman Filters},
  author={Mark Kozdoba and Jakub Marecek and Tigran T. Tchrakian and Shie Mannor},
  journal={ArXiv},
  year={2019},
  volume={abs/1809.05870}
}
The Kalman filter is a key tool for time-series forecasting and analysis. […]

Key Method
Based on this insight, we devise an on-line algorithm for improper learning of a linear dynamical system (LDS), which considers only a few most recent observations. We use our decay results to provide the first regret bounds w.r.t. Kalman filters in learning an LDS. That is, we compare the results of our algorithm to the best, in hindsight, Kalman filter for a given signal. Also, the algorithm is practical: its per…
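A minimal sketch of the key idea, under assumed toy parameters (this is not the paper's exact algorithm): because the Kalman filter's dependence on old observations decays exponentially when the process noise is non-degenerate, one can predict on-line with a regression over only the last few observations, updated incrementally, so each update touches only the short window. The toy scalar LDS, regression depth d, and learning rate below are illustrative assumptions.

# Sketch only: improper on-line learning of an LDS via autoregression on recent
# observations. The toy model, depth d, and learning rate are assumed values.
import numpy as np

rng = np.random.default_rng(0)

# Toy scalar LDS: x_{t+1} = a*x_t + w_t,  y_t = x_t + v_t,
# with non-degenerate process noise (q > 0), the regime where exponential forgetting holds.
a, q, r, T = 0.9, 0.1, 0.5, 2000
x, ys = 0.0, []
for _ in range(T):
    x = a * x + rng.normal(scale=np.sqrt(q))
    ys.append(x + rng.normal(scale=np.sqrt(r)))
ys = np.array(ys)

# On-line autoregressive predictor y_hat_t = <theta, (y_{t-1}, ..., y_{t-d})>,
# updated by online gradient descent on the squared prediction error.
d, lr = 5, 0.01
theta = np.zeros(d)
total_sq_err = 0.0
for t in range(d, T):
    window = ys[t - d:t][::-1]        # most recent observation first
    y_hat = theta @ window
    err = ys[t] - y_hat
    total_sq_err += err ** 2
    theta += lr * err * window        # O(d) work per update
print("mean squared prediction error:", total_sq_err / (T - d))

In the paper, such a regression-based predictor is compared, in terms of regret, against the best Kalman filter chosen in hindsight for the signal; the sketch above only reports its empirical squared error on simulated data.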

Citations

Robust guarantees for learning an autoregressive filter
TLDR
This work takes the approach of directly learning an autoregressive filter for time-series prediction under unknown dynamics, and uses an $L^\infty$-based objective rather than ordinary least-squares for the filter under worst case input.
SLIP: Learning to Predict in Unknown Dynamical Systems with Long-Term Memory
TLDR
An efficient and practical (polynomial-time) algorithm is presented for online prediction in unknown and partially observed linear dynamical systems (LDS) under stochastic noise; its regret analysis relies on Mendelson's small-ball method, providing sharp error bounds without concentration, boundedness, or exponential forgetting assumptions.
Online Learning of the Kalman Filter with Logarithmic Regret
TLDR
This work is the first to provide logarithmic regret guarantees for the widely used Kalman filter, and shows that it is possible to achieve a regret of the order of $\mathrm{poly}\log(N)$ with high probability, where $N$ is the number of observations collected.
Sample Complexity of Kalman Filtering for Unknown Systems
TLDR
It is shown that when the system identification step produces sufficiently accurate estimates, or when the underlying true KF is sufficiently robust, a Certainty Equivalent (CE) KF, i.e., one designed using the estimated parameters directly, enjoys provable sub-optimality guarantees.
Finite Sample Analysis of Stochastic System Identification
TLDR
This analysis uses recent results from random matrix theory, self-normalized martingales and SVD robustness, in order to show that with high probability the estimation errors decrease with a rate of $1/\sqrt N$ up to logarithmic terms.
Variance Estimation For Online Regression via Spectrum Thresholding
TLDR
The global system operator, i.e., the operator that maps the noise vectors to the output, is studied, and as a result the first known variance estimators with finite-sample-complexity guarantees are derived.
Improved rates for identification of partially observed linear dynamical systems
TLDR
This work develops the first algorithm that, given a single trajectory of length $T$ with Gaussian observation noise, achieves a near-optimal rate of $\widetilde{O}\left(\sqrt{\frac{d}{T}}\right)$ in $\mathcal{H}_2$ error for the learned system.
Learning Linear Models Using Distributed Iterative Hessian Sketching
TLDR
It is shown that a randomized and distributed Newton algorithm based on Hessian sketching produces optimal solutions, converges geometrically, and is trivially parallelizable.
Online prediction of time series with assumed behavior
Linear Systems can be Hard to Learn
TLDR
This paper analyzes when system identification is statistically easy or hard, in the finite sample regime, and shows that the sample complexity of robustly controllable linear systems is upper bounded by an exponential function of the controllability index.

References

SHOWING 1-10 OF 21 REFERENCES
Kalman Filtering with Real-time Applications
TLDR
Kalman Filtering with Real-Time Applications presents a thorough discussion of the mathematical theory and computational schemes of Kalman filtering, including a direct method consisting of a series of elementary steps, and an indirect method based on innovation projection.
Exponential smoothing: The state of the art
TLDR
A critical review of exponential smoothing since the original work by Brown and Holt in the 1950s is shared, which concludes that the parameter ranges and starting values typically used in practice are arbitrary and may detract from accuracy.
Online ARIMA Algorithms for Time Series Prediction
TLDR
This paper proposes online learning algorithms for estimating ARIMA models under relaxed assumptions on the noise terms, which is suitable to a wider range of applications and enjoys high computational efficiency.
A Unifying Review of Linear Gaussian Models
TLDR
A new model for static data, known as sensible principal component analysis, is introduced, together with a novel concept of spatially adaptive observation noise, and it is shown how independent component analysis is also a variation of the same basic generative model.
Online Instrumental Variable Regression with Applications to Online Linear System Identification
TLDR
This work develops Online Instrumental Variable Regression (OIVR), an algorithm that is capable of updating the learned estimator with streaming data and demonstrates the efficacy of the algorithm in combination with popular no-regret online algorithms for the task of learning predictive dynamical system models and on a prototypical econometrics instrumental variable regression problem.
On optimal ℓ∞ to ℓ∞ filtering
Online Learning of Linear Dynamical Systems
TLDR
This work presents an efficient and practical algorithm for the online prediction of discrete-time linear dynamical systems that has near-optimal regret bounds compared to the best LDS in hindsight, while overparameterizing by only a small logarithmic factor.
Online Learning for Time Series Prediction
TLDR
This work develops effective online learning algorithms for the problem of predicting a time series using the ARMA (autoregressive moving average) model, without assuming that the noise terms are Gaussian, identically distributed or even independent.
Bayesian Forecasting And Dynamic Models
Subspace Identification for Linear Systems: Theory ― Implementation ― Applications
TLDR
This book focuses on the theory, implementation and applications of subspace identification algorithms for linear time-invariant finite-dimensional dynamical systems, which allow for a fast, straightforward and accurate determination of linear multivariable models from measured input-output data.