Closed-form Inference and Prediction in Gaussian Process State-Space Models
@article{Ialongo2018ClosedformIA,
  title   = {Closed-form Inference and Prediction in Gaussian Process State-Space Models},
  author  = {Alessandro Davide Ialongo and Mark van der Wilk and Carl Edward Rasmussen},
  journal = {ArXiv},
  year    = {2018},
  volume  = {abs/1812.03580}
}
We examine an analytic variational inference scheme for the Gaussian Process State Space Model (GPSSM) - a probabilistic model for system identification and time-series modelling. Our approach performs variational inference over both the system states and the transition function. We exploit Markov structure in the true posterior, as well as an inducing point approximation to achieve linear time complexity in the length of the time series. Contrary to previous approaches, no Monte Carlo sampling…
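The linear-time claim rests on the Markov factorization of the variational posterior over states, q(x_{0:T}) = q(x_0) ∏_t q(x_t | x_{t-1}): with Gaussian conditionals, every state marginal follows from a single forward recursion. A minimal sketch of that recursion (the linear-Gaussian parameterization below is illustrative, not the paper's exact construction):

```python
import numpy as np

def markov_marginals(m0, P0, A, b, Q):
    """Forward pass through q(x_t | x_{t-1}) = N(A_t x_{t-1} + b_t, Q_t).

    Returns the marginal mean and covariance of every state in O(T) time,
    which is what the Markov factorization buys over a joint Gaussian
    posterior with a dense T x T covariance.
    """
    means, covs = [m0], [P0]
    for A_t, b_t, Q_t in zip(A, b, Q):
        m = A_t @ means[-1] + b_t              # propagate the mean
        P = A_t @ covs[-1] @ A_t.T + Q_t       # propagate the covariance
        means.append(m)
        covs.append(P)
    return means, covs
```

Each step costs O(d^3) in the state dimension d, so T steps cost O(T d^3), i.e. linear in the length of the series.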
8 Citations
Overcoming Mean-Field Approximations in Recurrent Gaussian Process Models
- Computer Science, ICML
- 2019
A new variational inference scheme for dynamical systems whose transition function is modelled by a Gaussian process, which gives better predictive performance and more calibrated estimates of the transition function, yet maintains the same time and space complexities as mean-field methods.
Active Learning in Gaussian Process State Space Model
- Computer Science, ECML/PKDD
- 2021
The problem is to actively steer the system through its latent states by choosing inputs such that the underlying dynamics can be optimally learned by a Gaussian process state-space model.
Non-Factorised Variational Inference in Dynamical Systems
- Computer Science, ArXiv
- 2018
This work focuses on variational inference in dynamical systems where the discrete time transition function (or evolution rule) is modelled by a Gaussian process and proposes a new method that addresses these issues and incurs no additional computational costs.
Learning While Tracking: A Practical System Based on Variational Gaussian Process State-Space Model and Smartphone Sensory Data
- Computer Science, 2020 IEEE 23rd International Conference on Information Fusion (FUSION)
- 2020
Experimental results obtained from a real office environment validate the outstanding performance of the variational GPSSM in comparison with the traditional parametric state-space model in terms of tracking accuracy.
Bayesian Hidden Physics Models: Uncertainty Quantification for Discovery of Nonlinear Partial Differential Operators from Data
- Computer Science, ArXiv
- 2020
This work introduces a novel model comprising "leaf" modules that learn to represent distinct experiments' spatiotemporal functional data as neural networks and a single "root" module that expresses a nonparametric distribution over their governing nonlinear differential operator as a Gaussian process.
A Real Indoor Navigation System Based on Variational Gaussian Process State-Space Model and Smartphone Sensory Data
- Computer Science
- 2019
This work implements a wireless indoor navigation system based on the variational Gaussian process state-space model (GPSSM) using low-quality sensory data collected by smartphone, and adapts the existing variational GPSSM framework to practical wireless navigation scenarios.
FedLoc: Federated Learning Framework for Data-Driven Cooperative Localization and Location Data Processing
- Computer Science, IEEE Open Journal of Signal Processing
- 2020
Experimental results show that near-centralized data-fitting and prediction performance can be achieved by a set of collaborative mobile users running distributed algorithms.
FedLoc: Federated Learning Framework for Cooperative Localization and Location Data Processing
- Computer Science, ArXiv
- 2020
The obtained preliminary results confirm that the proposed FedLoc framework is well suited to data-driven, machine learning-based localization and spatio-temporal data modeling.
References
Variational Gaussian Process State-Space Models
- Computer Science, NIPS
- 2014
This work presents a procedure for efficient variational Bayesian learning of nonlinear state-space models based on sparse Gaussian processes and offers the possibility to straightforwardly trade off model capacity and computational cost whilst avoiding overfitting.
Gaussian Process Priors with Uncertain Inputs - Application to Multiple-Step Ahead Time Series Forecasting
- Mathematics, NIPS
- 2002
This paper shows how an analytical Gaussian approximation can formally incorporate the uncertainty about intermediate regressor values, thus propagating uncertainty into the current prediction in multi-step-ahead time-series forecasting.
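The referenced paper derives the predictive moments in closed form for RBF-kernel GPs when the test input is itself Gaussian. As an illustration of what those moments are, here is a plain Monte Carlo stand-in (the function name and interface are hypothetical; only the input uncertainty is propagated, through a fixed predictive-mean function):

```python
import numpy as np

def mc_predictive_moments(gp_mean, mu, var, n_samples=10_000, seed=0):
    """Approximate the mean and variance of f(x) when x ~ N(mu, var).

    The cited paper computes these moments analytically for RBF-kernel
    GPs; plain sampling stands in for that closed form here, ignoring
    the GP's own predictive variance for simplicity.
    """
    rng = np.random.default_rng(seed)
    xs = mu + np.sqrt(var) * rng.standard_normal(n_samples)
    fs = gp_mean(xs)
    return fs.mean(), fs.var()
```

Feeding these moments back in as the next input distribution gives a multi-step-ahead forecast that accounts for accumulated uncertainty, which is the scheme the paper makes analytic.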
On Sparse Variational Methods and the Kullback-Leibler Divergence between Stochastic Processes
- Computer Science, AISTATS
- 2016
A substantial generalization of the variational framework for learning inducing variables is given, together with a new proof of the result for infinite index sets that allows inducing points that are not data points and likelihoods that depend on all function values.
PILCO: A Model-Based and Data-Efficient Approach to Policy Search
- Computer Science, ICML
- 2011
PILCO reduces model bias, one of the key problems of model-based reinforcement learning, in a principled way by learning a probabilistic dynamics model and explicitly incorporating model uncertainty into long-term planning.
Variational Learning of Inducing Variables in Sparse Gaussian Processes
- Computer Science, AISTATS
- 2009
A variational formulation for sparse approximations is presented that jointly infers the inducing inputs and the kernel hyperparameters by maximizing a lower bound on the true log marginal likelihood.
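The lower bound from this reference (Titsias, 2009) has a well-known collapsed form: the exact marginal likelihood with the kernel replaced by its Nyström approximation, minus a trace penalty for what the inducing points fail to explain. A minimal NumPy sketch, assuming a generic kernel function k(A, B) -> matrix:

```python
import numpy as np

def titsias_elbo(X, y, Z, kernel, noise_var):
    """Collapsed variational lower bound on log p(y) for a sparse GP.

    ELBO = log N(y | 0, Q_nn + s^2 I) - tr(K_nn - Q_nn) / (2 s^2),
    with Q_nn = K_nm K_mm^{-1} K_mn. Maximizing this over Z and the
    kernel hyperparameters is the scheme the reference describes.
    Sketch only: no Cholesky reuse or numerical safeguards beyond jitter.
    """
    n = len(y)
    Knn_diag = np.diag(kernel(X, X))
    Kmm = kernel(Z, Z) + 1e-8 * np.eye(len(Z))   # jitter for stability
    Kmn = kernel(Z, X)
    Qnn = Kmn.T @ np.linalg.solve(Kmm, Kmn)       # Nystrom approximation
    S = Qnn + noise_var * np.eye(n)
    _, logdet = np.linalg.slogdet(S)
    fit = -0.5 * (n * np.log(2 * np.pi) + logdet + y @ np.linalg.solve(S, y))
    trace_term = (Knn_diag.sum() - np.trace(Qnn)) / (2 * noise_var)
    return fit - trace_term
```

When Z contains all training inputs, Q_nn equals K_nn, the trace penalty vanishes, and the bound recovers the exact log marginal likelihood; with fewer inducing points it is a strict lower bound.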
Nonlinear modelling and control using Gaussian processes
- PhD thesis
- 2014