Corpus ID: 54462057

Closed-form Inference and Prediction in Gaussian Process State-Space Models

Alessandro Davide Ialongo, Mark van der Wilk, Carl Edward Rasmussen
We examine an analytic variational inference scheme for the Gaussian Process State-Space Model (GPSSM), a probabilistic model for system identification and time-series modelling. Our approach performs variational inference over both the system states and the transition function. We exploit the Markov structure of the true posterior, together with an inducing-point approximation, to achieve linear time complexity in the length of the time series. Contrary to previous approaches, no Monte Carlo sampling…
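To make the model concrete, here is a minimal sketch of the Markov state-space structure the abstract refers to. The transition function `f` below is a fixed nonlinearity standing in for a GP-distributed transition, and all constants are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # Stand-in transition; in a GPSSM this function carries a GP prior.
    return 0.9 * x + 0.5 * np.sin(x)

T = 100
q, r = 0.05, 0.1          # assumed process and observation noise variances
x = np.zeros(T)           # latent states
y = np.zeros(T)           # observations
x[0] = rng.normal()
y[0] = x[0] + np.sqrt(r) * rng.normal()
for t in range(1, T):     # Markov structure: x_t depends only on x_{t-1},
    x[t] = f(x[t - 1]) + np.sqrt(q) * rng.normal()   # which is what the
    y[t] = x[t] + np.sqrt(r) * rng.normal()          # linear-time scheme exploits
```

The chain factorisation over `t` is exactly the structure that lets the inference scheme scale linearly in the series length.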

Overcoming Mean-Field Approximations in Recurrent Gaussian Process Models

A new variational inference scheme for dynamical systems whose transition function is modelled by a Gaussian process, which gives better predictive performance and more calibrated estimates of the transition function, yet maintains the same time and space complexities as mean-field methods.

Active Learning in Gaussian Process State Space Model

The problem is to actively steer the system through its latent states by choosing inputs such that the underlying dynamics can be optimally learned by a Gaussian process state-space model.

Non-Factorised Variational Inference in Dynamical Systems

This work focuses on variational inference in dynamical systems where the discrete-time transition function (or evolution rule) is modelled by a Gaussian process, and proposes a new method that addresses the shortcomings of factorised approximations while incurring no additional computational cost.

Learning While Tracking: A Practical System Based on Variational Gaussian Process State-Space Model and Smartphone Sensory Data

Experimental results obtained from a real office environment validate the outstanding performance of the variational GPSSM in comparison with the traditional parametric state-space model in terms of tracking accuracy.

Bayesian Hidden Physics Models: Uncertainty Quantification for Discovery of Nonlinear Partial Differential Operators from Data

This work introduces a novel model comprising "leaf" modules that learn to represent distinct experiments' spatiotemporal functional data as neural networks and a single "root" module that expresses a nonparametric distribution over their governing nonlinear differential operator as a Gaussian process.

A Real Indoor Navigation System Based on Variational Gaussian Process State-Space Model and Smartphone Sensory Data

This work implements a wireless indoor navigation system based on the variational Gaussian process state-space model (GPSSM) with low quality sensory data collected by smartphone, and adapts the existing variational GPSSM framework to practical wireless navigation scenarios.

FedLoc: Federated Learning Framework for Data-Driven Cooperative Localization and Location Data Processing

Experimental results show that near-centralized data-fitting and prediction performance can be achieved by a set of collaborating mobile users running distributed algorithms.

Variational Gaussian Process State-Space Models

This work presents a procedure for efficient variational Bayesian learning of nonlinear state-space models based on sparse Gaussian processes and offers the possibility to straightforwardly trade off model capacity and computational cost whilst avoiding overfitting.

Gaussian Process Priors with Uncertain Inputs - Application to Multiple-Step Ahead Time Series Forecasting

This paper shows how an analytical Gaussian approximation can formally incorporate the uncertainty about intermediate regressor values, thus propagating uncertainty through successive predictions in multi-step-ahead time-series forecasting.
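The idea of treating each prediction as an uncertain input to the next step can be sketched as follows. The one-step predictor and all constants here are hypothetical, and the Gaussian moments are estimated by sampling as a Monte Carlo stand-in for the analytic moment matching derived in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def step(x):
    # Hypothetical one-step predictive mean (stand-in for a GP predictor).
    return 0.8 * x + 0.3 * np.sin(2 * x)

# Propagate a Gaussian input N(mu, var) through k steps: at each step,
# approximate the output distribution by a Gaussian with the sampled
# mean and variance, then feed that Gaussian into the next step.
mu, var = 0.5, 0.04
for _ in range(5):
    xs = rng.normal(mu, np.sqrt(var), size=10_000)
    ys = step(xs) + np.sqrt(0.01) * rng.normal(size=10_000)  # predictive noise
    mu, var = ys.mean(), ys.var()
```

A naive plug-in approach would instead feed only `mu` forward and systematically underestimate the multi-step predictive variance.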

On Sparse Variational Methods and the Kullback-Leibler Divergence between Stochastic Processes

A substantial generalization of the variational framework for learning inducing variables is given, with a new proof for infinite index sets that allows inducing points that are not data points and likelihoods that depend on all function values.

PILCO: A Model-Based and Data-Efficient Approach to Policy Search

PILCO reduces model bias, one of the key problems of model-based reinforcement learning, in a principled way by learning a probabilistic dynamics model and explicitly incorporating model uncertainty into long-term planning.
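The model-based policy-search loop PILCO instantiates can be caricatured as below. This is only a toy sketch: a least-squares linear model stands in for PILCO's probabilistic GP dynamics model, a grid search stands in for gradient-based policy optimisation, and no model uncertainty is propagated; all names and constants are my own, not the PILCO API.

```python
import numpy as np

rng = np.random.default_rng(2)

def true_dynamics(x, u):
    # Unknown system the agent interacts with (assumed toy dynamics).
    return 0.9 * x + 0.2 * u + 0.05 * rng.normal()

def rollout_cost(theta, model, x0=1.0, horizon=20):
    # Evaluate a linear policy u = -theta * x on the learned model.
    x, cost = x0, 0.0
    for _ in range(horizon):
        x = model(x, -theta * x)
        cost += x ** 2
    return cost

# 1) Collect interaction data.
data, x = [], 1.0
for _ in range(200):
    u = rng.normal()
    x_next = true_dynamics(x, u)
    data.append((x, u, x_next))
    x = x_next

# 2) Fit a dynamics model (least squares here; a GP in PILCO, whose
#    uncertainty would be carried through the planning rollouts).
A = np.array([(xi, ui) for xi, ui, _ in data])
b = np.array([xn for _, _, xn in data])
coef, *_ = np.linalg.lstsq(A, b, rcond=None)
model = lambda x, u: coef[0] * x + coef[1] * u

# 3) Improve the policy against the model (grid search as a stand-in).
thetas = np.linspace(0.0, 6.0, 41)
best = min(thetas, key=lambda th: rollout_cost(th, model))
```

The point of PILCO's probabilistic model is that step 3 optimises the *expected* long-term cost under model uncertainty, which is what reduces model bias.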

Variational Learning of Inducing Variables in Sparse Gaussian Processes

A variational formulation for sparse approximations is introduced that jointly infers the inducing inputs and the kernel hyperparameters by maximizing a lower bound of the true log marginal likelihood.
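The collapsed lower bound from this reference can be written down compactly for sparse GP regression. The sketch below assumes an RBF kernel with unit signal variance; function names and the jitter constant are my own choices, not from the paper.

```python
import numpy as np

def rbf(A, B, ls=1.0, sf2=1.0):
    # Squared-exponential kernel between row-vector inputs A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return sf2 * np.exp(-0.5 * d2 / ls ** 2)

def titsias_bound(X, y, Z, noise=0.1):
    # Collapsed variational lower bound for sparse GP regression:
    #   log N(y | 0, Q + s2*I) - tr(Knn - Q) / (2*s2),
    # with Q = Knm Kmm^{-1} Kmn the Nystrom approximation built from
    # inducing inputs Z. The trace term penalises poorly placed Z.
    n = X.shape[0]
    s2 = noise ** 2
    Kmm = rbf(Z, Z) + 1e-8 * np.eye(len(Z))   # jitter for stability
    Knm = rbf(X, Z)
    Q = Knm @ np.linalg.solve(Kmm, Knm.T)
    L = np.linalg.cholesky(Q + s2 * np.eye(n))
    alpha = np.linalg.solve(L, y)
    logdet = 2 * np.log(np.diag(L)).sum()
    ll = -0.5 * (n * np.log(2 * np.pi) + logdet + alpha @ alpha)
    trace_term = (n * 1.0 - np.trace(Q)) / (2 * s2)  # diag(Knn) = sf2 = 1
    return ll - trace_term
```

Maximising this bound over `Z` and the kernel hyperparameters never exceeds the true log marginal likelihood, which is what makes joint inference of the inducing inputs well behaved.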

Nonlinear modelling and control using Gaussian processes

  • PhD thesis, 2014