Corpus ID: 232290829

Sparse Algorithms for Markovian Gaussian Processes

@article{Wilkinson2021SparseAF,
  title={Sparse Algorithms for Markovian Gaussian Processes},
  author={William J. Wilkinson and Arno Solin and Vincent Adam},
  journal={ArXiv},
  year={2021},
  volume={abs/2103.10710}
}
Approximate Bayesian inference methods that scale to very large datasets are crucial in leveraging probabilistic models for real-world time series. Sparse Markovian Gaussian processes combine the use of inducing variables with efficient Kalman filter-like recursions, resulting in algorithms whose computational and memory requirements scale linearly in the number of inducing points, whilst also enabling parallel parameter updates and stochastic optimisation. Under this paradigm, we derive a… 
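
To make the recursion concrete, the sketch below is illustrative rather than the authors' implementation: it assumes the simplest Markovian prior, a Matérn-1/2 (Ornstein-Uhlenbeck) kernel with a one-dimensional state, a conjugate Gaussian likelihood, and made-up function names. The kernel is written as a discrete-time linear-Gaussian model, and the log marginal likelihood follows from a Kalman-filter-style recursion whose computational and memory cost is linear in the number of time points.

```python
# Minimal sketch (assumptions: Matern-1/2 / OU kernel, scalar state, Gaussian
# likelihood; function names are illustrative, not from the paper's codebase).
import jax
import jax.numpy as jnp


def ou_discretise(dts, lengthscale, variance):
    """Discrete-time transition A_k and process noise Q_k for the OU prior."""
    A = jnp.exp(-dts / lengthscale)
    Q = variance * (1.0 - jnp.exp(-2.0 * dts / lengthscale))
    return A, Q


@jax.jit
def log_marginal_likelihood(t, y, lengthscale=1.0, variance=1.0, noise=0.1):
    """O(n) GP log marginal likelihood via a scalar-state Kalman filter."""
    A, Q = ou_discretise(jnp.diff(t), lengthscale, variance)
    # Prepend a dummy transition so the first step starts from the stationary prior.
    A = jnp.concatenate([jnp.zeros(1), A])
    Q = jnp.concatenate([jnp.full(1, variance), Q])

    def step(carry, inputs):
        m, P = carry
        A_k, Q_k, y_k = inputs
        # Predict one step ahead under the Markovian prior.
        m_pred, P_pred = A_k * m, A_k * P * A_k + Q_k
        # Conjugate (Gaussian) update and this datum's log-likelihood contribution.
        S = P_pred + noise
        K = P_pred / S
        ll = -0.5 * (jnp.log(2.0 * jnp.pi * S) + (y_k - m_pred) ** 2 / S)
        return (m_pred + K * (y_k - m_pred), (1.0 - K) * P_pred), ll

    init = (jnp.zeros((), y.dtype), jnp.zeros((), y.dtype))
    _, lls = jax.lax.scan(step, init, (A, Q, y))
    return jnp.sum(lls)
```

Roughly speaking, under the sparse scheme described in the abstract the same kind of recursion runs over an ordered set of inducing states rather than the full dataset, which is where the linear-in-the-number-of-inducing-points scaling comes from; higher-order Matérn kernels simply replace the scalar state with a small state vector and matrix-valued transitions.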

Citations

Dual Parameterization of Sparse Variational Gaussian Processes
TLDR
A dual parameterization, in which each data example is assigned dual parameters analogous to the site parameters used in expectation propagation, speeds up inference using natural gradient descent and provides a tighter evidence lower bound for hyperparameter learning.
Bayes-Newton Methods for Approximate Bayesian Inference with PSD Guarantees
TLDR
This work formulates natural gradient variational inference, expectation propagation, and posterior linearisation as extensions of Newton's method for optimising the parameters of a Bayesian posterior distribution, under the framework of numerical optimisation, and provides new insights into the connections between various inference schemes.
Spatio-Temporal Variational Gaussian Processes
TLDR
A sparse approximation is derived that constructs a state-space model over a reduced set of spatial inducing points, and it is shown that for separable Markov kernels the full and sparse cases exactly recover the standard variational GP, whilst exhibiting favourable computational properties.
Deep State-Space Gaussian Processes
TLDR
The proposed models and methods provide state-space MAP as well as Bayesian filtering and smoothing solutions to the deep GP (DGP) regression problem, and are applied to the detection of gravitational waves from LIGO measurements.

References

Showing 1-10 of 47 references
State Space Expectation Propagation: Efficient Inference Schemes for Temporal Gaussian Processes
TLDR
This work formulates approximate Bayesian inference in non-conjugate temporal and spatio-temporal Gaussian process models as a simple parameter update rule applied during Kalman smoothing, and provides extensive empirical analysis demonstrating the efficacy of various algorithms under this unifying framework.
Fast Variational Learning in State-Space Gaussian Process Models
TLDR
This paper provides an efficient JAX implementation which exploits just-in-time compilation, allows for fast automatic differentiation through large for-loops, and leads to fast and stable variational inference in state-space GP models that scales to time series with millions of data points.
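
In practice, "automatic differentiation through large for-loops" amounts to expressing the sequential part of inference with jax.lax.scan, so that jax.jit compiles the loop once and gradients flow through every step. The fragment below is a self-contained, illustrative stand-in (a one-step prediction error under an OU transition), not the paper's variational objective or code.

```python
# Illustrative only: a sequential recursion written with jax.lax.scan so that
# jax.jit compiles it once and jax.grad differentiates through all steps.
import jax
import jax.numpy as jnp


def sequential_objective(log_lengthscale, dts, y):
    # Per-step OU transition coefficients, exp(-dt / lengthscale).
    A = jnp.exp(-dts * jnp.exp(-log_lengthscale))

    def step(f_prev, inputs):
        A_k, y_k = inputs
        err = y_k - A_k * f_prev      # one-step-ahead prediction error
        return y_k, err ** 2          # carry the latest observation forward

    _, sq_errs = jax.lax.scan(step, y[0], (A, y[1:]))
    return jnp.sum(sq_errs)


# One compiled call returns the objective and its gradient, with reverse-mode
# autodiff propagated through the entire scan (the "large for-loop").
objective_and_grad = jax.jit(jax.value_and_grad(sequential_objective))
```

Calling objective_and_grad(0.0, jnp.diff(t), y) returns the objective value and its gradient with respect to the log-lengthscale in a single compiled pass.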
Spatio-Temporal Learning via Infinite-Dimensional Bayesian Filtering and Smoothing
TLDR
Methods are presented for converting spatio-temporal Gaussian process regression and classification problems into infinite-dimensional state-space models; this makes the use of machine learning models in signal processing computationally feasible and opens the possibility of combining machine learning techniques with signal processing methods.
Combining Pseudo-Point and State Space Approximations for Sum-Separable Gaussian Processes
TLDR
This work shows that there is a simple and elegant way to combine pseudo-point methods with the state space GP approximation framework to get the best of both worlds, and demonstrates empirically that the combined approach is more scalable and applicable to a greater range of spatio-temporal problems than either method on its own.
Variational Fourier Features for Gaussian Processes
TLDR
This work hinges on a key result that there exist spectral features related to a finite domain of the Gaussian process which exhibit almost-independent covariances; it derives these expressions for Matérn kernels in one dimension and generalizes them to more dimensions using kernels with specific structures.
Sparse Gaussian Processes using Pseudo-inputs
TLDR
It is shown that this new Gaussian process (GP) regression model can match full GP performance with a small number M of pseudo-inputs, i.e. very sparse solutions, and that it significantly outperforms other approaches in this regime.
State Space Gaussian Processes with Non-Gaussian Likelihood
TLDR
A comprehensive overview of, and tooling for, GP modeling with non-Gaussian likelihoods using state-space methods is presented, together with means of combining the efficient $\mathcal{O}(n)$ state-space methodology with existing inference methods.
Tree-structured Gaussian Process Approximations
TLDR
This paper devises an approximation whose complexity grows linearly with the number of pseudo-datapoints, calibrates the approximation using a Kullback-Leibler (KL) minimization, and demonstrates the validity of this approach on a set of challenging regression tasks including missing-data imputation for audio and spatial datasets.
Stochastic Expectation Propagation
TLDR
Stochastic expectation propagation (SEP) is presented, which maintains a global posterior approximation but updates it in a local way (like EP), and is ideally suited to performing approximate Bayesian learning in the large-model, large-dataset setting.
A Unifying Framework for Gaussian Process Pseudo-Point Approximations using Power Expectation Propagation
TLDR
This paper develops a new pseudo-point approximation framework using Power Expectation Propagation (Power EP) that unifies a large number of existing pseudo-point approximations, and demonstrates that the new framework includes new pseudo-point approximation methods that outperform current approaches on regression and classification tasks.
...