Corpus ID: 235415197

Combining Pseudo-Point and State Space Approximations for Sum-Separable Gaussian Processes

@inproceedings{Tebbutt2021CombiningPA,
  title={Combining Pseudo-Point and State Space Approximations for Sum-Separable Gaussian Processes},
  author={Will Tebbutt and Arno Solin and Richard E. Turner},
  booktitle={UAI},
  year={2021}
}
Gaussian processes (GPs) are important probabilistic tools for inference and learning in spatio-temporal modelling problems such as those in climate science and epidemiology. However, existing GP approximations do not simultaneously support large numbers of off-the-grid spatial data points and long time series, which is a hallmark of many applications. Pseudo-point approximations, one of the gold-standard methods for scaling GPs to large data sets, are well suited for handling off-the-grid…
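To make the title's structure concrete, below is a minimal NumPy sketch (not the authors' code) of a sum-separable spatio-temporal kernel: a sum of terms, each the product of a spatial kernel and a temporal kernel. All function names and hyperparameter values are illustrative, and the naive O(n^3) solve at the end is precisely the cost that combining pseudo-point and state-space approximations is designed to avoid.

# A minimal sketch of a sum-separable kernel:
# k((s,t),(s',t')) = sum_j k_space_j(s,s') * k_time_j(t,t').
import numpy as np

def rbf(x, y, lengthscale, variance=1.0):
    # Squared-exponential kernel between row-stacked inputs x (n,d) and y (m,d).
    sqdist = np.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=-1)
    return variance * np.exp(-0.5 * sqdist / lengthscale ** 2)

def sum_separable(S, T, S2, T2, terms):
    # Elementwise product of spatial and temporal Gram matrices, summed over terms.
    K = np.zeros((S.shape[0], S2.shape[0]))
    for ls_space, ls_time, var in terms:
        K += var * rbf(S, S2, ls_space) * rbf(T, T2, ls_time)
    return K

rng = np.random.default_rng(0)
n = 200
S = rng.uniform(0, 1, size=(n, 2))    # off-the-grid spatial locations
T = rng.uniform(0, 10, size=(n, 1))   # observation times
y = np.sin(T[:, 0]) + 0.1 * rng.standard_normal(n)

terms = [(0.3, 1.0, 1.0), (0.6, 3.0, 0.5)]     # illustrative hyperparameters
K = sum_separable(S, T, S, T, terms) + 0.1 * np.eye(n)
alpha = np.linalg.solve(K, y)                   # exact GP weights: O(n^3)
f_mean = (K - 0.1 * np.eye(n)) @ alpha          # posterior mean at the inputs
print("posterior mean (first 3):", f_mean[:3])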

Citations

Sparse Algorithms for Markovian Gaussian Processes
TLDR
This work derives a general site-based approach to approximate inference, whereby the non-Gaussian likelihood is approximated with local Gaussian terms, called sites, resulting in a suite of novel sparse extensions to algorithms from both the machine learning and signal processing literature, including variational inference, expectation propagation, and the classical nonlinear Kalman smoothers.
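For orientation, the "site" idea in this TLDR can be written in one line (standard EP/site notation, shown only for illustration): each non-Gaussian likelihood factor is replaced by a local Gaussian term, so the approximate posterior stays Gaussian and Kalman-style recursions still apply.

\[
  p(y_k \mid f_k) \;\approx\; t_k(f_k) = \mathcal{N}\!\left(f_k \mid \tilde{m}_k, \tilde{v}_k\right).
\]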
Spatio-Temporal Variational Gaussian Processes
TLDR
A sparse approximation is derived that constructs a state-space model over a reduced set of spatial inducing points, and it is shown that for separable Markov kernels the full and sparse cases exactly recover the standard variational GP, whilst exhibiting favourable computational properties.
Bayes-Newton Methods for Approximate Bayesian Inference with PSD Guarantees
TLDR
This work formulates natural-gradient variational inference, expectation propagation, and posterior linearisation as extensions of Newton's method for optimising the parameters of a Bayesian posterior distribution under the framework of numerical optimisation, and provides new insights into the connections between various inference schemes.
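As a hedged illustration of the Newton-style viewpoint, here is only the classical special case: a plain Newton/Laplace update on a one-dimensional log-posterior (Bernoulli likelihood with a shared logit, Gaussian prior). The cited paper casts variational inference, EP, and posterior linearisation as principled variants of this kind of update; nothing below is the paper's algorithm.

import numpy as np

def grad_hess(theta, y, prior_var=1.0):
    # Gradient and Hessian of log p(theta | y) for a shared logit theta:
    # sum_k [y_k*theta - log(1 + e^theta)] - theta^2 / (2*prior_var) + const.
    p = 1.0 / (1.0 + np.exp(-theta))
    grad = np.sum(y - p) - theta / prior_var
    hess = -len(y) * p * (1.0 - p) - 1.0 / prior_var
    return grad, hess

y = np.array([1, 1, 0, 1, 1])      # toy binary observations
theta = 0.0
for _ in range(20):                # Newton iterations converge to the mode
    g, H = grad_hess(theta, y)
    theta -= g / H
print("Laplace mean:", theta, "Laplace variance:", -1.0 / H)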

References

SHOWING 1-10 OF 57 REFERENCES
General linear-time inference for Gaussian Processes on one dimension
TLDR
It is proved that for data sampled on one dimension, approximate GP inference at any desired level of accuracy requires computational effort that scales linearly with the number of observations; this new theorem enables inference on much larger datasets than was previously feasible.
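The linear-time claim is easiest to see in the state-space view. Below is a hedged NumPy sketch (not the cited paper's algorithm) of O(n) GP inference in one dimension via Kalman filtering, using the exact state-space form of the Matérn-1/2 (Ornstein-Uhlenbeck) kernel k(t, t') = s2 * exp(-|t - t'| / ell).

import numpy as np

def kalman_filter_matern12(t, y, ell, s2, noise_var):
    # O(n) filtering: each step costs O(1) because the OU state is scalar.
    n = len(t)
    m, P = 0.0, s2                         # stationary prior N(0, s2)
    means, variances = np.empty(n), np.empty(n)
    for k in range(n):
        if k > 0:                          # predict with the exact OU transition
            a = np.exp(-(t[k] - t[k - 1]) / ell)
            m, P = a * m, a * a * P + s2 * (1.0 - a * a)
        S = P + noise_var                  # innovation variance
        gain = P / S                       # scalar Kalman gain
        m = m + gain * (y[k] - m)
        P = (1.0 - gain) * P
        means[k], variances[k] = m, P
    return means, variances

rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 10, 1000))
y = np.sin(t) + 0.1 * rng.standard_normal(len(t))
mu, var = kalman_filter_matern12(t, y, ell=1.0, s2=1.0, noise_var=0.01)
print("filtered mean at last three inputs:", mu[-3:])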
Tree-structured Gaussian Process Approximations
TLDR
This paper devises an approximation whose complexity grows linearly with the number of pseudo-datapoints, calibrates it using Kullback-Leibler (KL) minimisation, and demonstrates the validity of the approach on a set of challenging regression tasks, including missing-data imputation for audio and spatial datasets.
A Unifying Framework for Gaussian Process Pseudo-Point Approximations using Power Expectation Propagation
TLDR
This paper develops a new pseudo-point approximation framework using Power Expectation Propagation (Power EP) that unifies a large number of these pseudo-point approximations, and demonstrates that the new framework includes new pseudo-point approximation methods that outperform current approaches on regression and classification tasks.
Fast Variational Learning in State-Space Gaussian Process Models
TLDR
This paper provides an efficient JAX implementation which exploits just-in-time compilation, allows for fast automatic differentiation through large for-loops, and leads to fast and stable variational inference in state-space GP models that can be scaled to time series with millions of data points.
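The implementation trick this TLDR highlights (just-in-time compilation plus automatic differentiation through a long filtering loop) can be sketched with jax.lax.scan. This reuses the Ornstein-Uhlenbeck model from the earlier sketch; it is an illustration under stated assumptions, not the paper's code, and all names are illustrative.

import jax
import jax.numpy as jnp

def step(carry, inp, ell=1.0, s2=1.0, noise_var=0.01):
    # One Kalman predict/update; returns the negative log predictive density,
    # so summing the outputs gives a differentiable training objective.
    m, P, t_prev = carry
    t_k, y_k = inp
    a = jnp.exp(-(t_k - t_prev) / ell)
    m_pred, P_pred = a * m, a * a * P + s2 * (1.0 - a * a)
    S = P_pred + noise_var
    nll = 0.5 * (jnp.log(2.0 * jnp.pi * S) + (y_k - m_pred) ** 2 / S)
    gain = P_pred / S
    m_new = m_pred + gain * (y_k - m_pred)
    P_new = (1.0 - gain) * P_pred
    return (m_new, P_new, t_k), nll

@jax.jit
def filter_nll(t, y):
    # lax.scan replaces a large Python for-loop: compiled once, and fast to
    # differentiate with jax.grad for hyperparameter learning.
    init = (0.0, 1.0, t[0])
    _, nlls = jax.lax.scan(step, init, (t, y))
    return jnp.sum(nlls)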
Sparse Spatio-temporal Gaussian Processes with General Likelihoods
TLDR
This paper considers learning of spatio-temporal processes by formulating a Gaussian process model as the solution to an evolution-type stochastic partial differential equation, which is discretised spatially.
An explicit link between Gaussian fields and Gaussian Markov random fields: the stochastic partial differential equation approach
TLDR
It is shown that, using an approximate stochastic weak solution to (linear) stochastic partial differential equations, some Gaussian fields in the Matérn class can provide an explicit link, for any triangulation of ℝ^d, between Gaussian fields (GFs) and Gaussian Markov random fields (GMRFs), formulated as a basis-function representation.
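For reference, the SPDE in question can be stated compactly (standard Whittle-Matérn convention; this is well-known background rather than new material): a Matérn Gaussian field x(s) is the stationary solution of a fractional linear SPDE driven by Gaussian white noise.

\[
  (\kappa^2 - \Delta)^{\alpha/2}\, x(s) = \mathcal{W}(s),
  \qquad s \in \mathbb{R}^d, \quad \nu = \alpha - \tfrac{d}{2},
\]

where \(\nu\) is the Matérn smoothness and \(\kappa > 0\) controls the spatial range.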
Sparse Gaussian Process Variational Autoencoders
TLDR
This work develops the sparse Gaussian process variational autoencoder (SGP-VAE), characterised by the use of partial inference networks for parameterising sparse GP approximations, which enables inference in multi-output sparse GPs on previously unobserved data with no additional training.
State Space Expectation Propagation: Efficient Inference Schemes for Temporal Gaussian Processes
TLDR
This work formulates approximate Bayesian inference in non-conjugate temporal and spatio-temporal Gaussian process models as a simple parameter update rule applied during Kalman smoothing, providing extensive empirical analysis that demonstrates the efficacy of various algorithms under this unifying framework.
Inter-domain Gaussian Processes for Sparse Inference using Inducing Features
TLDR
A general inference framework for inter-domain Gaussian Processes (GPs) is presented; it is shown how previously existing models fit into this framework, and the framework is then used to develop two new sparse GP models.
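The inter-domain construction in this TLDR amounts to defining inducing variables as linear functionals of the process (standard notation, shown for orientation): because Gaussianity is preserved under linear maps, the cross-covariances needed for sparse inference follow directly.

\[
  u_m = \int f(x)\, \phi_m(x)\, \mathrm{d}x,
  \qquad
  \operatorname{Cov}\!\left[u_m, f(x)\right] = \int \phi_m(x')\, k(x', x)\, \mathrm{d}x'.
\]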
Variational Fourier Features for Gaussian Processes
TLDR
This work hinges on a key result that there exist spectral features related to a finite domain of the Gaussian process which exhibit almost-independent covariances; it derives these expressions for Matérn kernels in one dimension, and generalises to more dimensions using kernels with specific structures.
...