Wasserstein-Splitting Gaussian Process Regression for Heterogeneous Online Bayesian Inference

@inproceedings{Kepler2021WassersteinSplittingGP,
  title={Wasserstein-Splitting Gaussian Process Regression for Heterogeneous Online Bayesian Inference},
  author={Michael E. Kepler and Alec Koppel and A. S. Bedi and Daniel J. Stilwell},
  booktitle={2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
  year={2021},
  pages={9833--9840}
}
Gaussian processes (GPs) are a well-known nonparametric Bayesian inference technique, but they suffer from scalability problems for large sample sizes, and their performance can degrade for non-stationary or spatially heterogeneous data. In this work, we seek to overcome these issues through (i) employing variational free energy approximations of GPs operating in tandem with online expectation propagation steps; and (ii) introducing a local splitting step which instantiates a new GP whenever… 
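
The splitting step hinges on a quantity that is cheap to evaluate: the 2-Wasserstein distance between two Gaussians has a closed form. The Python/NumPy sketch below illustrates how such a splitting test could look, assuming local posteriors are summarized as N(mu, Sigma); the should_split helper and its threshold are illustrative assumptions, not values from the paper.

import numpy as np
from scipy.linalg import sqrtm

def wasserstein2_gaussians(mu1, S1, mu2, S2):
    """Closed-form 2-Wasserstein distance between N(mu1, S1) and N(mu2, S2):
    W2^2 = ||mu1 - mu2||^2 + tr(S1 + S2 - 2 (S2^{1/2} S1 S2^{1/2})^{1/2})."""
    cross = sqrtm(sqrtm(S2) @ S1 @ sqrtm(S2))
    w2_sq = np.sum((mu1 - mu2) ** 2) + np.trace(S1 + S2 - 2 * np.real(cross))
    return np.sqrt(max(w2_sq, 0.0))

def should_split(mu_old, S_old, mu_new, S_new, threshold=1.0):
    # Instantiate a new local GP when the posterior has drifted farther
    # than the (assumed) threshold in Wasserstein distance.
    return wasserstein2_gaussians(mu_old, S_old, mu_new, S_new) > threshold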
1 Citation

Ensemble Gaussian Processes for Online Learning Over Graphs With Adaptivity and Scalability

This work capitalizes on Gaussian processes (GPs) to offer a Bayesian semi-supervised learning (SSL) approach over graphs with uncertainty quantification, a key attribute in safety-critical domains, while remaining scalable and preserving privacy.

References

Showing 1–10 of 44 references

Sparse Gaussian Processes using Pseudo-inputs

It is shown that this new Gaussian process (GP) regression model can match full GP performance with a small number M of pseudo-inputs, i.e., very sparse solutions, and that it significantly outperforms other approaches in this regime.
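
The pseudo-input construction conditions the GP on M inducing locations instead of all N training points. Below is a minimal NumPy sketch of the FITC predictive mean, assuming an RBF kernel with unit variance and fixed, pre-chosen inducing locations Z; in the paper, Z and the hyperparameters are instead learned by maximizing the approximate marginal likelihood.

import numpy as np

def rbf(A, B, lengthscale=1.0):
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-0.5 * d2 / lengthscale**2)

def fitc_predict_mean(X, y, Z, Xstar, noise=0.1):
    Kmm = rbf(Z, Z) + 1e-8 * np.eye(len(Z))          # jitter for stability
    Knm = rbf(X, Z)
    q_diag = np.sum(Knm * np.linalg.solve(Kmm, Knm.T).T, axis=1)
    lam = 1.0 - q_diag + noise**2                    # FITC diagonal correction
    A = Kmm + Knm.T @ (Knm / lam[:, None])           # M x M, so O(N M^2) cost
    return rbf(Xstar, Z) @ np.linalg.solve(A, Knm.T @ (y / lam))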

Consistent Online Gaussian Process Regression Without the Sample Complexity Bottleneck

Alec Koppel · 2019 American Control Conference (ACC), 2019
This work develops the first compression sub-routine for online Gaussian processes that preserves their convergence to the population posterior, i.e., asymptotic posterior consistency, while ameliorating their intractable complexity growth with the sample size.
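
That paper's compression rule drops points based on how little they change the posterior. A classical stand-in with the same flavor is the approximate-linear-dependence (ALD) novelty test, sketched below in NumPy; the RBF kernel and the tol threshold are illustrative assumptions, and this is not the paper's exact criterion.

import numpy as np

def rbf(a, b, ls=1.0):
    return np.exp(-0.5 * np.sum((a - b) ** 2) / ls**2)

class BudgetedOnlineGP:
    """Online GP whose dictionary admits a new point only if it is not
    already well represented by the current dictionary (ALD test)."""

    def __init__(self, tol=1e-2, noise=0.1):
        self.X, self.y, self.tol, self.noise = [], [], tol, noise

    def _gram(self):
        return np.array([[rbf(a, b) for b in self.X] for a in self.X])

    def update(self, x, y):
        if not self.X:
            self.X.append(x); self.y.append(y); return
        K = self._gram() + 1e-8 * np.eye(len(self.X))
        k = np.array([rbf(x, b) for b in self.X])
        gamma = rbf(x, x) - k @ np.linalg.solve(K, k)   # residual novelty
        if gamma > self.tol:                            # novel enough: retain
            self.X.append(x); self.y.append(y)

    def predict(self, x):
        # Exact GP mean on the compressed dictionary (recomputed per call
        # for clarity; a practical version would cache the factorization).
        K = self._gram() + self.noise**2 * np.eye(len(self.X))
        k = np.array([rbf(x, b) for b in self.X])
        return k @ np.linalg.solve(K, np.array(self.y))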

Gaussian processes: iterative sparse approximations

This thesis proposes a two-step solution to construct a probabilistic approximation to the posterior of Gaussian processes, and combines the sparse approximation with an extension of the Bayesian online algorithm that allows multiple iterations for each input, thereby approximating a batch solution.

Drifting Gaussian processes with varying neighborhood sizes for online model learning

This work investigates the idea of drifting Gaussian processes, which explicitly exploit the fact that data are generated along trajectories, and proposes to run several drifting GPs in parallel, combining their predictions at query points (see the sketch below).
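
Combining the parallel experts' predictions at a query point can be done with precision-weighted fusion, as in product-of-experts models; the sketch below shows this simple rule, which may differ from the exact combination used in that paper.

import numpy as np

def combine_experts(means, variances):
    # Precision-weighted fusion of per-expert predictive Gaussians
    # at a single query point: confident experts dominate the average.
    precisions = 1.0 / np.asarray(variances)
    var = 1.0 / precisions.sum()
    mean = var * float(precisions @ np.asarray(means))
    return mean, var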

Sparse Orthogonal Variational Inference for Gaussian Processes

A new interpretation of sparse variational approximations for Gaussian processes using inducing points is introduced; it can lead to more scalable algorithms than previous methods and achieves state-of-the-art results on CIFAR-10 among purely GP-based models.

When Gaussian Process Meets Big Data: A Review of Scalable GPs

This article reviews state-of-the-art scalable GPs in two main categories: global approximations that distill the entire dataset and local approximations that divide the data for subspace learning.

Incremental Local Gaussian Regression

A path from Gaussian (process) regression to locally weighted regression is suggested in which the best of both approaches is retained; the method consistently achieves performance on par with or superior to current state-of-the-art methods.

Fast Forward Selection to Speed Up Sparse Gaussian Process Regression

A method for sparse greedy approximation of Bayesian Gaussian process regression, featuring a novel heuristic for very fast forward selection that yields a sufficiently stable approximation of the log marginal likelihood of the training data, which can be optimized to adjust a large number of hyperparameters automatically (a simplified selection loop is sketched below).
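
Forward selection grows the active set one point at a time using a cheap score. The NumPy sketch below uses maximum predictive variance as that score, a simplified stand-in for the paper's information-gain heuristic, assuming an RBF kernel with unit variance.

import numpy as np

def rbf(A, B, ls=1.0):
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-0.5 * d2 / ls**2)

def greedy_forward_select(X, m, noise=0.1):
    active = [0]                         # seed with an arbitrary first point
    for _ in range(m - 1):
        Kaa = rbf(X[active], X[active]) + noise**2 * np.eye(len(active))
        Kxa = rbf(X, X[active])
        # Predictive variance of every training point under the active set.
        var = 1.0 - np.sum(Kxa * np.linalg.solve(Kaa, Kxa.T).T, axis=1)
        var[active] = -np.inf            # never re-select an active point
        active.append(int(np.argmax(var)))
    return active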

Dense Incremental Metric-Semantic Mapping via Sparse Gaussian Process Regression

An online Gaussian process training and inference approach is presented that avoids the complexity of GP classification by regressing a truncated signed distance function representation of the regions occupied by different semantic classes.

Sparse Gaussian Processes for Bayesian Optimization

This work introduces a new updating scheme for the online GP that favors regions with better performance during optimization, and applies the method to optimize the performance of a free-electron laser.