# Sparse On-Line Gaussian Processes

```bibtex
@article{Csat2002SparseOG,
  title   = {Sparse On-Line Gaussian Processes},
  author  = {L. Csat{\'o} and M. Opper},
  journal = {Neural Computation},
  year    = {2002},
  volume  = {14},
  pages   = {641--668}
}
```

We develop an approach for sparse representations of Gaussian process (GP) models (which are Bayesian types of kernel machines) in order to overcome their limitations for large data sets. The method is based on a combination of a Bayesian on-line algorithm together with a sequential construction of a relevant subsample of the data that fully specifies the prediction of the GP model. By using an appealing parameterization and projection techniques in a reproducing kernel Hilbert space…
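The construction the abstract describes — keep only a small "basis vector" subsample that fully specifies the GP prediction, admitting a new point only when the current set represents it poorly — can be illustrated with a minimal sketch. This is a simplification for illustration, not the paper's exact KL-projection update: the admission rule below is a plain residual (approximate-linear-dependence style) test, the RBF kernel and thresholds are assumptions, and prediction is kernel ridge regression on the basis set.

```python
import numpy as np

def rbf(X, Y, ell=1.0):
    """Squared-exponential kernel matrix between row sets X and Y."""
    d = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d / ell**2)

class SparseOnlineGP:
    """Online GP regression keeping only a small set of basis vectors.

    A point is admitted only if its feature-space image is not well
    approximated by the current basis (residual test); this simplifies
    the paper's KL-projection update for illustration.
    """
    def __init__(self, noise=0.1, tol=1e-2):
        self.noise, self.tol = noise, tol
        self.BV = None   # basis vectors (inputs)
        self.y = None    # targets of admitted points

    def update(self, x, t):
        x = np.atleast_2d(x)
        if self.BV is None:
            self.BV, self.y = x, np.array([t], dtype=float)
            return
        k = rbf(self.BV, x).ravel()
        K = rbf(self.BV, self.BV)
        a = np.linalg.solve(K + 1e-6 * np.eye(len(K)), k)
        gamma = rbf(x, x)[0, 0] - k @ a   # residual of projecting x onto BV
        if gamma > self.tol:              # novel enough: grow the basis
            self.BV = np.vstack([self.BV, x])
            self.y = np.append(self.y, t)

    def predict(self, Xs):
        """Kernel ridge prediction using only the sparse basis set."""
        K = rbf(self.BV, self.BV) + self.noise**2 * np.eye(len(self.BV))
        return rbf(np.atleast_2d(Xs), self.BV) @ np.linalg.solve(K, self.y)
```

Sweeping 200 samples of a sine curve through `update` retains only a few dozen basis vectors while still predicting the function accurately — the behaviour the paper exploits for large data sets.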

#### 674 Citations

Fast Sparse Gaussian Process Methods: The Informative Vector Machine

- Computer Science
- NIPS
- 2002

A framework for sparse Gaussian process (GP) methods is presented that uses forward selection with criteria based on information-theoretic principles, allows for Bayesian model selection, and is simpler to implement.

Efficient Nonparametric Bayesian Modelling with Sparse Gaussian Process Approximations

- Mathematics
- 2006

Sparse approximations to Bayesian inference for nonparametric Gaussian Process models scale linearly in the number of training points, allowing for the application of powerful kernel-based models to…

Sparse Orthogonal Variational Inference for Gaussian Processes

- Computer Science, Mathematics
- AISTATS
- 2020

A new interpretation of sparse variational approximations for Gaussian processes using inducing points is introduced, which can lead to more scalable algorithms than previous methods; state-of-the-art results on CIFAR-10 among purely GP-based models are reported.

MCMC for Variationally Sparse Gaussian Processes

- Computer Science, Mathematics
- NIPS
- 2015

A hybrid Monte Carlo sampling scheme which allows for a non-Gaussian approximation over the function values and covariance parameters simultaneously, with efficient computations based on inducing-point sparse GPs.

Fast Forward Selection to Speed Up Sparse Gaussian Process Regression

- Computer Science
- AISTATS
- 2003

A method for the sparse greedy approximation of Bayesian Gaussian process regression, featuring a novel heuristic for very fast forward selection. The heuristic leads to a sufficiently stable approximation of the log marginal likelihood of the training data, which can be optimised to adjust a large number of hyperparameters automatically.

Variational Fourier Features for Gaussian Processes

- Computer Science, Mathematics
- J. Mach. Learn. Res.
- 2017

This work hinges on a key result that there exist spectral features related to a finite domain of the Gaussian process which exhibit almost-independent covariances; these expressions are derived for Matérn kernels in one dimension and generalized to more dimensions using kernels with specific structures.

Sequential, Sparse Learning in Gaussian Processes

The application of Gaussian processes (or Gaussian random field models in the spatial context) has historically been limited to datasets of a small size. This limitation is imposed by the requirement…

Sparse Spectrum Gaussian Process Regression

- Mathematics, Computer Science
- J. Mach. Learn. Res.
- 2010

The achievable trade-offs between predictive accuracy and computational requirements are compared, and it is shown that these are typically superior to existing state-of-the-art sparse approximations.

Local and global sparse Gaussian process approximations

- Computer Science
- AISTATS
- 2007

This paper develops a new sparse GP approximation which is a combination of both the global and local approaches, and shows that it is derived as a natural extension of the framework developed by Quiñonero-Candela and Rasmussen for sparse GP approximations.

Sparse Information Filter for Fast Gaussian Process Regression

- 2021

Gaussian processes (GPs) are an important tool in machine learning and applied mathematics with applications ranging from Bayesian optimization to calibration of computer experiments. They constitute…

#### References

Showing 1–10 of 39 references

Finite-Dimensional Approximation of Gaussian Processes

- Mathematics, Computer Science
- NIPS
- 1998

This work derives optimal finite-dimensional predictors under a number of assumptions, and shows the superiority of these predictors over the Projected Bayes Regression method (which is asymptotically optimal).

Gaussian Processes for Regression

- Mathematics, Computer Science
- NIPS
- 1995

This paper investigates the use of Gaussian process priors over functions, which permit the predictive Bayesian analysis for fixed values of hyperparameters to be carried out exactly using matrix operations.
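The "exact predictive analysis using matrix operations" mentioned here amounts to the standard GP regression equations for the predictive mean and variance. A minimal sketch, assuming an RBF kernel with illustrative hyperparameters:

```python
import numpy as np

def gp_predict(X, y, Xs, ell=1.0, noise=0.1):
    """Exact GP regression with an RBF kernel: predictive mean and
    variance at test inputs Xs, computed with plain matrix operations."""
    def k(A, B):
        d = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d / ell**2)
    K = k(X, X) + noise**2 * np.eye(len(X))  # noisy training covariance
    Ks = k(Xs, X)                            # test-train cross-covariance
    mean = Ks @ np.linalg.solve(K, y)
    # predictive variance: k(x*, x*) - k*^T (K + sigma^2 I)^{-1} k*
    var = 1.0 - np.einsum('ij,ji->i', Ks, np.linalg.solve(K, Ks.T))
    return mean, var
```

The O(n³) solve against the full kernel matrix `K` is exactly the cost that the sparse approaches surveyed on this page aim to avoid.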

Efficient Approaches to Gaussian Process Classification

- Mathematics, Computer Science
- NIPS
- 1999

Three simple approximations for the calculation of the posterior mean in Gaussian process classification are presented, based on a Bayesian online approach motivated by recent results in the statistical mechanics of neural networks.

Bayesian Classification With Gaussian Processes

- Mathematics, Computer Science
- IEEE Trans. Pattern Anal. Mach. Intell.
- 1998

A Bayesian treatment is provided, integrating over uncertainty in y and in the parameters that control the Gaussian process prior; the necessary integration over y is carried out using Laplace's approximation, and the method is generalized to multiclass problems (m > 2) using the softmax function.

Approximation Bounds for Some Sparse Kernel Regression Algorithms

- Mathematics, Computer Science
- Neural Computation
- 2002

Properties of certain sparse regression algorithms that approximately solve a Gaussian process are investigated; approximation bounds are obtained and the results are compared with related methods.

The kernel recursive least-squares algorithm

- Mathematics, Computer Science
- IEEE Transactions on Signal Processing
- 2004

A nonlinear version of the recursive least squares (RLS) algorithm that uses a sequential sparsification process that admits into the kernel representation a new input sample only if its feature space image cannot be sufficiently well approximated by combining the images of previously admitted samples.
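The admission rule described here is the approximate linear dependence (ALD) test: a candidate joins the dictionary only if the squared residual of projecting its feature-space image onto the existing dictionary exceeds a threshold. A hedged sketch, with the kernel and the threshold `nu` as illustrative assumptions:

```python
import numpy as np

def ald_test(dictionary, x, kernel, nu=1e-3):
    """Return (admit, residual): admit x only if its feature-space image
    cannot be well approximated by the images of the dictionary points."""
    if len(dictionary) == 0:
        return True, float(kernel(x, x))
    K = np.array([[kernel(a, b) for b in dictionary] for a in dictionary])
    k = np.array([kernel(a, x) for a in dictionary])
    coeffs = np.linalg.solve(K + 1e-10 * np.eye(len(K)), k)
    delta = float(kernel(x, x) - k @ coeffs)  # squared projection residual
    return delta > nu, delta
```

A duplicate of an existing dictionary point yields a residual near zero and is rejected, while a distant point is nearly orthogonal in feature space and is admitted.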

Bayesian Model Selection for Support Vector Machines, Gaussian Processes and Other Kernel Classifiers

- Mathematics, Computer Science
- NIPS
- 1999

We present a variational Bayesian method for model selection over families of kernel classifiers such as support vector machines or Gaussian processes. The algorithm needs no user interaction and is…

Bayesian analysis of the scatterometer wind retrieval inverse problem: some new approaches

- Mathematics
- 2004

The retrieval of wind vectors from satellite scatterometer observations is a non-linear inverse problem. A common approach to solving inverse problems is to adopt a Bayesian framework and to infer…

Gaussian processes and SVM: Mean field results and leave-one-out

- Mathematics
- 2000

In this chapter, we elaborate on the well-known relationship between Gaussian processes (GP) and Support Vector Machines (SVM). Secondly, we present approximate solutions for two computational…

The Relevance Vector Machine

- Mathematics, Computer Science
- NIPS
- 1999

The Relevance Vector Machine is introduced, a Bayesian treatment of a generalised linear model of identical functional form to the SVM, and examples demonstrate that for comparable generalisation performance, the RVM requires dramatically fewer kernel functions.