Sparse On-Line Gaussian Processes

@article{Csat2002SparseOG,
  title={Sparse On-Line Gaussian Processes},
  author={L. Csat{\'o} and Manfred Opper},
  journal={Neural Computation},
  year={2002},
  volume={14},
  pages={641-668}
}
We develop an approach for sparse representations of Gaussian process (GP) models (which are Bayesian types of kernel machines) in order to overcome their limitations for large data sets. The method is based on a combination of a Bayesian on-line algorithm and a sequential construction of a relevant subsample of the data that fully specifies the prediction of the GP model. By using an appealing parameterization and projection techniques in a reproducing kernel Hilbert space…
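
For concreteness, here is a minimal numpy sketch of this kind of scheme for Gaussian-noise regression: each incoming point is scored by a "novelty" measure against the current basis-vector (BV) set and is either admitted (full update) or projected onto the existing basis in the RKHS (sparse update). The RBF kernel, the tolerance `tol`, and all function names are illustrative assumptions, not the authors' code.

```python
# A hedged sketch of sparse on-line GP regression with a basis-vector set.
import numpy as np

def rbf(a, b, ell=1.0):
    d = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
    return np.exp(-0.5 * d / ell**2)

def sparse_online_gp(X, y, noise=0.1, tol=1e-3):
    BV = X[:1].copy()                       # basis vectors
    k0 = rbf(BV, BV)[0, 0]
    alpha = np.array([y[0] / (k0 + noise**2)])   # mean coefficients
    C = np.array([[-1.0 / (k0 + noise**2)]])     # covariance coefficients
    for x, t in zip(X[1:], y[1:]):
        x = x[None, :]
        k = rbf(BV, x)[:, 0]                # kernel between BV set and x
        kss = rbf(x, x)[0, 0]
        m = alpha @ k                       # predictive mean at x
        s2 = kss + k @ C @ k                # predictive variance of f(x)
        q = (t - m) / (s2 + noise**2)       # scaled innovation
        r = -1.0 / (s2 + noise**2)
        Kbv = rbf(BV, BV) + 1e-10 * np.eye(len(BV))
        e_hat = np.linalg.solve(Kbv, k)     # projection coefficients
        gamma = kss - k @ e_hat             # novelty of x w.r.t. the BV set
        if gamma > tol:                     # full update: grow the BV set
            BV = np.vstack([BV, x])
            s = np.append(C @ k, 1.0)
            alpha = np.append(alpha, 0.0) + q * s
            C = np.pad(C, ((0, 1), (0, 1))) + r * np.outer(s, s)
        else:                               # sparse update: project onto BV
            s = C @ k + e_hat
            alpha = alpha + q * s
            C = C + r * np.outer(s, s)
    # prediction at x*: mean = alpha @ rbf(BV, x_star)[:, 0]
    return BV, alpha, C
```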

Efficient Nonparametric Bayesian Modelling with Sparse Gaussian Process Approximations

A general framework based on the informative vector machine (IVM) is presented, and it is shown how the complete Bayesian task of inference and learning of free hyperparameters can be performed in a practically efficient manner.

Fast Sparse Gaussian Process Methods: The Informative Vector Machine

A framework for sparse Gaussian process (GP) methods is presented that uses forward selection with criteria based on information-theoretic principles, allows for Bayesian model selection, and is simpler to implement.
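
As a rough illustration of such a criterion, the sketch below (hypothetical names, Gaussian-noise regression assumed) greedily selects the point whose inclusion gives the largest differential-entropy reduction; for a Gaussian posterior this score is a monotone function of the point's current predictive variance.

```python
# A hedged sketch of information-theoretic forward selection, IVM-style.
import numpy as np

def rbf(a, b, ell=1.0):
    d = np.sum((a[:, None, :] - b[None, :, :]) ** 2, axis=-1)
    return np.exp(-0.5 * d / ell**2)

def ivm_select(X, noise=0.1, m=20):
    K = rbf(X, X)
    active = []                              # indices of the active set
    for _ in range(m):
        if active:
            Ka = K[np.ix_(active, active)] + noise**2 * np.eye(len(active))
            Kxa = K[:, active]
            var = np.diag(K) - np.einsum(
                "ij,ij->i", Kxa, np.linalg.solve(Ka, Kxa.T).T)
        else:
            var = np.diag(K).copy()
        # differential-entropy gain: 0.5 * log(1 + var_i / noise^2)
        score = 0.5 * np.log1p(var / noise**2)
        score[active] = -np.inf              # never re-select a point
        active.append(int(np.argmax(score)))
    return active
```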

Sparse Orthogonal Variational Inference for Gaussian Processes

A new interpretation of sparse variational approximations for Gaussian processes using inducing points is introduced, which can lead to more scalable algorithms than previous methods; state-of-the-art results among purely GP-based models are reported on CIFAR-10.

MCMC for Variationally Sparse Gaussian Processes

A Hybrid Monte Carlo sampling scheme is presented that allows for a non-Gaussian approximation over the function values and covariance parameters simultaneously, with efficient computations based on inducing-point sparse GPs.

Variational Fourier Features for Gaussian Processes

This work hinges on a key result: there exist spectral features related to a finite domain of the Gaussian process which exhibit almost-independent covariances. These expressions are derived for Matérn kernels in one dimension and generalized to more dimensions using kernels with specific structures.

Fast Forward Selection to Speed Up Sparse Gaussian Process Regression

A method for the sparse greedy approximation of Bayesian Gaussian process regression is presented, featuring a novel heuristic for very fast forward selection that yields a sufficiently stable approximation of the log marginal likelihood of the training data, which can be optimised to adjust a large number of hyperparameters automatically.

Sequential, Sparse Learning in Gaussian Processes

This paper presents a recently developed Bayesian method for estimating the mean and covariance structures of a Gaussian process using a sequential learning algorithm which attempts to minimise the relative entropy between the true posterior process and the approximating Gaussian process.

Sparse Spectrum Gaussian Process Regression

The achievable trade-offs between predictive accuracy and computational requirements are compared, and it is shown that the trade-offs achieved by the sparse spectrum approximation are typically superior to those of existing state-of-the-art sparse approximations.
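
The construction lends itself to a compact sketch: under an RBF kernel the spectral density is Gaussian, the kernel is replaced by trigonometric features at sampled frequencies, and inference reduces to Bayesian linear regression. The sketch below samples (rather than optimises) the frequencies, and all names are illustrative assumptions.

```python
# A hedged sketch of sparse-spectrum GP regression with an RBF kernel.
import numpy as np

def ssgp_fit_predict(X, y, Xs, m=50, ell=1.0, sf=1.0, noise=0.1, seed=0):
    rng = np.random.default_rng(seed)
    W = rng.normal(0.0, 1.0 / ell, size=(m, X.shape[1]))  # spectral points

    def phi(A):                                # 2m trigonometric features
        P = A @ W.T
        return np.hstack([np.cos(P), np.sin(P)]) * np.sqrt(sf**2 / m)

    F, Fs = phi(X), phi(Xs)
    A = F.T @ F + noise**2 * np.eye(2 * m)     # scaled posterior precision
    mean = Fs @ np.linalg.solve(A, F.T @ y)    # predictive mean at Xs
    var = noise**2 * np.einsum(
        "ij,ij->i", Fs, np.linalg.solve(A, Fs.T).T) + noise**2
    return mean, var
```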

Local and global sparse Gaussian process approximations

This paper develops a new sparse GP approximation which is a combination of both the global and local approaches, and shows that it can be derived as a natural extension of the framework developed by Quiñonero-Candela and Rasmussen for sparse GP approximations.

Sparse Information Filter for Fast Gaussian Process Regression

This paper focuses on GP regression tasks and proposes a new algorithm to train variational sparse GP models that achieves performance comparable to SVGP and SOLVEGP while providing considerable speed-ups.
...

References

Finite-Dimensional Approximation of Gaussian Processes

This work derives optimal finite-dimensional predictors under a number of assumptions, and shows the superiority of these predictors over the Projected Bayes Regression method (which is asymptotically optimal).

Gaussian Processes for Regression

This paper investigates the use of Gaussian process priors over functions, which permit the predictive Bayesian analysis for fixed values of hyperparameters to be carried out exactly using matrix operations.
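
Those matrix operations are the standard GP predictive equations; a minimal Cholesky-based sketch follows, where the kernel function kern is assumed to be supplied by the caller.

```python
# A hedged sketch of exact GP regression via Cholesky factorisation.
import numpy as np

def gp_predict(X, y, Xs, kern, noise=0.1):
    K = kern(X, X) + noise**2 * np.eye(len(X))
    Ks, Kss = kern(X, Xs), kern(Xs, Xs)
    L = np.linalg.cholesky(K)                 # O(n^3) factorisation
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    v = np.linalg.solve(L, Ks)
    mean = Ks.T @ alpha                       # predictive mean
    cov = Kss - v.T @ v                       # predictive covariance
    return mean, cov
```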

Efficient Approaches to Gaussian Process Classification

Three simple approximations for the calculation of the posterior mean in Gaussian process classification are presented, based on a Bayesian online approach motivated by recent results in the statistical mechanics of neural networks.

Bayesian Classification With Gaussian Processes

A Bayesian treatment is provided, integrating over uncertainty in y and in the parameters that control the Gaussian process prior; the necessary integration over y is carried out using Laplace's approximation, and the method is generalized to multiclass problems (m>2) using the softmax function.
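
The Laplace step mentioned here amounts to Newton iterations for the posterior mode of the latent values; a hedged sketch for the binary case with a logistic likelihood follows (the multiclass/softmax case is analogous but omitted, and the names are illustrative).

```python
# A hedged sketch of the Newton iteration for the Laplace approximation
# in binary GP classification with labels t in {-1, +1}.
import numpy as np

def laplace_mode(K, t, iters=20):
    f = np.zeros(len(t))                       # latent values
    for _ in range(iters):
        pi = 1.0 / (1.0 + np.exp(-f))          # sigmoid
        g = (t + 1) / 2 - pi                   # gradient of log-likelihood
        W = pi * (1 - pi)                      # negative Hessian (diagonal)
        B = np.eye(len(t)) + np.sqrt(W)[:, None] * K * np.sqrt(W)[None, :]
        b = W * f + g
        a = b - np.sqrt(W) * np.linalg.solve(B, np.sqrt(W) * (K @ b))
        f = K @ a                              # Newton update of the mode
    return f
```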

Approximation Bounds for Some Sparse Kernel Regression Algorithms

Properties of certain sparse regression algorithms that approximately solve Gaussian process regression are investigated; approximation bounds are obtained, and the results are compared with related methods.

The kernel recursive least-squares algorithm

A nonlinear version of the recursive least squares (RLS) algorithm is presented that uses a sequential sparsification process, admitting a new input sample into the kernel representation only if its feature-space image cannot be sufficiently well approximated by combining the images of previously admitted samples.
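
The admission test this describes is an approximate-linear-dependence criterion; a minimal sketch is given below, where kern and the threshold nu are illustrative assumptions.

```python
# A hedged sketch of the approximate-linear-dependence admission test.
import numpy as np

def ald_admit(D, x, kern, nu=1e-2):
    """Return (admit?, projection coefficients) for candidate x."""
    if len(D) == 0:
        return True, None
    Kdd = kern(D, D) + 1e-10 * np.eye(len(D))
    kdx = kern(D, x[None, :])[:, 0]
    a = np.linalg.solve(Kdd, kdx)              # best approximation coeffs
    delta = kern(x[None, :], x[None, :])[0, 0] - kdx @ a
    return delta > nu, a                       # residual error vs. threshold
```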

Bayesian Model Selection for Support Vector Machines, Gaussian Processes and Other Kernel Classifiers

We present a variational Bayesian method for model selection over families of kernel classifiers such as Support Vector Machines or Gaussian processes. The algorithm needs no user interaction and is…

Bayesian analysis of the scatterometer wind retrieval inverse problem: some new approaches

It is shown how Gaussian process priors can be used efficiently with a variety of likelihood models, using local forward (observation) models and direct inverse models for the scatterometer, and an enhanced Markov chain Monte Carlo method is presented to sample from the resulting multimodal posterior distribution.

Gaussian processes and SVM: Mean field results and leave-one-out

In this chapter, we elaborate on the well-known relationship between Gaussian processes (GP) and Support Vector Machines (SVM). Secondly, we present approximate solutions for two computational…

The Relevance Vector Machine

The Relevance Vector Machine (RVM) is introduced: a Bayesian treatment of a generalised linear model of identical functional form to the SVM. Examples demonstrate that, for comparable generalisation performance, the RVM requires dramatically fewer kernel functions.