Sparse On-Line Gaussian Processes

@article{Csat2002SparseOG,
  title={Sparse On-Line Gaussian Processes},
  author={L. Csat{\'o} and M. Opper},
  journal={Neural Computation},
  year={2002},
  volume={14},
  pages={641--668}
}
We develop an approach for sparse representations of Gaussian process (GP) models (which are Bayesian types of kernel machines) in order to overcome their limitations for large data sets. The method is based on a combination of a Bayesian on-line algorithm, together with a sequential construction of a relevant subsample of the data that fully specifies the prediction of the GP model. By using an appealing parameterization and projection techniques in a reproducing kernel Hilbert space…
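The sparsification idea in the abstract can be sketched in code. The following is a simplified subset-of-data variant with a projection-residual novelty test, not the paper's exact update equations: the RBF kernel, `tol`, and `noise` values are illustrative assumptions, and points that fail the novelty test are simply discarded, whereas the Csató–Opper algorithm projects them onto the basis to update the posterior.

```python
import numpy as np

def rbf(a, b, ell=0.5):
    """Squared-exponential kernel (an illustrative choice)."""
    return np.exp(-0.5 * (a - b) ** 2 / ell ** 2)

class SparseOnlineGP:
    """Toy on-line GP regressor with a novelty-based sparse basis.

    A new input joins the basis only if its feature-space image is
    poorly approximated by the span of the current basis, i.e. the
    projection residual gamma = k(x,x) - k^T K^{-1} k exceeds tol.
    """

    def __init__(self, kernel=rbf, tol=1e-2, noise=1e-2):
        self.kernel, self.tol, self.noise = kernel, tol, noise
        self.bx, self.by = [], []   # basis inputs and targets
        self.alpha = None           # predictive-mean weights

    def _gram(self):
        return np.array([[self.kernel(a, b) for b in self.bx] for a in self.bx])

    def _kvec(self, x):
        return np.array([self.kernel(x, b) for b in self.bx])

    def predict(self, x):
        return 0.0 if self.alpha is None else float(self._kvec(x) @ self.alpha)

    def update(self, x, y):
        if self.bx:
            k = self._kvec(x)
            e = np.linalg.solve(self._gram() + 1e-10 * np.eye(len(self.bx)), k)
            gamma = self.kernel(x, x) - k @ e   # projection residual
            if gamma <= self.tol:
                return                          # well represented: skip
        self.bx.append(x)
        self.by.append(y)
        # Refit the mean weights on the (small) basis set: O(m^3) per add.
        K = self._gram() + self.noise * np.eye(len(self.bx))
        self.alpha = np.linalg.solve(K, np.array(self.by))

# Usage: stream 200 noiseless samples of sin; far fewer basis points survive.
gp = SparseOnlineGP()
for x in np.linspace(0.0, 2 * np.pi, 200):
    gp.update(x, np.sin(x))
```

Because the residual onto a growing span can only shrink, the basis stays well separated in feature space, which also keeps the Gram matrix well conditioned.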
Fast Sparse Gaussian Process Methods: The Informative Vector Machine
Presents a framework for sparse Gaussian process (GP) methods that uses forward selection with criteria based on information-theoretic principles, allows for Bayesian model selection, and is simpler to implement.
Efficient Nonparametric Bayesian Modelling with Sparse Gaussian Process Approximations
Sparse approximations to Bayesian inference for nonparametric Gaussian process models scale linearly in the number of training points, allowing for the application of powerful kernel-based models to…
Sparse Orthogonal Variational Inference for Gaussian Processes
Introduces a new interpretation of sparse variational approximations for Gaussian processes using inducing points, which can lead to more scalable algorithms than previous methods, and reports state-of-the-art results on CIFAR-10 among purely GP-based models.
MCMC for Variationally Sparse Gaussian Processes
Proposes a hybrid Monte Carlo sampling scheme that allows for a non-Gaussian approximation over the function values and covariance parameters simultaneously, with efficient computations based on inducing-point sparse GPs.
Fast Forward Selection to Speed Up Sparse Gaussian Process Regression
Presents a method for the sparse greedy approximation of Bayesian Gaussian process regression, featuring a novel heuristic for very fast forward selection; the resulting approximation of the log marginal likelihood of the training data is stable enough to be optimised, adjusting a large number of hyperparameters automatically.
Variational Fourier Features for Gaussian Processes
Hinges on a key result that there exist spectral features related to a finite domain of the Gaussian process which exhibit almost-independent covariances; derives these expressions for Matérn kernels in one dimension and generalizes them to more dimensions using kernels with specific structures.
Sequential, Sparse Learning in Gaussian Processes
The application of Gaussian processes (or Gaussian random field models in the spatial context) has historically been limited to datasets of a small size. This limitation is imposed by the requirement…
Sparse Spectrum Gaussian Process Regression
Compares the achievable trade-offs between predictive accuracy and computational requirements, showing that these are typically superior to existing state-of-the-art sparse approximations.
Local and global sparse Gaussian process approximations
Develops a new sparse GP approximation that combines the global and local approaches, and shows that it arises as a natural extension of the framework developed by Quiñonero-Candela and Rasmussen for sparse GP approximations.
Sparse Information Filter for Fast Gaussian Process Regression
Gaussian processes (GPs) are an important tool in machine learning and applied mathematics with applications ranging from Bayesian optimization to calibration of computer experiments. They constitute…

References

Showing 1–10 of 39 references
Finite-Dimensional Approximation of Gaussian Processes
Derives optimal finite-dimensional predictors under a number of assumptions and shows the superiority of these predictors over the Projected Bayes Regression method (which is asymptotically optimal).
Gaussian Processes for Regression
Investigates the use of Gaussian process priors over functions, which permit the predictive Bayesian analysis for fixed values of hyperparameters to be carried out exactly using matrix operations.
Efficient Approaches to Gaussian Process Classification
Presents three simple approximations for the calculation of the posterior mean in Gaussian process classification, based on a Bayesian on-line approach motivated by recent results in the statistical mechanics of neural networks.
Bayesian Classification With Gaussian Processes
Provides a Bayesian treatment, integrating over uncertainty in y and in the parameters that control the Gaussian process prior; the necessary integration over y is carried out using Laplace's approximation, and the method is generalized to multiclass problems (m > 2) using the softmax function.
Approximation Bounds for Some Sparse Kernel Regression Algorithms
  • Tong Zhang
  • Neural Computation, 2002
Properties of certain sparse regression algorithms that approximately solve a Gaussian process are investigated; approximation bounds are obtained, and the results are compared with those of related methods.
The kernel recursive least-squares algorithm
Derives a nonlinear version of the recursive least-squares (RLS) algorithm that uses a sequential sparsification process, admitting a new input sample into the kernel representation only if its feature-space image cannot be sufficiently well approximated by combining the images of previously admitted samples.
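The admission test described in this entry (often called approximate linear dependence) is compact enough to sketch. The helper below is an illustration, not the paper's implementation; `K_inv` is the inverse Gram matrix of the current dictionary, and the RBF kernel and sample points in the usage lines are assumptions:

```python
import numpy as np

def ald_residual(K_inv, k_vec, k_xx):
    """Residual of projecting a new sample's feature-space image onto
    the span of the current dictionary (approximate linear dependence).

    K_inv : inverse Gram matrix of the dictionary
    k_vec : kernel evaluations between the new sample and the dictionary
    k_xx  : kernel evaluation of the new sample with itself
    """
    e = K_inv @ k_vec        # best projection coefficients
    return k_xx - k_vec @ e  # 0 means exactly representable

# Usage with an RBF kernel and a one-element dictionary at x = 0
# (Gram matrix of {0.0} is [[1.0]], so its inverse is too):
rbf = lambda a, b: np.exp(-0.5 * (a - b) ** 2)
K_inv = np.array([[1.0]])
near = ald_residual(K_inv, np.array([rbf(0.05, 0.0)]), 1.0)  # small: reject
far = ald_residual(K_inv, np.array([rbf(4.0, 0.0)]), 1.0)    # near 1: admit
```

A sample is admitted into the dictionary only when this residual exceeds a chosen threshold, which is what keeps the kernel representation sparse.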
Bayesian Model Selection for Support Vector Machines, Gaussian Processes and Other Kernel Classifiers
  • M. Seeger
  • NIPS, 1999
We present a variational Bayesian method for model selection over families of kernel classifiers like Support Vector Machines or Gaussian processes. The algorithm needs no user interaction and is…
Bayesian analysis of the scatterometer wind retrieval inverse problem: some new approaches
The retrieval of wind vectors from satellite scatterometer observations is a non-linear inverse problem. A common approach to solving inverse problems is to adopt a Bayesian framework and to infer…
Gaussian processes and SVM: Mean field results and leave-one-out
In this chapter, we elaborate on the well-known relationship between Gaussian processes (GP) and Support Vector Machines (SVM). Secondly, we present approximate solutions for two computational…
The Relevance Vector Machine
Introduces the Relevance Vector Machine, a Bayesian treatment of a generalised linear model of identical functional form to the SVM; examples demonstrate that, for comparable generalisation performance, the RVM requires dramatically fewer kernel functions.