Gaussian Processes for Big Data
Stochastic variational inference for Gaussian process models is introduced, and it is shown how a GP can be variationally decomposed to depend on a set of globally relevant inducing variables, which factorize the model in the manner required for stochastic variational inference.
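The key property can be sketched with toy stand-in numbers (this is not the paper's actual bound): once inducing variables are introduced, the variational objective decomposes into a sum of per-data-point terms plus a data-independent KL term, so an unbiased estimate of the bound can be formed from a minibatch.

```python
import numpy as np

rng = np.random.default_rng(0)
N, batch = 1000, 50

# Toy stand-in for the N per-data-point expected log-likelihood terms
# of a sparse-GP variational bound (the real terms come from q(u)).
per_point_terms = rng.normal(size=N)
kl_term = 3.7  # stands in for KL[q(u) || p(u)]; independent of the data

full_bound = per_point_terms.sum() - kl_term

# Unbiased minibatch estimate: rescale the sampled sum by N / |B|
idx = rng.choice(N, size=batch, replace=False)
minibatch_bound = (N / batch) * per_point_terms[idx].sum() - kl_term
```

Because the minibatch estimate is unbiased, stochastic gradient methods on this objective converge to the same solution as full-batch optimization, at a per-step cost independent of N.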
Scalable Variational Gaussian Process Classification
This work shows how to scale Gaussian process classification within a variational inducing-point framework, outperforming the state of the art on benchmark datasets and enabling classification on problems with millions of data points.
GPflow: A Gaussian Process Library using TensorFlow
- A. G. D. G. Matthews, Mark van der Wilk, J. Hensman
- Computer Science, J. Mach. Learn. Res.
- 27 October 2016
GPflow is a Gaussian process library that uses TensorFlow for its core computations and Python for its front end; it places a particular emphasis on software testing and is able to exploit GPU hardware.
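As a hedged illustration of the core computation such a library wraps (plain NumPy here, not GPflow's actual API), GP regression with a squared-exponential kernel reduces to a linear solve against the noisy kernel matrix; the kernel hyperparameters below are illustrative choices.

```python
import numpy as np

def rbf(X1, X2, variance=1.0, lengthscale=0.2):
    """Squared-exponential (RBF) kernel matrix between two sets of inputs."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_posterior_mean(X, Y, Xnew, noise=0.1):
    """Posterior mean of GP regression: K(Xnew, X) (K(X, X) + noise I)^-1 Y."""
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(Xnew, X)
    return Ks @ np.linalg.solve(K, Y)

X = np.linspace(0.0, 1.0, 20)[:, None]
Y = np.sin(2.0 * np.pi * X)
mu = gp_posterior_mean(X, Y, X)  # smoothed reconstruction of the training targets
```

In a library setting, these linear-algebra operations are expressed in TensorFlow so that hyperparameter gradients come from automatic differentiation and the solves can run on a GPU.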
MCMC for Variationally Sparse Gaussian Processes
A hybrid Monte Carlo sampling scheme is introduced which allows a non-Gaussian approximation over the function values and covariance parameters simultaneously, with efficient computations based on inducing-point sparse GPs.
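As a minimal sketch of the Hamiltonian (hybrid) Monte Carlo transition itself, applied to a toy Gaussian target rather than the paper's sparse-GP posterior (step size and trajectory length are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)

def hmc_step(x, grad_logp, logp, step=0.1, n_leapfrog=20):
    """One HMC transition: leapfrog integration plus Metropolis correction."""
    p = rng.normal(size=x.shape)            # resample momentum
    x_new, p_new = x.copy(), p.copy()
    p_new += 0.5 * step * grad_logp(x_new)  # half step for momentum
    for _ in range(n_leapfrog - 1):
        x_new += step * p_new
        p_new += step * grad_logp(x_new)
    x_new += step * p_new
    p_new += 0.5 * step * grad_logp(x_new)  # final half step
    # accept/reject on the change in total energy keeps the target invariant
    log_accept = (logp(x_new) - 0.5 * p_new @ p_new) - (logp(x) - 0.5 * p @ p)
    return x_new if np.log(rng.uniform()) < log_accept else x

# toy target: 2-D standard normal
logp = lambda x: -0.5 * x @ x
grad = lambda x: -x

x = np.zeros(2)
samples = []
for _ in range(2000):
    x = hmc_step(x, grad, logp)
    samples.append(x.copy())
S = np.array(samples[500:])  # discard burn-in
```

In the paper's setting the target density is the joint posterior over inducing outputs and covariance hyperparameters, with the sparse-GP structure keeping each gradient evaluation cheap.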
Variational Fourier Features for Gaussian Processes
This work hinges on a key result: there exist spectral features, related to a finite domain of the Gaussian process, which exhibit almost-independent covariances. These expressions are derived for Matérn kernels in one dimension and generalized to higher dimensions using kernels with specific structures.
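The paper's variational Fourier features are a specific inter-domain construction; as a generic illustration of approximating a kernel with spectral features, here are the classic random Fourier features of Rahimi and Recht for the RBF kernel (a different, Monte Carlo construction, shown only to convey the feature-map idea):

```python
import numpy as np

rng = np.random.default_rng(0)

def rff_features(X, n_features=4000, lengthscale=0.5):
    """Random Fourier features: Phi @ Phi.T approximates the RBF kernel matrix."""
    d = X.shape[1]
    # sample frequencies from the kernel's spectral density (Gaussian for RBF)
    W = rng.normal(scale=1.0 / lengthscale, size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

X = rng.normal(size=(30, 1))
Phi = rff_features(X)
K_approx = Phi @ Phi.T
K_exact = np.exp(-0.5 * (X - X.T) ** 2 / 0.5**2)
```

Where random features sample frequencies, the variational Fourier features of the paper place them deterministically and treat the feature-space quantities as inducing variables in the variational bound.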
Convolutional Gaussian Processes
It is shown how the marginal likelihood can be used to find an optimal weighting between convolutional and RBF kernels to further improve performance; the authors hope that this illustration of the usefulness of the marginal likelihood will help automate architecture discovery in larger models.
On Sparse Variational Methods and the Kullback-Leibler Divergence between Stochastic Processes
- A. G. D. G. Matthews, J. Hensman, Richard E. Turner, Zoubin Ghahramani
- Computer Science, AISTATS
- 27 April 2015
A substantial generalization of the literature on the variational framework for learning inducing variables is given, together with a new proof of the result for infinite index sets, allowing inducing points that are not data points and likelihoods that depend on all function values.
Natural Gradients in Practice: Non-Conjugate Variational Inference in Gaussian Process Models
It is shown how natural gradients can be computed efficiently and automatically in any parameterization using automatic differentiation, and it is concluded that the natural gradient can significantly improve performance in terms of wall-clock time.
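A hand-rolled toy illustration (not the paper's method or any library API): for an exponential-family q, the natural gradient is the ordinary gradient preconditioned by the inverse Fisher matrix, and in the conjugate Gaussian case below a single natural-gradient step of length one lands exactly on the optimum of KL(q || p).

```python
import numpy as np

def kl_gauss(m_q, v_q, m_p, v_p):
    """KL divergence between two univariate Gaussians."""
    return 0.5 * (np.log(v_p / v_q) + (v_q + (m_q - m_p) ** 2) / v_p - 1.0)

def nat_to_mv(t1, t2):      # natural params (m/v, -1/(2v)) -> (mean, variance)
    v = -0.5 / t2
    return t1 * v, v

def mv_to_nat(m, v):
    return m / v, -0.5 / v

def fisher(m, v):
    """Fisher matrix = covariance of sufficient statistics (x, x^2) under N(m, v)."""
    return np.array([[v, 2.0 * m * v],
                     [2.0 * m * v, 2.0 * v**2 + 4.0 * m**2 * v]])

m_p, v_p = 1.5, 0.7                       # target distribution p
theta = np.array(mv_to_nat(0.0, 1.0))     # initialize q at N(0, 1)

# finite-difference gradient of KL(q || p) w.r.t. the natural parameters
eps = 1e-6
def loss(t):
    m, v = nat_to_mv(t[0], t[1])
    return kl_gauss(m, v, m_p, v_p)
g = np.array([(loss(theta + eps * e) - loss(theta - eps * e)) / (2.0 * eps)
              for e in np.eye(2)])

# natural-gradient step of length 1: jumps straight to the optimum
m, v = nat_to_mv(*theta)
theta_new = theta - np.linalg.solve(fisher(m, v), g)
m_new, v_new = nat_to_mv(*theta_new)
```

In the non-conjugate GP models of the paper the optimum is not reached in one step, but the same preconditioning, obtained via automatic differentiation in any parameterization, is what yields the reported wall-clock gains.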
Nested Variational Compression in Deep Gaussian Processes
This paper extends variational compression to allow approximate variational marginalization of the hidden variables, leading to a lower bound on the marginal likelihood of the model; the resulting approach can be easily parallelized or adapted for stochastic variational inference.
Chained Gaussian Processes
An approximate inference procedure for Chained Gaussian Processes is developed that is scalable and applicable to any factorized likelihood; the approximation is demonstrated on a range of likelihood functions.