Manifold Gaussian Processes for regression

@article{Calandra2016ManifoldGP,
  title={Manifold Gaussian Processes for regression},
  author={Roberto Calandra and Jan Peters and Carl Edward Rasmussen and Marc Peter Deisenroth},
  journal={2016 International Joint Conference on Neural Networks (IJCNN)},
  year={2016},
  pages={3338-3345}
}
Off-the-shelf Gaussian Process (GP) covariance functions encode smoothness assumptions on the structure of the function to be modeled. To model complex and non-differentiable functions, these smoothness assumptions are often too restrictive. One way to alleviate this limitation is to find a different representation of the data by introducing a feature space. This feature space is often learned in an unsupervised way, which might lead to data representations that are not useful for the overall… 
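
The idea is concrete enough to sketch in a few lines. Below is a minimal illustration, not the authors' implementation: a small tanh network serves as the feature map M(x), an RBF kernel is evaluated on the features M(x), and the network weights are optimized jointly with the GP hyperparameters by minimizing the negative log marginal likelihood. The layer sizes, the toy step-function data, and the optimizer choice are all assumptions made for this sketch.

```python
import numpy as np
from scipy.optimize import minimize

# Minimal Manifold-GP-style sketch (our own assumptions, not the paper's code):
# composed kernel k(x, x') = k_RBF(M(x), M(x')), trained jointly.
rng = np.random.default_rng(0)
X = np.linspace(-1.0, 1.0, 40)[:, None]                # inputs, shape (40, 1)
y = np.sign(X[:, 0]) + 0.05 * rng.standard_normal(40)  # non-smooth step target

def feature_map(X, W1, b1, W2, b2):
    """Deterministic two-layer tanh network M: R^1 -> R^2."""
    return np.tanh(np.tanh(X @ W1 + b1) @ W2 + b2)

def rbf(A, B, ell, sf2):
    """Squared-exponential kernel evaluated in feature space."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return sf2 * np.exp(-0.5 * d2 / ell ** 2)

def nlml(theta, X, y):
    """Negative log marginal likelihood of the composed model."""
    W1 = theta[:8].reshape(1, 8);   b1 = theta[8:16]
    W2 = theta[16:32].reshape(8, 2); b2 = theta[32:34]
    ell, sf2, sn2 = np.exp(theta[34:])        # log-parameterized, hence positive
    F = feature_map(X, W1, b1, W2, b2)
    K = rbf(F, F, ell, sf2) + (sn2 + 1e-6) * np.eye(len(y))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return 0.5 * y @ alpha + np.log(np.diag(L)).sum() + 0.5 * len(y) * np.log(2 * np.pi)

theta0 = 0.1 * rng.standard_normal(37)        # 34 network weights + 3 GP hypers
res = minimize(nlml, theta0, args=(X, y), method="L-BFGS-B")
print("optimized NLML:", res.fun)
```

On data like this step function, joint training typically learns a feature map that straightens out the discontinuity, so the GP in feature space only has to model a smooth function; an unsupervised feature learner has no such incentive, which is exactly the failure mode the abstract points at.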

Citations

Compressed Gaussian Process for Manifold Regression
TLDR
This work proposes an alternative approach relying on random compression of the feature vector combined with Gaussian process regression to solve the problem of nonparametric regression for large numbers of features.
Consistency of Gaussian Process Regression in Metric Spaces
TLDR
This paper provides an important step towards the theoretical legitimization of GP regression on manifolds and other non-Euclidean metric spaces.
High-Dimensional Bayesian Optimization with Manifold Gaussian Processes
TLDR
This work proposes a high-dimensional BO method that learns a nonlinear low-dimensional manifold of the input space with a multi-layer neural network embedded in the covariance function of a Gaussian process, and outperforms recent baselines from the high-dimensional BO literature on a set of benchmark functions in 60 dimensions.
Meta-Learning Priors for Efficient Online Bayesian Regression
TLDR
The proposed ALPaCA method is found to be a promising plug-in tool for many regression tasks in robotics where scalability and data efficiency are important, and it outperforms kernel-based GP regression as well as state-of-the-art meta-learning approaches.
Gaussian Process with Graph Convolutional Kernel for Relational Learning
TLDR
This work proposes a novel Graph Convolutional Kernel, which makes it possible to incorporate relational structure into feature-based kernels to capture the statistical structure of data, and achieves state-of-the-art performance in two relational learning tasks.
Deep Gaussian Covariance Network
TLDR
The basic framework and several possible extensions of the Deep Gaussian Covariance Network are presented, and a comparison with recent state-of-the-art surrogate-model methods is performed, including a time-dependent problem.
Transforming Gaussian Processes With Normalizing Flows
TLDR
A variational approximation to the resulting Bayesian inference problem is derived, which is as fast as stochastic variational GP regression and makes the model a computationally efficient alternative to other hierarchical extensions of GP priors.
Neural Processes
TLDR
This work introduces a class of neural latent variable models called Neural Processes (NPs), combining the best of both worlds: like GPs they are probabilistic, data-efficient and flexible, while avoiding the computational cost that limits GPs in their applicability.
How to Sim2Real with Gaussian Processes: Prior Mean versus Kernels as Priors
TLDR
It is argued that embedding prior knowledge into GP kernels instead provides a more flexible way to capture simulation-based information.
Deep Gaussian processes and variational propagation of uncertainty
TLDR
The results show that the developed variational methodologies improve practical applicability by enabling automatic capacity control in the models, even when data are scarce.

References

SHOWING 1-10 OF 34 REFERENCES
Warped Gaussian Processes
We generalise the Gaussian process (GP) framework for regression by learning a nonlinear transformation of the GP outputs. This allows for non-Gaussian processes and non-Gaussian noise (see the sketch after this reference list).
Variable Noise and Dimensionality Reduction for Sparse Gaussian processes
TLDR
Shortcomings of the sparse pseudo-input GP (SPGP) are addressed by performing automatic dimensionality reduction: a projection of the input space to a low-dimensional space is learned in a supervised manner, alongside the pseudo-inputs, which now live in this reduced space.
Input Warping for Bayesian Optimization of Non-Stationary Functions
TLDR
On a set of challenging benchmark optimization tasks, it is observed that the inclusion of warping greatly improves on the state-of-the-art, producing better results faster and more reliably.
Gaussian Process Dynamical Models for Human Motion
TLDR
This work marginalizes out the model parameters in closed form by using Gaussian process priors for both the dynamical and the observation mappings, resulting in a nonparametric model for dynamical systems that accounts for uncertainty in the model.
Deep Gaussian Processes
TLDR
Deep Gaussian process (GP) models are introduced, and model selection by the variational bound shows that a five-layer hierarchy is justified even when modelling a digit data set containing only 150 examples.
Gaussian Process Training with Input Noise
TLDR
This work presents a simple yet effective GP model for training on input points corrupted by i.i.d. Gaussian noise, compares it with other approaches over a range of regression problems, and shows that it improves on current methods.
Probabilistic Non-linear Principal Component Analysis with Gaussian Process Latent Variable Models
TLDR
A novel probabilistic interpretation of principal component analysis (PCA) is proposed, based on a Gaussian process latent variable model (GP-LVM) and related to popular spectral techniques such as kernel PCA and multidimensional scaling.
Bayesian Gaussian Process Latent Variable Model
TLDR
A variational inference framework is presented for training the Gaussian process latent variable model, thus performing Bayesian nonlinear dimensionality reduction; maximization of the variational lower bound provides a Bayesian training procedure that is robust to overfitting and can automatically select the dimensionality of the nonlinear latent space.
Gaussian Processes for Machine Learning
TLDR
The treatment is comprehensive and self-contained, targeted at researchers and students in machine learning and applied statistics, and deals with the supervised learning problem for both regression and classification.
Nonparametric guidance of autoencoder representations using label information
TLDR
This work proposes a nonparametric approach that uses a Gaussian process to guide the representation of a discriminative function, and demonstrates the superiority of this guidance mechanism on four data sets, including a real-world application to rehabilitation research.
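
As promised under the Warped Gaussian Processes entry above, here is a minimal sketch of the output-warping idea. The tanh-sum form of the monotone warp follows Snelson et al.; the parameter values and the monotonicity check are our own illustrative assumptions.

```python
import numpy as np

# Monotone output warp for a warped GP, assuming the tanh-sum form
# z = g(y) = y + sum_i a_i * tanh(b_i * (y + c_i)) with a_i, b_i > 0,
# which keeps g strictly increasing (hence invertible).
def warp(y, a, b, c):
    """Map observations y to the latent space z on which a standard GP is placed."""
    return y + np.sum(a * np.tanh(b * (y[:, None] + c)), axis=1)

def warp_deriv(y, a, b, c):
    """dg/dy: each observation contributes +log(dg/dy_i) to the log marginal
    likelihood (change-of-variables Jacobian), on top of the usual GP
    marginal likelihood evaluated on z = g(y)."""
    t = np.tanh(b * (y[:, None] + c))
    return 1.0 + np.sum(a * b * (1.0 - t ** 2), axis=1)

# Toy check that the warp is monotone for positive a, b (our own example values).
y = np.linspace(-3.0, 3.0, 7)
a, b, c = np.array([0.5, 0.3]), np.array([1.0, 2.0]), np.array([0.0, -1.0])
z = warp(y, a, b, c)
assert np.all(np.diff(z) > 0) and np.all(warp_deriv(y, a, b, c) > 0)
print(z)
```

In training, the positivity of a and b would typically be enforced by optimizing their logarithms, so the warped-GP objective reduces to the standard GP negative log marginal likelihood on z = g(y) minus the sum of log dg/dy terms.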