# Manifold Gaussian Processes for regression

```bibtex
@article{Calandra2016ManifoldGP,
  title   = {Manifold Gaussian Processes for regression},
  author  = {Roberto Calandra and Jan Peters and Carl Edward Rasmussen and Marc Peter Deisenroth},
  journal = {2016 International Joint Conference on Neural Networks (IJCNN)},
  year    = {2016},
  pages   = {3338-3345}
}
```

Off-the-shelf Gaussian Process (GP) covariance functions encode smoothness assumptions on the structure of the function to be modeled. To model complex and non-differentiable functions, these smoothness assumptions are often too restrictive. One way to alleviate this limitation is to find a different representation of the data by introducing a feature space. This feature space is often learned in an unsupervised way, which might lead to data representations that are not useful for the overall…
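The core idea — evaluating a standard GP kernel on the output of a learned feature map rather than on the raw inputs — can be illustrated with a minimal sketch. This is not the authors' implementation: the feature weights `W`, `b` below are fixed random values for illustration, whereas the paper learns them jointly with the GP hyperparameters by maximizing the marginal likelihood.

```python
import numpy as np

rng = np.random.default_rng(0)

def feature_map(X, W, b):
    # One-layer tanh network h(x): the learned "manifold" representation.
    return np.tanh(X @ W + b)

def se_kernel(A, B, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel, evaluated in feature space rather than input space.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_posterior(X_train, y_train, X_test, W, b, noise=1e-2):
    # Standard GP regression equations, with all kernels computed on h(x).
    H, Hs = feature_map(X_train, W, b), feature_map(X_test, W, b)
    K = se_kernel(H, H) + noise * np.eye(len(X_train))
    Ks = se_kernel(Hs, H)
    Kss = se_kernel(Hs, Hs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks @ alpha
    v = np.linalg.solve(L, Ks.T)
    cov = Kss - v.T @ v
    return mean, cov

# Toy non-smooth target: a step function, which a plain SE kernel on the
# raw inputs struggles to model.
X = np.linspace(-1, 1, 40)[:, None]
y = np.where(X[:, 0] > 0, 1.0, -1.0)
W = rng.normal(size=(1, 8))   # hypothetical fixed weights; the paper
b = rng.normal(size=8)        # trains them jointly with the GP
mean, cov = gp_posterior(X, y, X, W, b)
print(float(np.abs(mean - y).mean()))  # training-set fit error
```

Because the kernel acts on `h(x)`, a discontinuity in input space can become smooth in feature space, which is what lets the composed model capture non-differentiable functions with an off-the-shelf SE kernel.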


## 194 Citations

Compressed Gaussian Process for Manifold Regression

- Computer Science, J. Mach. Learn. Res.
- 2016

This work proposes an alternative approach relying on random compression of the feature vector combined with Gaussian process regression to solve the problem of nonparametric regression for large numbers of features.

Consistency of Gaussian Process Regression in Metric Spaces

- Mathematics, Computer Science, J. Mach. Learn. Res.
- 2021

This paper provides an important step towards the theoretical legitimization of GP regression on manifolds and other non-Euclidean metric spaces.

High-Dimensional Bayesian Optimization with Manifold Gaussian Processes

- Computer Science, ArXiv
- 2019

This work proposes a high-dimensional BO method that learns a nonlinear low-dimensional manifold of the input space with a multi-layer neural network embedded in the covariance function of a Gaussian process; it outperforms recent baselines from the high-dimensional BO literature on a set of benchmark functions in 60 dimensions.

Meta-Learning Priors for Efficient Online Bayesian Regression

- Computer Science, WAFR
- 2018

The proposed ALPaCA is found to be a promising plug-in tool for many regression tasks in robotics where scalability and data-efficiency are important, and outperforms kernel-based GP regression as well as state-of-the-art meta-learning approaches.

Gaussian Process with Graph Convolutional Kernel for Relational Learning

- Computer Science, KDD
- 2021

This work proposes a novel Graph Convolutional Kernel, which makes it possible to incorporate relational structures into feature-based kernels to capture the statistical structure of data, and achieves state-of-the-art performance in two relational learning tasks.

Deep Gaussian Covariance Network

- Computer Science, ArXiv
- 2017

The basic framework and some extension possibilities of the Deep Gaussian Covariance Network are presented, and a comparison to recent state-of-the-art surrogate-model methods is performed, including a time-dependent problem.

Transforming Gaussian Processes With Normalizing Flows

- Computer Science, AISTATS
- 2021

A variational approximation to the resulting Bayesian inference problem is derived, which is as fast as stochastic variational GP regression and makes the model a computationally efficient alternative to other hierarchical extensions of GP priors.

Neural Processes

- Computer Science, Biology, ArXiv
- 2018

This work introduces a class of neural latent variable models, called Neural Processes (NPs), that combine the best of both worlds: probabilistic, data-efficient, and flexible; however, they are also computationally intensive and thus limited in their applicability.

How to Sim2Real with Gaussian Processes: Prior Mean versus Kernels as Priors

- Computer Science
- 2020

It is argued that embedding prior knowledge into GP kernels instead provides a more flexible way to capture simulation-based information.

Deep Gaussian processes and variational propagation of uncertainty

- Computer Science
- 2015

The results show that the developed variational methodologies improve practical applicability by enabling automatic capacity control in the models, even when data are scarce.

## References

Showing 1-10 of 34 references.

Warped Gaussian Processes

- Computer Science, NIPS
- 2003

We generalise the Gaussian process (GP) framework for regression by learning a nonlinear transformation of the GP outputs. This allows for non-Gaussian processes and non-Gaussian noise. The learning…

Variable Noise and Dimensionality Reduction for Sparse Gaussian processes

- Computer Science, UAI
- 2006

The SPGP is extended by performing automatic dimensionality reduction: a projection of the input space to a low-dimensional space is learned in a supervised manner, alongside the pseudo-inputs, which now live in this reduced space.

Input Warping for Bayesian Optimization of Non-Stationary Functions

- Computer Science, ICML
- 2014

On a set of challenging benchmark optimization tasks, it is observed that the inclusion of warping greatly improves on the state-of-the-art, producing better results faster and more reliably.

Gaussian Process Dynamical Models for Human Motion

- Computer Science, IEEE Transactions on Pattern Analysis and Machine Intelligence
- 2008

This work marginalizes out the model parameters in closed form by using Gaussian process priors for both the dynamical and the observation mappings, which results in a nonparametric model for dynamical systems that accounts for uncertainty in the model.

Deep Gaussian Processes

- Computer Science, AISTATS
- 2013

Deep Gaussian process (GP) models are introduced, and model selection by the variational bound shows that a five-layer hierarchy is justified even when modelling a digit data set containing only 150 examples.

Gaussian Process Training with Input Noise

- Computer Science, NIPS
- 2011

This work presents a simple yet effective GP model for training on input points corrupted by i.i.d. Gaussian noise, compares it to other approaches over a range of regression problems, and shows that it improves on current methods.

Probabilistic Non-linear Principal Component Analysis with Gaussian Process Latent Variable Models

- Computer Science, J. Mach. Learn. Res.
- 2005

A novel probabilistic interpretation of principal component analysis (PCA) is presented, based on a Gaussian process latent variable model (GP-LVM) and related to popular spectral techniques such as kernel PCA and multidimensional scaling.

Bayesian Gaussian Process Latent Variable Model

- Computer Science, AISTATS
- 2010

A variational inference framework is presented for training the Gaussian process latent variable model and thus performing Bayesian nonlinear dimensionality reduction; maximization of the variational lower bound yields a Bayesian training procedure that is robust to overfitting and can automatically select the dimensionality of the nonlinear latent space.

Gaussian Processes for Machine Learning

- Computer Science, Adaptive Computation and Machine Learning
- 2009

The treatment is comprehensive and self-contained, targeted at researchers and students in machine learning and applied statistics, and addresses the supervised learning problem for both regression and classification.

Nonparametric guidance of autoencoder representations using label information

- Computer Science, J. Mach. Learn. Res.
- 2012

This work proposes a nonparametric approach that uses a Gaussian process to guide the representation of a discriminative function, and demonstrates the superiority of this guidance mechanism on four data sets, including a real-world application to rehabilitation research.