Variational Inference for Uncertainty on the Inputs of Gaussian Process Models
@article{Damianou2014VariationalIF,
  title   = {Variational Inference for Uncertainty on the Inputs of Gaussian Process Models},
  author  = {Andreas C. Damianou and Michalis K. Titsias and Neil D. Lawrence},
  journal = {ArXiv},
  year    = {2014},
  volume  = {abs/1409.2287}
}
The Gaussian process latent variable model (GP-LVM) provides a flexible approach for non-linear dimensionality reduction that has been widely applied. However, the current approach for training GP-LVMs is based on maximum likelihood, where the latent projection variables are maximized over rather than integrated out. In this paper we present a Bayesian method for training GP-LVMs by introducing a non-standard variational inference framework that allows us to approximately integrate out the latent…
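The contrast the abstract draws — maximizing over the latent inputs versus integrating them out — can be sketched numerically. The numpy snippet below (all names illustrative) compares a plug-in point estimate of X with a simple Monte Carlo average of the likelihood over a Gaussian q(X). The paper instead derives a deterministic variational lower bound, so this is only an illustration of the quantity being approximated, not the method itself.

```python
import numpy as np

def rbf_kernel(X, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel: K[i, j] = var * exp(-||x_i - x_j||^2 / (2 l^2))
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_log_marginal(Y, X, noise=0.1):
    # Sum over the D output columns of log N(y_d | 0, K + noise * I)
    N, D = Y.shape
    K = rbf_kernel(X) + noise * np.eye(N)
    L = np.linalg.cholesky(K)
    A = np.linalg.solve(L, Y)                       # whitened targets, shape (N, D)
    logdet = 2.0 * np.log(np.diag(L)).sum()
    return -0.5 * (A**2).sum() - 0.5 * D * (logdet + N * np.log(2 * np.pi))

rng = np.random.default_rng(0)
N, Q, D = 20, 2, 3
X_map = rng.normal(size=(N, Q))                     # a point estimate of the latent inputs
Y = np.sin(X_map) @ rng.normal(size=(Q, D)) + 0.05 * rng.normal(size=(N, D))

# Maximum-likelihood treatment: plug the single point estimate into the likelihood.
ml_plugin = gp_log_marginal(Y, X_map)

# "Integrating out" X (illustrative Monte Carlo only): average the likelihood
# over samples from a Gaussian q(X) centred on the point estimate.
logs = np.array([gp_log_marginal(Y, X_map + 0.1 * rng.normal(size=(N, Q)))
                 for _ in range(50)])
m = logs.max()
log_py = m + np.log(np.exp(logs - m).mean())        # log-mean-exp for stability
```

The log-mean-exp trick on the last two lines keeps the average numerically stable, since the individual log-likelihoods are large negative numbers whose direct exponentials would underflow.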
25 Citations
Modulating Scalable Gaussian Processes for Expressive Statistical Learning
- Computer Science, Pattern Recognit.
- 2021
Dynamical Gaussian Process Latent Variable Model for Representation Learning from Longitudinal Data
- Computer Science, FODS
- 2020
This work describes an effective approach to learning the parameters of L-GPLVM from sparse observations, by coupling the dynamical model with a Multitask Gaussian Process model for sampling of the missing observations at each step of the gradient-based optimization of the variational lower bound.
Mitigating the Effects of Non-Identifiability on Inference for Bayesian Neural Networks with Latent Variables
- Computer Science
- 2019
A novel inference procedure is developed that explicitly mitigates the effects of likelihood non-identifiability during training and yields high-quality predictions as well as uncertainty estimates and improves upon benchmark methods across a range of synthetic and real data-sets.
On the use of bootstrap with variational inference: Theory, interpretation, and a two-sample test example
- Mathematics, The Annals of Applied Statistics
- 2018
Variational inference is a general approach for approximating complex density functions, such as those arising in latent variable models, popular in machine learning. It has been applied to…
Learning Deep Bayesian Latent Variable Regression Models that Generalize: When Non-identifiability is a Problem
- Computer Science, ArXiv
- 2019
This work develops a novel inference procedure that explicitly mitigates the effects of likelihood non-identifiability during training and yields high quality predictions as well as uncertainty estimates, and improves upon benchmark methods across a range of synthetic and real datasets.
Variational Dependent Multi-output Gaussian Process Dynamical Systems
- Computer Science, J. Mach. Learn. Res.
- 2014
The proposed model is shown to be superior for modeling dynamical systems, owing to its more reasonable assumptions and fully Bayesian learning framework, and can be flexibly extended to handle regression problems.
The Dynamical Gaussian Process Latent Variable Model in the Longitudinal Scenario
- Computer Science, ArXiv
- 2019
This study approaches the inference of Gaussian process dynamical systems in the longitudinal scenario by augmenting the bound in the variational approximation to include systematic samples of the unseen observations, and demonstrates the usefulness of this approach on synthetic data as well as a human motion capture data set.
On Sparse Variational Methods and the Kullback-Leibler Divergence between Stochastic Processes
- Computer Science, AISTATS
- 2016
A substantial generalization of the variational framework for learning inducing variables is given, together with a new proof of the result for infinite index sets that allows inducing points that are not data points and likelihoods that depend on all function values.
A review on Gaussian Process Latent Variable Models
- Computer Science, CAAI Trans. Intell. Technol.
- 2016
References
Showing 1-10 of 73 references
Distributed Variational Inference in Sparse Gaussian Process Regression and Latent Variable Models
- Computer Science, NIPS
- 2014
A novel re-parametrisation of variational inference for sparse GP regression and latent variable models that allows for an efficient distributed algorithm and shows that GPs perform better than many common models often used for big data.
Deep Gaussian Processes
- Computer Science, AISTATS
- 2013
Deep Gaussian process (GP) models are introduced and model selection by the variational bound shows that a five layer hierarchy is justified even when modelling a digit data set containing only 150 examples.
Probabilistic Non-linear Principal Component Analysis with Gaussian Process Latent Variable Models
- Computer Science, J. Mach. Learn. Res.
- 2005
A novel probabilistic interpretation of principal component analysis (PCA) is proposed that is based on a Gaussian process latent variable model (GP-LVM) and is related to popular spectral techniques such as kernel PCA and multidimensional scaling.
Sparse Gaussian Processes using Pseudo-inputs
- Computer Science, NIPS
- 2005
It is shown that this new Gaussian process (GP) regression model can match full GP performance with small M, i.e. very sparse solutions, and it significantly outperforms other approaches in this regime.
Gaussian processes: iterative sparse approximations
- Computer Science
- 2002
This thesis proposes a two-step solution to construct a probabilistic approximation to the posterior of Gaussian processes, and combines the sparse approximation with an extension to the Bayesian online algorithm that allows multiple iterations for each input, thus approximating a batch solution.
Manifold Relevance Determination
- Computer Science, ICML
- 2012
This paper presents a fully Bayesian latent variable model which exploits conditional nonlinear (in)dependence structures to learn an efficient latent representation, and introduces a relaxation of the discrete segmentation that allows for a "softly" shared latent space.
Gaussian Process Dynamical Models
- Computer Science, NIPS
- 2005
This paper marginalizes out the model parameters in closed form, using Gaussian process (GP) priors for both the dynamics and the observation mappings, resulting in a nonparametric model for dynamical systems that accounts for uncertainty in the model.
Gaussian Process Training with Input Noise
- Computer Science, NIPS
- 2011
This work presents a simple yet effective GP model for training on input points corrupted by i.i.d. Gaussian noise, and compares it to others over a range of different regression problems and shows that it improves over current methods.
Variational Learning of Inducing Variables in Sparse Gaussian Processes
- Computer Science, AISTATS
- 2009
A variational formulation for sparse approximations that jointly infers the inducing inputs and the kernel hyperparameters by maximizing a lower bound of the true log marginal likelihood.
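This collapsed lower bound has a compact closed form, which the numpy sketch below reproduces for a one-dimensional-output regression (function names and the data-generating setup are illustrative). With Q_nn = K_nm K_mm^{-1} K_mn, the bound is log N(y | 0, Q_nn + noise*I) minus a trace penalty, and it never exceeds the exact log marginal likelihood for any choice of inducing inputs Z.

```python
import numpy as np

def rbf(A, B, ls=1.0, var=1.0):
    # Squared-exponential cross-covariance between row-vector input sets A and B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return var * np.exp(-0.5 * d2 / ls**2)

def log_gauss(y, K):
    # log N(y | 0, K) via a Cholesky factorization
    N = len(y)
    L = np.linalg.cholesky(K)
    a = np.linalg.solve(L, y)
    return -0.5 * (a @ a) - np.log(np.diag(L)).sum() - 0.5 * N * np.log(2 * np.pi)

def titsias_bound(y, X, Z, noise):
    # Collapsed variational lower bound (Titsias, 2009):
    #   F = log N(y | 0, Q_nn + noise*I) - trace(K_nn - Q_nn) / (2 * noise)
    Kmm = rbf(Z, Z) + 1e-8 * np.eye(len(Z))         # jitter for numerical stability
    Knm = rbf(X, Z)
    Qnn = Knm @ np.linalg.solve(Kmm, Knm.T)
    Knn_diag = rbf(X, X).diagonal()                 # only the diagonal enters the trace
    bound = log_gauss(y, Qnn + noise * np.eye(len(y)))
    bound -= (Knn_diag.sum() - np.trace(Qnn)) / (2 * noise)
    return bound

rng = np.random.default_rng(1)
X = np.sort(rng.uniform(-3, 3, size=(30, 1)), axis=0)
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=30)
Z = X[::5]                                          # 6 inducing inputs taken from the data
noise = 0.01

exact = log_gauss(y, rbf(X, X) + noise * np.eye(30))
lower = titsias_bound(y, X, Z, noise)
# `lower` is guaranteed not to exceed `exact`; maximizing it over Z and the
# kernel hyperparameters tightens the sparse approximation.
```

Because the trace penalty vanishes exactly when Q_nn = K_nn, maximizing the bound drives the inducing inputs toward configurations where the sparse approximation is faithful, which is what distinguishes this formulation from earlier heuristic pseudo-input selection.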
Fast Variational Inference for Gaussian Process Models Through KL-Correction
- Computer Science, ECML
- 2006
A recently suggested variational approach for approximate inference in Gaussian process (GP) models is reviewed, and it is shown how convergence may be dramatically improved through the use of a positive correction term added to the standard variational bound.