We introduce a variational formulation for sparse approximations that jointly infers the inducing inputs and the kernel hyperparameters by maximizing a lower bound of the true log marginal likelihood.
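For context, the bound referred to here has a well-known collapsed form (sketched below assuming a Gaussian likelihood with noise variance $\sigma^2$, $n$ training points, and $m$ inducing inputs $\mathbf{X}_m$, with $\mathbf{Q}_{nn} = \mathbf{K}_{nm}\mathbf{K}_{mm}^{-1}\mathbf{K}_{mn}$ the Nyström approximation to the full kernel matrix $\mathbf{K}_{nn}$):

```latex
F_V(\mathbf{X}_m, \theta)
  = \log \mathcal{N}\!\left(\mathbf{y} \,\middle|\, \mathbf{0},\; \sigma^2 \mathbf{I} + \mathbf{Q}_{nn}\right)
  \;-\; \frac{1}{2\sigma^2}\,\operatorname{tr}\!\left(\mathbf{K}_{nn} - \mathbf{Q}_{nn}\right)
```

Maximizing $F_V$ jointly over the inducing inputs $\mathbf{X}_m$ and hyperparameters $\theta$ tightens the bound on the log marginal likelihood, with the trace term penalizing inducing-input configurations that approximate the kernel poorly.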

We introduce a variational inference framework for training the Gaussian process latent variable model and thus performing Bayesian nonlinear dimensionality reduction.

We present a non-standard variational approximation that allows accurate inference in heteroscedastic GPs (i.e., under input-dependent noise conditions).

We propose a simple and effective variational inference algorithm based on stochastic optimisation that can be widely applied for Bayesian non-conjugate inference in continuous parameter spaces.
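Not this paper's algorithm verbatim, but a minimal sketch of the underlying idea: stochastic gradient ascent on an ELBO using reparameterized Monte Carlo samples. Here a Gaussian $q(z)$ is fitted to a toy Gaussian target standing in for a non-conjugate posterior; all names and settings are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def dlogp_dz(z):
    # Gradient of a toy log target density, N(3, 1), standing in for an
    # intractable non-conjugate posterior.
    return -(z - 3.0)

# Variational family q(z) = N(mu, exp(log_s)^2)
mu, log_s = 0.0, 0.0
lr, n_samples = 0.05, 32

for step in range(2000):
    s = np.exp(log_s)
    eps = rng.standard_normal(n_samples)
    z = mu + s * eps                         # reparameterized samples
    g = dlogp_dz(z)
    grad_mu = g.mean()                       # d/dmu of E_q[log p(z)]
    grad_log_s = (g * eps * s).mean() + 1.0  # + d/dlog_s of the Gaussian entropy
    mu += lr * grad_mu                       # stochastic ascent on the ELBO
    log_s += lr * grad_log_s

print(mu, np.exp(log_s))
```

Because the samples are written as a deterministic transform of parameter-free noise, low-variance gradient estimates flow through the Monte Carlo expectation, which is what makes this style of stochastic optimisation broadly applicable in continuous parameter spaces.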

We present a fully Bayesian latent variable model which exploits conditional nonlinear (in)dependence structures to learn an efficient latent representation of multiple views of the data.

SAMHD1 is a deoxynucleoside triphosphate triphosphohydrolase and a nuclease that restricts HIV-1 in noncycling cells. Germ-line mutations in SAMHD1 have been described in patients with…

In this paper, we introduce the generalized reparameterization gradient, a method that extends the reparameterization gradient to a wider class of variational distributions that weakly depend on the variational parameters.

We introduce a variational Gaussian process dynamical system that performs nonlinear dimensionality reduction jointly with learning a dynamical prior in the latent space, allowing the latent dimensionality to be automatically determined.