Publications
Variational Learning of Inducing Variables in Sparse Gaussian Processes
TLDR
A variational formulation for sparse approximations that jointly infers the inducing inputs and the kernel hyperparameters by maximizing a lower bound of the true log marginal likelihood.
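For context, the bound this summary refers to is usually written in the following collapsed form in the sparse-GP literature (a sketch; the notation Z, K_nm, Q_nn is assumed here, not taken from the listing):

    \mathcal{F}(Z, \theta) = \log \mathcal{N}\!\left(\mathbf{y} \mid \mathbf{0},\; Q_{nn} + \sigma^2 I\right) - \frac{1}{2\sigma^2}\operatorname{tr}\!\left(K_{nn} - Q_{nn}\right), \qquad Q_{nn} = K_{nm} K_{mm}^{-1} K_{mn}

Maximizing \mathcal{F} jointly over the inducing inputs Z and hyperparameters \theta tightens the bound; the trace term penalizes inducing sets that summarize the training inputs poorly.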
Bayesian Gaussian Process Latent Variable Model
TLDR
A variational inference framework for training the Gaussian process latent variable model, enabling Bayesian nonlinear dimensionality reduction; maximizing the variational lower bound gives a Bayesian training procedure that is robust to overfitting and can automatically select the dimensionality of the nonlinear latent space.
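A sketch of the setup, with assumed notation: the latent inputs X receive a standard normal prior and the GP mapping uses an ARD kernel, so maximizing the bound can switch off irrelevant latent dimensions:

    p(X) = \prod_{n=1}^{N} \mathcal{N}(\mathbf{x}_n \mid \mathbf{0}, I_Q), \qquad k(\mathbf{x}, \mathbf{x}') = \sigma_f^2 \exp\!\Big(-\tfrac{1}{2}\sum_{q=1}^{Q} \alpha_q (x_q - x'_q)^2\Big)

Dimensions whose ARD weights \alpha_q are driven to zero drop out of the kernel, which is the mechanism behind the automatic dimensionality selection.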
Variational Heteroscedastic Gaussian Process Regression
TLDR
This work presents a non-standard variational approximation that allows accurate inference in heteroscedastic GPs (i.e., under input-dependent noise), and illustrates its effectiveness on several synthetic and real datasets of diverse characteristics.
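The heteroscedastic model in question can be sketched as two GPs, one for the mean and one for the log noise variance (notation assumed):

    y_n = f(\mathbf{x}_n) + \varepsilon_n, \qquad \varepsilon_n \sim \mathcal{N}\!\big(0,\, e^{g(\mathbf{x}_n)}\big), \qquad f \sim \mathcal{GP}(0, k_f), \quad g \sim \mathcal{GP}(\mu_0, k_g)

Because the noise variance depends on the latent function g, exact inference is intractable, which is what motivates the non-standard variational approximation.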
Doubly Stochastic Variational Bayes for non-Conjugate Inference
TLDR
Proposes a simple and effective variational inference algorithm, based on stochastic optimisation, that can be applied widely to Bayesian non-conjugate inference in continuous parameter spaces and makes efficient use of gradient information from the model joint density.
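A minimal sketch of one such doubly stochastic update for a fully factorized Gaussian q(theta) = N(mu, sigma^2); the function grad_log_joint and the step size are illustrative assumptions, not the paper's code:

    import numpy as np

    def dsvb_step(mu, log_sigma, grad_log_joint, lr=1e-3):
        """One doubly stochastic variational update for q(theta) = N(mu, sigma^2).

        Illustrative sketch: stochasticity enters twice, through the Monte
        Carlo sample from q and through the stochastic gradient step itself.
        """
        sigma = np.exp(log_sigma)
        eps = np.random.randn(*mu.shape)        # standard normal noise
        theta = mu + sigma * eps                # reparameterized sample from q
        g = grad_log_joint(theta)               # gradient of log p(x, theta) at the sample
        grad_mu = g                             # unbiased estimate of dELBO/dmu
        grad_log_sigma = g * eps * sigma + 1.0  # chain rule plus entropy gradient
        return mu + lr * grad_mu, log_sigma + lr * grad_log_sigma

Iterating this update performs stochastic gradient ascent on the evidence lower bound using only evaluations of the gradient of the model joint density.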
Spike and Slab Variational Inference for Multi-Task and Multiple Kernel Learning
TLDR
Introduces a variational Bayesian inference algorithm, based on the spike-and-slab prior (the gold standard for sparse inference), that can be widely applied to sparse linear models.
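The spike-and-slab prior referred to places, per weight, a point mass at zero mixed with a Gaussian slab (standard notation assumed):

    p(w_j) = (1 - \pi)\, \delta_0(w_j) + \pi\, \mathcal{N}(w_j \mid 0, \sigma_w^2)

The binary spike indicators make exact posterior inference combinatorial, which is what the variational algorithm is designed to handle.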
SAMHD1 is mutated recurrently in chronic lymphocytic leukemia and is involved in response to DNA damage.
TLDR
Evidence is provided that SAMHD1 regulates cell proliferation and survival and engages in specific protein interactions in response to DNA damage, and that the presence of SAMHD1 mutations in CLL promotes leukemia development.
Manifold Relevance Determination
TLDR
This paper presents a fully Bayesian latent variable model that exploits conditional nonlinear (in)dependence structures to learn an efficient latent representation, and introduces a relaxation of the discrete segmentation to allow for a "softly" shared latent space.
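Schematically, and with assumed notation, each data view v gets its own ARD weights over a common latent space, so a latent dimension can be relevant to one view, both, or neither:

    k^{(v)}(\mathbf{x}, \mathbf{x}') = \sigma_v^2 \exp\!\Big(-\tfrac{1}{2}\sum_{q} \alpha_q^{(v)} (x_q - x'_q)^2\Big)

Continuous weights \alpha_q^{(v)} replace a hard shared/private split, which is the "softly" shared latent space the summary mentions.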
Variational Model Selection for Sparse Gaussian Process Regression
TLDR
A variational formulation for sparse approximations that jointly infers the inducing inputs and the kernel hyperparameters by maximizing a lower bound of the true log marginal likelihood.
The Generalized Reparameterization Gradient
TLDR
The generalized reparameterization gradient is introduced, a method that extends the reparameterization gradient to a wider class of variational distributions and yields new Monte Carlo gradients that combine reparameterization gradients and score function gradients.
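Schematically (assumed notation): for an invertible standardization z = T(\epsilon; \theta) whose residual noise distribution still depends weakly on \theta, the gradient splits into a reparameterization term plus a score-function correction:

    \nabla_\theta\, \mathbb{E}_{q(z;\theta)}[f(z)] = \mathbb{E}\big[\nabla_\theta f(T(\epsilon;\theta))\big] + \mathbb{E}\big[f(T(\epsilon;\theta))\, \nabla_\theta \log q(\epsilon;\theta)\big]

When the standardization removes the dependence on \theta entirely, the correction term vanishes and the standard reparameterization gradient is recovered.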
Variational Gaussian Process Dynamical Systems
TLDR
This work builds on recent variational approximations for Gaussian process latent variable models to allow for nonlinear dimensionality reduction simultaneously with learning a dynamical prior in the latent space.
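A sketch of the construction (notation assumed): the i.i.d. prior over the latent points of the GP latent variable model is replaced by a temporal GP prior, so nearby time points share correlated latent coordinates:

    x_q(t) \sim \mathcal{GP}(0, k_x(t, t')), \qquad \mathbf{y}_n = f(\mathbf{x}_n) + \boldsymbol{\epsilon}_n, \quad f \sim \mathcal{GP}(0, k_f)

The variational machinery of the Bayesian GP-LVM then carries over, with q(X) respecting the dynamical prior.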
...