• Publications
Variational Learning of Inducing Variables in Sparse Gaussian Processes
  • M. Titsias
  • Mathematics, Computer Science
  • 15 April 2009
We introduce a variational formulation for sparse approximations that jointly infers the inducing inputs and the kernel hyperparameters by maximizing a lower bound of the true log marginal likelihood.
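A minimal NumPy sketch of the collapsed variational lower bound described above, F = log N(y | 0, Q_nn + σ²I) − tr(K_nn − Q_nn)/(2σ²) with Q_nn = K_nm K_mm⁻¹ K_mn. The function names and the RBF kernel choice are illustrative, not from the paper:

```python
import numpy as np

def rbf(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential (RBF) kernel matrix."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def titsias_bound(X, y, Z, noise_var=0.1, jitter=1e-8):
    """Collapsed variational lower bound on the log marginal likelihood,
    evaluated at inducing inputs Z (shape (m, d))."""
    n = X.shape[0]
    Knn_diag = np.ones(n)                     # RBF diagonal: variance = 1
    Kmm = rbf(Z, Z) + jitter * np.eye(Z.shape[0])
    Knm = rbf(X, Z)
    L = np.linalg.cholesky(Kmm)
    A = np.linalg.solve(L, Knm.T)             # Qnn = A.T @ A = Knm Kmm^{-1} Kmn
    Qnn = A.T @ A
    cov = Qnn + noise_var * np.eye(n)
    _, logdet = np.linalg.slogdet(cov)
    quad = y @ np.linalg.solve(cov, y)
    log_gauss = -0.5 * (n * np.log(2 * np.pi) + logdet + quad)
    trace_term = (Knn_diag.sum() - np.trace(Qnn)) / (2 * noise_var)
    return log_gauss - trace_term
```

When Z equals the training inputs, Q_nn = K_nn, the trace penalty vanishes, and the bound recovers the exact log marginal likelihood; with fewer inducing points it is a strict lower bound, which is what makes Z and the hyperparameters jointly optimizable.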
Bayesian Gaussian Process Latent Variable Model
We introduce a variational inference framework for training the Gaussian process latent variable model and thus performing Bayesian nonlinear dimensionality reduction.
Variational Heteroscedastic Gaussian Process Regression
We present a non-standard variational approximation that allows accurate inference in heteroscedastic GPs (i.e., under input-dependent noise conditions).
Doubly Stochastic Variational Bayes for non-Conjugate Inference
We propose a simple and effective variational inference algorithm based on stochastic optimisation that can be widely applied for Bayesian non-conjugate inference in continuous parameter spaces.
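The core building block of such stochastic variational schemes is a Monte Carlo pathwise (reparameterization) gradient of an expectation under the variational distribution. A hedged sketch for a Gaussian q with unit variance (function names and the example integrand are mine, not the paper's):

```python
import numpy as np

def reparam_grad(mu, df, n_samples=200_000, seed=0):
    """Pathwise Monte Carlo estimate of d/d mu E_{theta ~ N(mu, 1)}[f(theta)].

    Reparameterize theta = mu + eps with eps ~ N(0, 1); then the gradient
    is E[f'(mu + eps)], which we estimate by averaging df over samples.
    """
    rng = np.random.default_rng(seed)
    theta = mu + rng.normal(size=n_samples)
    return df(theta).mean()
```

For f(θ) = θ², the exact gradient is 2μ, and the estimator concentrates around that value. The "doubly" stochastic aspect of the paper's scheme comes from combining this Monte Carlo gradient with a second source of randomness, subsampling of the data, inside a stochastic optimiser.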
Spike and Slab Variational Inference for Multi-Task and Multiple Kernel Learning
We introduce a variational Bayesian inference algorithm based on the spike and slab prior which can be widely applied to sparse linear models.
Manifold Relevance Determination
We present a fully Bayesian latent variable model which exploits conditional nonlinear (in)dependence structures to learn an efficient latent representation of multiple views of the data.
SAMHD1 is mutated recurrently in chronic lymphocytic leukemia and is involved in response to DNA damage.
SAMHD1 is a deoxynucleoside triphosphate triphosphohydrolase and a nuclease that restricts HIV-1 in noncycling cells. Germ-line mutations in SAMHD1 have been described in patients with…
The Generalized Reparameterization Gradient
In this paper, we introduce the generalized reparameterization gradient, a method that extends the reparameterization gradient to a wider class of variational distributions that weakly depend on the variational parameters.
Variational Gaussian Process Dynamical Systems
We introduce a variational Gaussian process dynamical system that performs nonlinear dimensionality reduction while simultaneously learning a dynamical prior in the latent space, with the latent dimensionality determined automatically.
The Infinite Gamma-Poisson Feature Model
  • M. Titsias
  • Computer Science, Mathematics
  • NIPS
  • 3 December 2007
We present a probability distribution over non-negative integer-valued matrices with a potentially infinite number of columns.
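A hedged sketch of a finite truncation of such a gamma-Poisson matrix prior: each column gets a rate drawn from a Gamma distribution whose shape shrinks as the truncation grows, so the expected total count stays bounded and only finitely many columns are non-empty in expectation (the paper works with the infinite limit; the parameterization and names here are illustrative):

```python
import numpy as np

def sample_gamma_poisson_matrix(n_rows, n_cols, alpha=2.0, seed=0):
    """Draw a finite truncation of a gamma-Poisson feature matrix.

    Column rates: lam_k ~ Gamma(alpha / n_cols, 1), so E[sum_k lam_k] = alpha
    regardless of the truncation level n_cols.
    Entries:      z[n, k] ~ Poisson(lam_k), non-negative integer counts.
    """
    rng = np.random.default_rng(seed)
    lam = rng.gamma(alpha / n_cols, 1.0, size=n_cols)
    return rng.poisson(lam, size=(n_rows, n_cols))
```

Increasing `n_cols` adds columns but not expected mass, which is the intuition behind letting the number of columns go to infinity.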