Publications
Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning
TLDR
A new theoretical framework is developed casting dropout training in deep neural networks (NNs) as approximate Bayesian inference in deep Gaussian processes, which mitigates the problem of representing uncertainty in deep learning without sacrificing either computational complexity or test accuracy.
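A minimal sketch of the test-time procedure this framework justifies: keep dropout active at prediction time and average many stochastic forward passes. The toy two-layer network, its random weights, and the input below are invented for illustration.

```python
# Monte Carlo dropout sketch: dropout stays ON at test time and the
# spread of the stochastic predictions is read as model uncertainty.
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(20, 4)), np.zeros(20)   # hypothetical hidden layer
W2, b2 = rng.normal(size=(1, 20)), np.zeros(1)    # hypothetical output layer
p_drop = 0.5

def stochastic_forward(x):
    h = np.maximum(0.0, W1 @ x + b1)              # ReLU hidden units
    mask = rng.binomial(1, 1.0 - p_drop, size=h.shape)
    h = h * mask / (1.0 - p_drop)                 # dropout applied at test time
    return W2 @ h + b2

x = rng.normal(size=4)
samples = np.array([stochastic_forward(x) for _ in range(100)])
print("predictive mean:", samples.mean(), "uncertainty (std):", samples.std())
```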
Semi-Supervised Learning Using Gaussian Fields and Harmonic Functions
TLDR
An approach to semi-supervised learning is proposed that is based on a Gaussian random field model, and methods to incorporate class priors and the predictions of classifiers obtained by supervised learning are discussed.
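The harmonic-function solution at the heart of this approach has a closed form in terms of the graph Laplacian. A hedged toy sketch, with the affinity matrix and the two labelled nodes made up:

```python
# Harmonic solution on a 5-node path graph: unlabelled values satisfy
# f_u = L_uu^{-1} W_ul f_l, i.e. each node averages its neighbours.
import numpy as np

W = np.array([[0,1,0,0,0],
              [1,0,1,0,0],
              [0,1,0,1,0],
              [0,0,1,0,1],
              [0,0,0,1,0]], dtype=float)   # invented affinities
D = np.diag(W.sum(axis=1))
L = D - W                                  # graph Laplacian

labelled, unlabelled = [0, 4], [1, 2, 3]
f_l = np.array([0.0, 1.0])                 # known labels at the endpoints

L_uu = L[np.ix_(unlabelled, unlabelled)]
W_ul = W[np.ix_(unlabelled, labelled)]
f_u = np.linalg.solve(L_uu, W_ul @ f_l)
print(f_u)                                 # [0.25 0.5 0.75]: smooth interpolation
```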
Infinite latent feature models and the Indian buffet process
We define a probability distribution over equivalence classes of binary matrices with a finite number of rows and an unbounded number of columns. This distribution is suitable for use as a prior in …
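One way to draw from this distribution is the sequential "customers and dishes" scheme that gives the process its name. A minimal sketch, with the concentration parameter alpha and the number of rows chosen arbitrarily:

```python
# Indian buffet process sampler: row i takes existing column k with
# probability m_k / i, then opens Poisson(alpha / i) new columns.
import numpy as np

def sample_ibp(n_rows, alpha, rng):
    dish_counts = []                       # times each column has been chosen
    rows = []
    for i in range(1, n_rows + 1):
        row = [rng.random() < m / i for m in dish_counts]
        for k, taken in enumerate(row):
            dish_counts[k] += taken
        new = rng.poisson(alpha / i)       # brand-new columns for this row
        row += [True] * new
        dish_counts += [1] * new
        rows.append(row)
    Z = np.zeros((n_rows, len(dish_counts)), dtype=int)
    for i, row in enumerate(rows):
        Z[i, :len(row)] = row
    return Z

print(sample_ibp(6, alpha=2.0, rng=np.random.default_rng(1)))
```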
Sparse Gaussian Processes using Pseudo-inputs
TLDR
It is shown that this new Gaussian process (GP) regression model can match full GP performance with a small number M of pseudo-inputs, i.e. very sparse solutions, and that it significantly outperforms other approaches in this regime.
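A simplified sketch of the pseudo-input idea, using a deterministic-training-conditional style predictive mean rather than the paper's full FITC likelihood; the RBF kernel, data, and noise level below are invented:

```python
# GP regression routed through M << N pseudo-inputs: only M x M systems
# are solved, so the cost is O(N M^2) instead of O(N^3).
import numpy as np

def rbf(A, B, ell=1.0):
    return np.exp(-0.5 * (A[:, None] - B[None, :]) ** 2 / ell ** 2)

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=200)           # N = 200 training inputs
y = np.sin(X) + 0.1 * rng.normal(size=200)
Xm = np.linspace(-3, 3, 10)                # M = 10 pseudo-inputs
noise = 0.01

Kmm = rbf(Xm, Xm) + 1e-8 * np.eye(len(Xm))
Kmn = rbf(Xm, X)
A = noise * Kmm + Kmn @ Kmn.T              # M x M system replaces the N x N one
Xs = np.array([0.5])
mu = rbf(Xs, Xm) @ np.linalg.solve(A, Kmn @ y)
print("approx GP mean at 0.5:", mu, "vs sin(0.5):", np.sin(0.5))
```

In the paper the pseudo-input locations themselves are optimised by gradient ascent on the marginal likelihood; here they are simply placed on a grid.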
An Introduction to Variational Methods for Graphical Models
TLDR
This paper presents a tutorial introduction to the use of variational methods for inference and learning in graphical models (Bayesian networks and Markov random fields), and describes a general framework for generating variational transformations based on convex duality.
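As a toy instance of the variational idea the tutorial covers, a fully factorised (mean-field) approximation to a pair of coupled binary variables; the coupling J and fields h1, h2 are chosen arbitrarily:

```python
# Mean-field coordinate ascent for p(s1, s2) ∝ exp(J*s1*s2 + h1*s1 + h2*s2),
# s_i in {0, 1}: each factor's mean is updated given the other's.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

J, h1, h2 = 2.0, -0.5, 0.5
mu1, mu2 = 0.5, 0.5                     # q(s_i = 1) under the factorised q
for _ in range(50):                     # coordinate-ascent updates on the ELBO
    mu1 = sigmoid(h1 + J * mu2)
    mu2 = sigmoid(h2 + J * mu1)

# exact marginals for comparison (brute force over the 4 joint states)
states = [(a, b) for a in (0, 1) for b in (0, 1)]
w = np.array([np.exp(J*a*b + h1*a + h2*b) for a, b in states])
p = w / w.sum()
print("mean-field:", mu1, mu2)
print("exact     :", p[2] + p[3], p[1] + p[3])
```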
Factorial Hidden Markov Models
TLDR
A generalization of hidden Markov models (HMMs) is presented in which the hidden state is factored into multiple state variables and is therefore represented in a distributed manner, together with a structured approximation in which the state variables are decoupled, yielding a tractable algorithm for learning the parameters of the model.
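A minimal generative sketch of the distributed-state idea, assuming M independent binary chains and a linear-Gaussian emission; the transition matrix and output weights are invented:

```python
# Factorial HMM: the hidden state is M separate Markov chains, and the
# observation at each time step depends on all of them jointly.
import numpy as np

rng = np.random.default_rng(0)
M, T = 3, 10                               # 3 binary chains, 10 time steps
trans = np.array([[0.9, 0.1],
                  [0.1, 0.9]])             # shared 2-state transition matrix
w = np.array([1.0, 2.0, 4.0])              # each chain's contribution to y_t

s = np.zeros((M, T), dtype=int)
for m in range(M):
    for t in range(1, T):
        s[m, t] = rng.random() < trans[s[m, t-1], 1]
y = w @ s + 0.1 * rng.normal(size=T)       # emission mixes all M chains
print("distributed state:\n", s, "\nobservations:", y)
```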
Learning from labeled and unlabeled data with label propagation
TLDR
A simple iterative algorithm is proposed that propagates labels through the dataset along high-density areas defined by the unlabeled data; its solution and its connections to several other algorithms are analyzed.
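A hedged sketch of the propagate-and-clamp iteration on a toy graph; the affinity matrix and the single labelled node per class are invented:

```python
# Label propagation: push labels along edges, then re-clamp the
# labelled nodes, until the unlabelled predictions stabilise.
import numpy as np

W = np.array([[0,1,1,0,0,0],
              [1,0,1,0,0,0],
              [1,1,0,1,0,0],
              [0,0,1,0,1,1],
              [0,0,0,1,0,1],
              [0,0,0,1,1,0]], dtype=float)   # two triangles joined by one edge
P = W / W.sum(axis=1, keepdims=True)         # row-normalised transition matrix

Y = np.zeros((6, 2))                         # one-hot scores for 2 classes
Y[0, 0] = 1.0                                # node 0 labelled class 0
Y[5, 1] = 1.0                                # node 5 labelled class 1
labelled = [0, 5]

F = Y.copy()
for _ in range(100):
    F = P @ F                                # propagate labels along edges
    F[labelled] = Y[labelled]                # clamp the labelled nodes
print(F.argmax(axis=1))                      # -> [0 0 0 1 1 1]
```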
A Theoretically Grounded Application of Dropout in Recurrent Neural Networks
TLDR
This work applies a new variational-inference-based dropout technique in LSTM and GRU models which outperforms existing techniques and, to the best of the authors' knowledge, improves on the single-model state of the art in language modelling on the Penn Treebank.
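The key mechanical change is sampling one dropout mask per sequence and reusing it at every time step, rather than resampling per step. A sketch in a plain numpy RNN with hypothetical weights and sizes (the paper applies the same idea inside LSTM and GRU gates):

```python
# Variational RNN dropout: fixed input and recurrent masks, shared
# across all time steps of a sequence.
import numpy as np

rng = np.random.default_rng(0)
H, D, T = 8, 4, 5
Wx = rng.normal(size=(H, D)) * 0.3           # invented input weights
Wh = rng.normal(size=(H, H)) * 0.3           # invented recurrent weights
p_drop = 0.25

xs = rng.normal(size=(T, D))
# one mask for the inputs and one for the hidden state, fixed over time
mx = rng.binomial(1, 1 - p_drop, size=D) / (1 - p_drop)
mh = rng.binomial(1, 1 - p_drop, size=H) / (1 - p_drop)

h = np.zeros(H)
for t in range(T):
    h = np.tanh(Wx @ (xs[t] * mx) + Wh @ (h * mh))   # same masks every step
print(h)
```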
Kronecker Graphs: An Approach to Modeling Networks
TLDR
It is rigorously proved that Kronecker graphs naturally obey common network properties, and KRONFIT, a fast and scalable algorithm for fitting the Kronecker graph generation model to large real networks, is presented.
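The generative core is repeated Kronecker multiplication of a small initiator adjacency matrix with itself. A minimal deterministic sketch, with an arbitrary 2x2 initiator:

```python
# Deterministic Kronecker graph growth: each power multiplies the node
# count by the initiator size, producing self-similar structure.
import numpy as np

K1 = np.array([[1, 1],
               [1, 0]])          # invented initiator graph (with a self-loop)
G = K1.copy()
for _ in range(3):               # three Kronecker powers -> 16 x 16 adjacency
    G = np.kron(G, K1)
print(G.shape, "edges:", int(G.sum()))
```

KRONFIT fits a probabilistic variant of this construction, where the initiator holds edge probabilities rather than 0/1 entries.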
Deep Bayesian Active Learning with Image Data
TLDR
This paper develops an active learning framework for high-dimensional data, a task that has so far been extremely challenging, with very sparse existing literature, and demonstrates its active learning techniques on image data, obtaining a significant improvement over existing active learning approaches.
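A hedged sketch of one acquisition rule used in this line of work: score each unlabelled point by the predictive entropy of its MC-dropout class probabilities and query the most uncertain ones. The probability samples below are faked stand-ins for real stochastic forward passes:

```python
# Entropy-based acquisition over MC-dropout samples: average the T
# stochastic softmax outputs, then rank points by predictive entropy.
import numpy as np

rng = np.random.default_rng(0)
T, N, C = 20, 100, 10                  # 20 dropout samples, 100 points, 10 classes
probs = rng.dirichlet(np.ones(C), size=(T, N))    # fake stochastic passes

p_mean = probs.mean(axis=0)                       # average over dropout samples
entropy = -(p_mean * np.log(p_mean + 1e-12)).sum(axis=1)
query = entropy.argsort()[::-1][:5]               # 5 most uncertain points
print("indices to label next:", query)
```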