Publications (sorted by Influence)
Semi-Supervised Learning Using Gaussian Fields and Harmonic Functions
An approach to semi-supervised learning is proposed that is based on a Gaussian random field model. Labeled and unlabeled data are represented as vertices in a weighted graph, with edge weights …
  • Citations: 3,360
  • Influence: 485
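The graph construction this abstract describes admits a short harmonic solve; a minimal sketch, assuming the standard Laplacian form (unlabeled values satisfy Lf = 0, so f_u = L_uu⁻¹(−L_ul f_l)) and illustrative 1-D toy points with RBF edge weights that are not from the paper:

```python
import numpy as np

def harmonic_solution(W, f_l, labeled):
    """W: (n, n) symmetric edge weights; f_l: values at the labeled points;
    labeled: boolean mask. Returns values at the unlabeled points."""
    L = np.diag(W.sum(axis=1)) - W                 # graph Laplacian L = D - W
    u = ~labeled
    # Harmonic property: L_uu f_u = -L_ul f_l
    return np.linalg.solve(L[np.ix_(u, u)], -L[np.ix_(u, labeled)] @ f_l)

# Toy graph: four 1-D points, endpoints labeled 0 and 1 (illustrative only).
x = np.array([0.0, 0.1, 0.9, 1.0])
W = np.exp(-(x[:, None] - x[None, :]) ** 2 / 0.08)  # RBF affinities
labeled = np.array([True, False, False, True])
f_u = harmonic_solution(W, np.array([0.0, 1.0]), labeled)
# f_u interpolates smoothly: the point near 0.1 stays close to label 0,
# the point near 0.9 close to label 1.
```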
Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning
Deep learning tools have gained tremendous attention in applied machine learning. However, such tools for regression and classification do not capture model uncertainty. In comparison, Bayesian models …
  • Citations: 2,229
  • Influence: 401
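The mechanism this abstract points at, keeping dropout active at test time and averaging many stochastic forward passes, can be sketched in a few lines. The one-hidden-layer net and its random weights below are untrained placeholders chosen for illustration, not a trained model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder weights for a tiny 1-input regression net (assumption: random,
# untrained; only the Monte Carlo dropout mechanism is being illustrated).
W1 = rng.normal(size=(1, 50))
W2 = rng.normal(size=(50, 1)) / np.sqrt(50)

def forward(x, p_drop=0.5):
    """One stochastic forward pass with dropout left ON."""
    h = np.maximum(0.0, x @ W1)              # ReLU hidden layer
    mask = rng.random(h.shape) > p_drop      # fresh Bernoulli dropout mask
    return (h * mask / (1.0 - p_drop)) @ W2  # inverted-dropout scaling

x = np.array([[0.3]])
samples = np.stack([forward(x) for _ in range(200)])
mean = samples.mean(axis=0)   # Monte Carlo predictive mean
std = samples.std(axis=0)     # spread across passes ~ an uncertainty estimate
```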
Infinite latent feature models and the Indian buffet process
We define a probability distribution over equivalence classes of binary matrices with a finite number of rows and an unbounded number of columns. This distribution is suitable for use as a prior in …
  • Citations: 746
  • Influence: 186
Factorial Hidden Markov Models
Hidden Markov models (HMMs) have proven to be one of the most widely used tools for learning probabilistic models of time series data. In an HMM, information about the past is conveyed through a …
  • Citations: 1,147
  • Influence: 158
Sparse Gaussian Processes using Pseudo-inputs
We present a new Gaussian process (GP) regression model whose covariance is parameterized by the locations of M pseudo-input points, which we learn by a gradient-based optimization. We take M ≪ …
  • Citations: 1,217
  • Influence: 155
An Introduction to Variational Methods for Graphical Models
This paper presents a tutorial introduction to the use of variational methods for inference and learning in graphical models (Bayesian networks and Markov random fields). We present a number of …
  • Citations: 2,236
  • Influence: 135
Learning from labeled and unlabeled data with label propagation
We investigate the use of unlabeled data to help labeled data in classification. We propose a simple iterative algorithm, label propagation, to propagate labels through the dataset along high …
  • Citations: 1,165
  • Influence: 130
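The iterative algorithm this abstract names, pushing labels along high-weight edges while clamping the labeled points, can be sketched as follows. The toy 1-D points and RBF affinities are illustrative assumptions, not the paper's experimental setup:

```python
import numpy as np

def label_propagation(W, Y, labeled, n_iter=200):
    """W: (n, n) symmetric edge-weight matrix; Y: (n, k) label matrix,
    one-hot on labeled rows, zero elsewhere; labeled: boolean mask."""
    T = W / W.sum(axis=1, keepdims=True)   # row-normalized transition matrix
    F = Y.astype(float).copy()
    for _ in range(n_iter):
        F = T @ F                          # propagate labels along edges
        F[labeled] = Y[labeled]            # clamp the labeled points each step
    return F.argmax(axis=1)

# Toy example: four 1-D points, the two endpoints labeled.
x = np.array([0.0, 0.1, 0.9, 1.0])
W = np.exp(-(x[:, None] - x[None, :]) ** 2 / 0.08)  # RBF affinities
Y = np.zeros((4, 2))
Y[0, 0] = 1.0   # point 0 labeled class 0
Y[3, 1] = 1.0   # point 3 labeled class 1
labeled = np.array([True, False, False, True])
pred = label_propagation(W, Y, labeled)   # unlabeled points inherit the nearer label
```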
Kronecker Graphs: An Approach to Modeling Networks
How can we generate realistic networks? In addition, how can we do so with a mathematically tractable model that allows for rigorous analysis of network properties? Real networks exhibit a long list …
  • Citations: 804
  • Influence: 114
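The generator family this abstract refers to can be sketched with a stochastic Kronecker construction: Kronecker-power a small initiator matrix of edge probabilities, then sample each potential edge independently. The 2×2 initiator values below are illustrative, not fitted parameters:

```python
import numpy as np

def kronecker_graph(initiator, k, rng):
    """Sample an adjacency matrix from the k-th Kronecker power of a small
    initiator matrix of edge probabilities."""
    P = initiator.copy()
    for _ in range(k - 1):
        P = np.kron(P, initiator)             # probabilities over 2**k nodes
    return (rng.random(P.shape) < P).astype(int)  # one coin flip per edge

rng = np.random.default_rng(0)
theta = np.array([[0.9, 0.5],
                  [0.5, 0.2]])                # illustrative 2x2 initiator
A = kronecker_graph(theta, k=4, rng=rng)      # 16-node adjacency matrix
```

Because the probability matrix is a Kronecker power, hub-like structure in the initiator recurs at every scale of the sampled graph.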
A Theoretically Grounded Application of Dropout in Recurrent Neural Networks
Recurrent neural networks (RNNs) stand at the forefront of many recent developments in deep learning. Yet a major difficulty with these models is their tendency to overfit, with dropout shown to fail …
  • Citations: 1,000
  • Influence: 107