Publications
Stochastic variational inference
TLDR
Stochastic variational inference makes it possible to apply complex Bayesian models to massive data sets, and the Bayesian nonparametric topic model is shown to outperform its parametric counterpart.
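The core mechanic is following noisy natural gradients estimated from minibatches with a decaying step size. A minimal sketch on a toy conjugate Gamma-Poisson model (my own illustrative example, chosen so the exact posterior is available for comparison; it is not the topic-model setting of the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: counts from a Poisson(4.0). The Gamma prior on the rate is conjugate,
# so the exact posterior is Gamma(a0 + sum(x), b0 + N) and we can check the result.
N = 100_000
x = rng.poisson(4.0, size=N)
a0, b0 = 1.0, 1.0                    # Gamma(shape, rate) prior

# Stochastic variational inference: noisy natural-gradient steps from minibatches,
# with a Robbins-Monro step-size schedule.
a, b = a0, b0                        # global variational parameters
batch_size, tau, kappa = 100, 1.0, 0.7
for t in range(1, 2001):
    batch = x[rng.choice(N, size=batch_size, replace=False)]
    # "Intermediate" parameters: the update we would make if the whole data set
    # looked like this minibatch (hence the N / batch_size rescaling).
    a_hat = a0 + (N / batch_size) * batch.sum()
    b_hat = b0 + N
    rho = (tau + t) ** (-kappa)      # step size decays to satisfy Robbins-Monro conditions
    a = (1 - rho) * a + rho * a_hat
    b = (1 - rho) * b + rho * b_hat

print("SVI posterior mean of the rate:", a / b)                    # close to 4.0
print("Exact posterior mean:          ", (a0 + x.sum()) / (b0 + N))
```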
Collaborative topic modeling for recommending scientific articles
TLDR
An algorithm is developed to recommend scientific articles to users of an online community; it combines the merits of traditional collaborative filtering and probabilistic topic modeling and can form recommendations about both existing and newly published articles.
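The recommendation mechanism can be sketched as follows: an article's latent vector is its topic proportions plus a collaborative offset learned from observed ratings, and predictions for newly published articles fall back on the topic proportions alone. A minimal illustration with made-up numbers (variable names and values are assumptions, not taken from the paper):

```python
import numpy as np

def predict_rating(u_i, theta_j, eps_j=None):
    """Predicted preference of user i for article j in a CTR-style model (sketch).

    theta_j: topic proportions of the article (from a topic model).
    eps_j:   collaborative offset learned from existing ratings;
             None for a newly published article with no ratings yet.
    """
    v_j = theta_j if eps_j is None else theta_j + eps_j
    return float(u_i @ v_j)

# Toy example with K = 3 latent topics (illustrative numbers only).
u_i     = np.array([0.8, 0.1, 0.1])      # user's latent preferences
theta_j = np.array([0.6, 0.3, 0.1])      # content-based part of the article
eps_j   = np.array([0.05, -0.02, 0.0])   # offset learned from observed ratings

print(predict_rating(u_i, theta_j, eps_j))   # existing article
print(predict_rating(u_i, theta_j))          # newly published article
```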
Reading Tea Leaves: How Humans Interpret Topic Models
TLDR
New quantitative methods for measuring semantic meaning in inferred topics are presented, showing that they capture aspects of the model that are undetected by previous measures of model quality based on held-out likelihood.
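One of the proposed measures is a word-intrusion task: annotators see a topic's top words plus one out-of-place "intruder" word drawn from a different topic, and the fraction who spot the intruder measures how interpretable the topic is. A rough sketch of how such an instance might be constructed (the function and word lists are illustrative, not the paper's materials):

```python
import random

def word_intrusion_instance(topic_top_words, other_topic_top_words, seed=0):
    """Build one word-intrusion question (illustrative sketch)."""
    rng = random.Random(seed)
    # The intruder is a high-probability word from a *different* topic,
    # so it should look out of place among the topic being judged.
    intruder = rng.choice([w for w in other_topic_top_words if w not in topic_top_words])
    words = topic_top_words[:5] + [intruder]
    rng.shuffle(words)
    return words, intruder

words, intruder = word_intrusion_instance(
    ["game", "team", "season", "player", "coach", "league"],
    ["court", "judge", "trial", "lawyer", "evidence", "jury"],
)
print(words)      # shown to annotators
print(intruder)   # the fraction of annotators who pick this word
                  # gives the topic's "model precision"
```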
Simultaneous image classification and annotation
TLDR
A new probabilistic model is developed for jointly modeling an image, its class label, and its annotations, together with approximate inference and parameter estimation algorithms based on variational methods and efficient approximations for classifying and annotating new images.
Online Variational Inference for the Hierarchical Dirichlet Process
TLDR
This work proposes an online variational inference algorithm for the hierarchical Dirichlet process (HDP) that is easily applicable to massive and streaming data and makes it possible to analyze much larger data sets.
Asymptotically Exact, Embarrassingly Parallel MCMC
TLDR
This paper presents a parallel Markov chain Monte Carlo (MCMC) algorithm in which subsets of data are processed independently with very little communication, proves that it generates asymptotically exact samples, and empirically demonstrates its ability to parallelize burn-in and sampling in several models.
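After each machine has drawn samples from its subposterior, the draws must be combined into samples from (an approximation of) the full posterior. A sketch of the simplest parametric strategy, which approximates each subposterior as a Gaussian and takes their product (my own toy implementation, assuming a parameter dimension of at least two):

```python
import numpy as np

def combine_subposteriors(sub_samples):
    """Gaussian combination of subposterior samples (sketch).

    sub_samples: list of arrays, one per machine, each of shape (num_samples, dim),
    holding MCMC draws from that machine's subposterior. Each subposterior is
    approximated by a Gaussian; their product is again Gaussian, with summed
    precisions and a precision-weighted mean.
    """
    precisions, weighted_means = [], []
    for s in sub_samples:
        mu = s.mean(axis=0)
        prec = np.linalg.inv(np.cov(s, rowvar=False))
        precisions.append(prec)
        weighted_means.append(prec @ mu)
    cov = np.linalg.inv(sum(precisions))
    mean = cov @ sum(weighted_means)
    return mean, cov

# Toy check: three "machines", each with 2-D Gaussian "subposterior" draws.
rng = np.random.default_rng(0)
subs = [rng.normal(loc=m, scale=0.5, size=(5000, 2))
        for m in ([0.0, 1.0], [0.2, 0.9], [-0.1, 1.1])]
mean, cov = combine_subposteriors(subs)
print(mean)   # precision-weighted combination of the three subposterior means
```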
The IBP Compound Dirichlet Process and its Application to Focused Topic Modeling
TLDR
The IBP compound Dirichlet process (ICD) is developed: a Bayesian nonparametric prior that decouples across-data prevalence and within-data proportion in a mixed membership model, with the resulting focused topic model shown to outperform the HDP-based topic model.
Jointly modeling aspects, ratings and sentiments for movie recommendation (JMARS)
TLDR
A probabilistic model based on collaborative filtering and topic modeling is proposed that captures the interest distribution of users and the content distribution of movies; it provides a link between interest and relevance on a per-aspect basis and differentiates between positive and negative sentiments, also per aspect.
TopicRNN: A Recurrent Neural Network with Long-Range Semantic Dependency
In this paper, we propose TopicRNN, a recurrent neural network (RNN)-based language model designed to directly capture the global semantic meaning relating words in a document via latent topics.
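The mechanism can be sketched as an output layer in which the RNN state supplies the local (syntactic) prediction and a document-level topic vector adds a global semantic bias that is gated off for stop words. A numpy toy sketch with assumed shapes and random weights (not the paper's actual parameterization or inference network):

```python
import numpy as np

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def next_word_probs(h_t, theta, W, B, is_stop_word):
    """Next-word distribution in a TopicRNN-style model (sketch).

    The RNN state h_t drives the local part of the prediction; the document's
    latent topic vector theta adds a long-range semantic bias, switched off
    for stop words, which carry little topical content.
    """
    logits = W @ h_t
    if not is_stop_word:
        logits = logits + B @ theta
    return softmax(logits)

# Toy dimensions: vocabulary of 6 words, hidden size 4, 3 topics; random weights.
rng = np.random.default_rng(0)
W = rng.normal(size=(6, 4))
B = rng.normal(size=(6, 3))
h_t = rng.normal(size=4)
theta = np.array([0.7, 0.2, 0.1])

print(next_word_probs(h_t, theta, W, B, is_stop_word=False))
print(next_word_probs(h_t, theta, W, B, is_stop_word=True))
```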
Continuous Time Dynamic Topic Models
TLDR
An efficient variational approximate inference algorithm is derived that takes advantage of the sparsity of observations in text, a property that lets us easily handle many time points.