
- David M. Blei, Andrew Y. Ng, Michael I. Jordan
- Journal of Machine Learning Research
- 2003

We propose a generative model for text and other collections of discrete data that generalizes or improves on several previous models including naive Bayes/unigram, mixture of unigrams [6], and Hofmann's aspect model, also known as probabilistic latent semantic indexing (pLSI) [3]. In the context of text modeling, our model posits that each document is…
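The generative view this abstract describes (documents as mixtures over latent topics) can be sketched in a few lines of NumPy. The sizes and Dirichlet hyperparameters below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes and hyperparameters (not from the paper).
K, V, doc_len = 3, 8, 20      # topics, vocabulary size, words per document
alpha, eta = 0.5, 0.1         # Dirichlet hyperparameters

# Topics: each a distribution over the vocabulary, shape (K, V).
beta = rng.dirichlet(np.full(V, eta), size=K)

def generate_doc():
    """Draw one document from the LDA-style generative process."""
    theta = rng.dirichlet(np.full(K, alpha))        # per-document topic mix
    z = rng.choice(K, size=doc_len, p=theta)        # topic assignment per word
    w = np.array([rng.choice(V, p=beta[k]) for k in z])  # observed words
    return theta, z, w

theta, z, w = generate_doc()
```

Inference in the paper runs this process in reverse, recovering `beta` and the per-document `theta` from observed words alone.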

We consider problems involving groups of data, where each observation within a group is a draw from a mixture model, and where it is desirable to share mixture components between groups. We assume that the number of mixture components is unknown a priori and is to be inferred from the data. In this setting it is natural to consider sets of Dirichlet…
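The Dirichlet-process machinery this abstract leads into can be sketched with the stick-breaking construction of mixture weights; the concentration parameter and truncation level below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

alpha, trunc = 2.0, 50   # concentration and truncation level (illustrative)

# Stick-breaking: v_k ~ Beta(1, alpha); pi_k = v_k * prod_{j<k} (1 - v_j).
# Each weight takes a Beta-distributed fraction of the stick that remains.
v = rng.beta(1.0, alpha, size=trunc)
remaining = np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
pi = v * remaining
```

Because the number of non-negligible weights is random, the number of mixture components need not be fixed in advance, which is the property the abstract highlights.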

Observations consisting of measurements on relationships for pairs of objects arise in many settings, such as protein interaction and gene regulatory networks, collections of author-recipient email, and social networks. Analyzing such data with probabilistic models can be delicate because the simple exchangeability assumptions underlying many boilerplate…

- David M. Blei, Jon D. McAuliffe
- NIPS
- 2007

We introduce supervised latent Dirichlet allocation (sLDA), a statistical model of labelled documents. The model accommodates a variety of response types. We derive a maximum-likelihood procedure for parameter estimation, which relies on variational approximations to handle intractable posterior expectations. Prediction problems motivate this research: we…

- David M. Blei, Michael I. Jordan
- SIGIR
- 2003

We consider the problem of modeling annotated data---data with multiple types where the instance of one type (such as a caption) serves as a description of the other type (such as an image). We describe three hierarchical probabilistic mixture models which aim to describe such data, culminating in *correspondence latent Dirichlet allocation*, a latent…

- David M. Blei, John D. Lafferty
- ICML
- 2006

A family of probabilistic time series models is developed to analyze the time evolution of topics in large document collections. The approach is to use state space models on the natural parameters of the multinomial distributions that represent the topics. Variational approximations based on Kalman filters and nonparametric wavelet regression are developed…
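The state-space idea in this abstract (topics drifting over time in natural-parameter space) can be sketched as a Gaussian random walk mapped through a softmax; the vocabulary size, number of time steps, and drift scale below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

V, T, sigma = 6, 5, 0.1   # vocab size, time steps, drift scale (illustrative)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Natural parameters evolve as a random walk:
#   beta_t | beta_{t-1} ~ N(beta_{t-1}, sigma^2 I)
beta = np.zeros((T, V))
for t in range(1, T):
    beta[t] = beta[t - 1] + sigma * rng.normal(size=V)

# Mapping through the softmax yields a word distribution per time step,
# so each topic changes smoothly from one epoch to the next.
topics = np.array([softmax(b) for b in beta])
```

Working in natural-parameter space is what lets the paper apply Kalman-filter-style variational approximations, since the latent dynamics are linear-Gaussian.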

- David M. Blei, Lawrence Carin, David B. Dunson
- IEEE Signal Processing Magazine
- 2010

Probabilistic topic modeling provides a suite of tools for the unsupervised analysis of large collections of documents. Topic modeling algorithms can uncover the underlying themes of a collection and decompose its documents according to those themes. This analysis can be used for corpus exploration, document search, and a variety of prediction problems. In…

We address the problem of learning topic hierarchies from data. The model selection problem in this domain is daunting—which of the large collection of possible trees to use? We take a Bayesian approach, generating an appropriate prior via a distribution on partitions that we refer to as the nested Chinese restaurant process. This nonparametric prior allows…

- Kobus Barnard, Pinar Duygulu Sahin, David A. Forsyth, Nando de Freitas, David M. Blei, Michael I. Jordan
- Journal of Machine Learning Research
- 2003

We present a new approach for modeling multi-modal data sets, focusing on the specific case of segmented images with associated text. Learning the joint distribution of image regions and words has many applications. We consider in detail predicting words associated with whole images (auto-annotation) and corresponding to particular image regions (region…

- Matthew D. Hoffman, David M. Blei, Francis R. Bach
- NIPS
- 2010

We develop an online variational Bayes (VB) algorithm for Latent Dirichlet Allocation (LDA). Online LDA is based on online stochastic optimization with a natural gradient step, which we show converges to a local optimum of the VB objective function. It can handily analyze massive document collections, including those arriving in a stream. We study the…
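The stochastic-optimization step this abstract describes can be sketched as blending a noisy per-minibatch estimate into the global variational parameter under a Robbins-Monro step-size schedule. The schedule constants and parameter shapes below are illustrative, and the minibatch estimate is stubbed rather than computed from real documents.

```python
import numpy as np

# Robbins-Monro schedule: rho_t = (tau0 + t) ** (-kappa), with
# kappa in (0.5, 1] so the step sizes satisfy the convergence conditions.
tau0, kappa = 10.0, 0.7

def step_size(t):
    return (tau0 + t) ** (-kappa)

# Global variational parameter (K x V) and a stubbed per-minibatch
# estimate standing in for the full variational E-step on a minibatch.
lam = np.ones((2, 5))
lam_hat = np.full((2, 5), 3.0)

for t in range(100):
    rho = step_size(t)
    # Blend the minibatch estimate into the global parameter.
    lam = (1.0 - rho) * lam + rho * lam_hat
```

Because the step sizes decay but their sum diverges, `lam` converges toward the target of the noisy estimates, which is how the algorithm can process documents arriving in a stream without revisiting them.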