- Honglak Lee, Roger B. Grosse, Rajesh Ranganath, Andrew Y. Ng
- ICML
- 2009

There has been much interest in unsupervised learning of hierarchical generative models such as deep belief networks. Scaling such models to full-sized, high-dimensional images remains a difficult problem. To address this problem, we present the *convolutional deep belief network*, a hierarchical generative model which scales to realistic image sizes. …

- Rajesh Ranganath, Sean Gerrish, David M. Blei
- AISTATS
- 2014

Variational inference has become a widely used method to approximate posteriors in complex latent variable models. However, deriving a variational inference algorithm generally requires significant model-specific analysis. These efforts can hinder and deter us from quickly developing and exploring a variety of models for a problem at hand. In this paper, …
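The model-agnostic recipe this abstract alludes to is black box variational inference, whose core is the score-function gradient of the ELBO: the gradient is estimated from samples of the variational distribution without model-specific derivations. Below is a minimal sketch under assumed toy choices (a one-dimensional Gaussian target and a Gaussian variational family; all names and hyperparameters are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def log_p(z):
    # Unnormalized toy target: a Gaussian with mean 3, std 1.
    return -0.5 * (z - 3.0) ** 2

def log_q(z, mu, log_sigma):
    sigma = np.exp(log_sigma)
    return -0.5 * ((z - mu) / sigma) ** 2 - log_sigma

def grad_log_q(z, mu, log_sigma):
    # Score function: gradient of log q with respect to (mu, log_sigma).
    sigma = np.exp(log_sigma)
    d_mu = (z - mu) / sigma ** 2
    d_log_sigma = ((z - mu) / sigma) ** 2 - 1.0
    return d_mu, d_log_sigma

mu, log_sigma = 0.0, 0.0
lr, n_samples = 0.05, 200
for step in range(2000):
    z = mu + np.exp(log_sigma) * rng.standard_normal(n_samples)
    # ELBO integrand; normalizing constants cancel in the gradient estimator.
    f = log_p(z) - log_q(z, mu, log_sigma)
    # Unbiased score-function gradient: E_q[grad log q * (log p - log q)].
    d_mu, d_ls = grad_log_q(z, mu, log_sigma)
    mu += lr * np.mean(d_mu * f)
    log_sigma += lr * np.mean(d_ls * f)

print(mu, np.exp(log_sigma))  # should settle near the target's (3.0, 1.0)
```

Only `log_p` and the ability to sample from and differentiate `log q` are needed, which is what makes the approach "black box": swapping in a different model changes only `log_p`.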

- Honglak Lee, Roger B. Grosse, Rajesh Ranganath, Andrew Y. Ng
- Commun. ACM
- 2011

There has been much interest in unsupervised learning of hierarchical generative models such as deep belief networks (DBNs); however, scaling such models to full-sized, high-dimensional images remains a difficult problem. To address this problem, we present the *convolutional deep belief network*, a hierarchical generative model that scales to …

- Dustin Tran, Rajesh Ranganath, David M. Blei
- ArXiv
- 2015

Variational inference is a powerful tool for approximate inference, and it has recently been applied to representation learning with deep generative models. We develop the variational Gaussian process (VGP), a Bayesian nonparametric variational family, which adapts its shape to match complex posterior distributions. The VGP generates approximate posterior …

- Rajesh Ranganath, Dustin Tran, Jaan Altosaar, David M. Blei
- NIPS
- 2016

Variational inference is an umbrella term for algorithms which cast Bayesian inference as optimization. Classically, variational inference uses the Kullback-Leibler divergence to define the optimization. Though this divergence has been widely used, the resultant posterior approximation can suffer from undesirable statistical properties. To address this, we …
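The "undesirable statistical properties" of the classical KL(q‖p) objective can be seen numerically: against a bimodal target, the best Gaussian approximation locks onto a single mode and understates the posterior's spread. A small illustrative sketch (the mixture target and grid search below are assumptions for the demo, not from the paper):

```python
import numpy as np

# Evaluation grid for numerical integration of the KL divergence.
z = np.linspace(-10, 10, 4001)
dz = z[1] - z[0]

def normal_pdf(z, mu, sigma):
    return np.exp(-0.5 * ((z - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Bimodal target: mixture of two well-separated unit-variance Gaussians.
p = 0.5 * normal_pdf(z, -3, 1) + 0.5 * normal_pdf(z, 3, 1)

def kl_q_p(mu, sigma):
    # KL(q || p) = integral of q * log(q / p), approximated on the grid.
    q = normal_pdf(z, mu, sigma)
    mask = q > 1e-12
    return np.sum(q[mask] * (np.log(q[mask]) - np.log(p[mask])) * dz)

# Brute-force search for the best Gaussian under KL(q || p).
grid_mu = np.linspace(-4, 4, 81)
grid_sigma = np.linspace(0.5, 5, 46)
_, mu_star, sigma_star = min(
    (kl_q_p(m, s), m, s) for m in grid_mu for s in grid_sigma
)
print(mu_star, sigma_star)  # one mode (mu near ±3), sigma near 1
```

The target's overall standard deviation is about 3.2, yet the KL-optimal Gaussian reports roughly 1: the mode-seeking behavior of KL(q‖p) is exactly the kind of statistical property alternative divergences aim to avoid.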

- Rajesh Ranganath, Linpeng Tang, Laurent Charlin, David M. Blei
- AISTATS
- 2015

We describe deep exponential families (DEFs), a class of latent variable models that are inspired by the hidden structures used in deep neural networks. DEFs capture a hierarchy of dependencies between latent variables, and are easily generalized to many settings through exponential families. We perform inference using recent "black box" variational …
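One concrete instance of this family is a sparse gamma DEF: each layer's latent variables are gamma-distributed with mean set by a weighted combination of the layer above, and observations are counts. A minimal ancestral-sampling sketch (the layer sizes, shape parameters, and Poisson observation link below are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
layer_sizes = [2, 5, 20]  # top layer down to the observed dimension

# Nonnegative weights connecting each layer to the one below it.
weights = [rng.gamma(0.5, 1.0, size=(layer_sizes[i], layer_sizes[i + 1]))
           for i in range(len(layer_sizes) - 1)]

# Top layer: independent gamma draws.
z = rng.gamma(shape=0.3, scale=1.0, size=layer_sizes[0])

# Each lower layer is gamma with mean given by the layer above.
for W in weights:
    mean = z @ W
    z = rng.gamma(shape=0.3, scale=mean / 0.3)  # scale chosen so E[z] = mean

# Observations: Poisson counts with the bottom layer as rates.
x = rng.poisson(z)
print(x.shape)  # (20,)
```

The small shape parameter (0.3) makes each layer sparse, so higher layers act like a hierarchy of soft "feature switches" over the layers below.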

- James McInerney, Rajesh Ranganath, David M. Blei
- NIPS
- 2015

Many modern data analysis problems involve inference from streaming data. However, streaming data is not easily amenable to the standard probabilistic modeling approaches, which require conditioning on finite data. We develop population variational Bayes, a new approach for using Bayesian modeling to analyze streams of data. It approximates a new type of …

- Rajesh Ranganath, Daniel Jurafsky, Daniel A. McFarland
- EMNLP
- 2009

Automatically detecting human social intentions from spoken conversation is an important task for dialogue understanding. Since the social intentions of the speaker may differ from what is perceived by the hearer, systems that analyze human conversations need to be able to extract both the perceived and the intended social meaning. We investigate this …

- Rajesh Ranganath, Dustin Tran, David M. Blei
- ICML
- 2016

Black box variational inference allows researchers to easily prototype and evaluate an array of models. Recent advances allow such algorithms to scale to high dimensions. However, a central question remains: How to specify an expressive variational distribution that maintains efficient computation? To address this, we develop hierarchical variational …

- Prem Gopalan, Francisco J. R. Ruiz, Rajesh Ranganath, David M. Blei
- AISTATS
- 2014

We develop a Bayesian nonparametric Poisson factorization model for recommendation systems. Poisson factorization implicitly models each user's limited budget of attention (or money) that allows consumption of only a small subset of the available items. In our Bayesian nonparametric variant, the number of latent components is theoretically unbounded and …
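The parametric Poisson factorization backbone of this model can be sketched as a generative process: gamma-distributed user preferences and item attributes, with each observed count Poisson-distributed at their inner product. The sizes and hyperparameters below are illustrative, and the Bayesian nonparametric extension (theoretically unbounded K) is not sketched here:

```python
import numpy as np

rng = np.random.default_rng(0)
n_users, n_items, K = 5, 8, 3  # toy sizes; K latent components

# Gamma priors yield sparse, nonnegative latent factors: most mass
# near zero models the user's limited budget of attention.
theta = rng.gamma(shape=0.3, scale=1.0, size=(n_users, K))  # user preferences
beta = rng.gamma(shape=0.3, scale=1.0, size=(n_items, K))   # item attributes

# Each count y[u, i] is Poisson with rate theta[u] . beta[i].
rate = theta @ beta.T
counts = rng.poisson(rate)
print(counts.shape)  # (5, 8)
```

Because the likelihood is Poisson on raw counts, unobserved user-item pairs are simply zeros rather than missing data, which is part of what makes this family attractive for implicit-feedback recommendation.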