Convolutional deep belief networks for scalable unsupervised learning of hierarchical representations
- Honglak Lee, R. Grosse, R. Ranganath, A. Ng
- Computer Science · International Conference on Machine Learning
- 14 June 2009
The convolutional deep belief network is presented, a hierarchical generative model that scales to realistic image sizes, is translation-invariant, and supports efficient bottom-up and top-down probabilistic inference.
Black Box Variational Inference
- R. Ranganath, S. Gerrish, D. Blei
- Computer Science · International Conference on Artificial…
- 31 December 2013
This paper presents a "black box" variational inference algorithm, one that can be quickly applied to many models with little additional derivation, based on stochastic optimization of the variational objective, where the noisy gradient is computed from Monte Carlo samples drawn from the variational distribution.
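The estimator described in this summary, a noisy ELBO gradient built from Monte Carlo samples of the variational distribution, can be sketched in a few lines. This is a minimal illustration on a toy one-dimensional target, not the paper's setup; the target, step size, and sample count are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy unnormalized target, standing in for log p(x, z): a Gaussian centered at 3.
def log_p(z):
    return -0.5 * (z - 3.0) ** 2

# Variational family q(z) = N(mu, sigma^2), parameterized by (mu, log_sigma).
def log_q(z, mu, log_sigma):
    sigma = np.exp(log_sigma)
    return -0.5 * ((z - mu) / sigma) ** 2 - log_sigma - 0.5 * np.log(2 * np.pi)

def grad_log_q(z, mu, log_sigma):
    # Score function: gradient of log q with respect to (mu, log_sigma).
    sigma = np.exp(log_sigma)
    d_mu = (z - mu) / sigma**2
    d_log_sigma = ((z - mu) / sigma) ** 2 - 1.0
    return np.array([d_mu, d_log_sigma])

mu, log_sigma = 0.0, 0.0
lr, n_samples = 0.05, 64
for step in range(2000):
    z = rng.normal(mu, np.exp(log_sigma), size=n_samples)
    # Score-function (REINFORCE) estimator of the ELBO gradient:
    #   E_q[ grad log q(z) * (log p(z) - log q(z)) ]
    weights = log_p(z) - log_q(z, mu, log_sigma)
    g = (grad_log_q(z, mu, log_sigma) * weights).mean(axis=1)
    mu += lr * g[0]
    log_sigma += lr * g[1]

# mu and exp(log_sigma) should settle near the target's mean 3 and scale 1.
```

Note that only `log_p` evaluations are needed, never its gradient, which is what makes the method "black box": it applies to any model whose joint density can be evaluated.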
Automatic Differentiation Variational Inference
- A. Kucukelbir, Dustin Tran, R. Ranganath, A. Gelman, D. Blei
- Computer Science · Journal of Machine Learning Research
- 2 March 2016
Automatic differentiation variational inference (ADVI) is developed, where the scientist provides only a probabilistic model and a dataset, nothing else, and the algorithm automatically derives an efficient variational inference algorithm, freeing the scientist to refine and explore many models.
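A key ingredient behind ADVI is the pathwise (reparameterization) gradient of the ELBO for a Gaussian approximation, which contrasts with the score-function estimator of black-box variational inference. The sketch below shows only that ingredient on a toy target with hand-written gradients; the real algorithm also transforms constrained latent variables to an unconstrained space and obtains gradients by automatic differentiation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy target standing in for log p(x, z): a Gaussian centered at 3.
def log_p(z):
    return -0.5 * (z - 3.0) ** 2

def grad_log_p(z):
    return -(z - 3.0)

# Gaussian q(z) = N(mu, exp(log_sigma)^2); reparameterize z = mu + sigma * eps.
mu, log_sigma = 0.0, 0.0
lr, n_samples = 0.1, 32
for step in range(1000):
    eps = rng.standard_normal(n_samples)
    sigma = np.exp(log_sigma)
    z = mu + sigma * eps
    # Pathwise gradient of the ELBO: differentiate log p through z(eps).
    # The Gaussian entropy term contributes +1 to d/dlog_sigma and 0 to d/dmu.
    g_mu = grad_log_p(z).mean()
    g_log_sigma = (grad_log_p(z) * eps * sigma).mean() + 1.0
    mu += lr * g_mu
    log_sigma += lr * g_log_sigma

# mu and exp(log_sigma) should settle near 3 and 1.
```

The pathwise estimator typically has much lower variance than the score-function estimator, which is one reason ADVI can work with few Monte Carlo samples per step.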
ClinicalBERT: Modeling Clinical Notes and Predicting Hospital Readmission
- Kexin Huang, Jaan Altosaar, R. Ranganath
- Medicine · arXiv
- 10 April 2019
ClinicalBERT uncovers high-quality relationships between medical concepts as judged by humans and outperforms baselines on 30-day hospital readmission prediction using both discharge summaries and the first few days of notes in the intensive care unit.
Variational Sequential Monte Carlo
- C. A. Naesseth, Scott W. Linderman, R. Ranganath, D. Blei
- Computer Science · International Conference on Artificial…
- 31 May 2017
The VSMC family is a variational family that can approximate the posterior arbitrarily well while still allowing efficient optimization of its parameters; its utility is demonstrated on state space models, stochastic volatility models for financial data, and deep Markov models of brain neural circuits.
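VSMC builds its variational objective from the log marginal-likelihood estimate produced by sequential Monte Carlo, then optimizes the proposal parameters. The sketch below shows only the underlying estimator, a bootstrap particle filter on a toy linear-Gaussian state space model; the model, proposal, and constants are illustrative assumptions, not the paper's experiments:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy state space model:
#   z_t = 0.9 * z_{t-1} + N(0, 1),   x_t = z_t + N(0, 0.5^2)
T = 50
true_z = np.zeros(T)
true_z[0] = rng.standard_normal()
for t in range(1, T):
    true_z[t] = 0.9 * true_z[t - 1] + rng.standard_normal()
x = true_z + 0.5 * rng.standard_normal(T)

def log_marginal(x, n_particles):
    """Bootstrap particle filter. Accumulating the log of the average
    importance weight at each step yields an estimate of log p(x);
    its expectation lower-bounds the true log marginal likelihood."""
    z = np.zeros(n_particles)
    log_z_hat = 0.0
    for t in range(len(x)):
        z = 0.9 * z + rng.standard_normal(n_particles)  # propose from the prior
        log_w = -0.5 * ((x[t] - z) / 0.5) ** 2 - np.log(0.5 * np.sqrt(2 * np.pi))
        m = log_w.max()
        w = np.exp(log_w - m)                           # stabilized weights
        log_z_hat += m + np.log(w.mean())               # accumulate log-normalizer
        idx = rng.choice(n_particles, size=n_particles, p=w / w.sum())
        z = z[idx]                                      # multinomial resampling
    return log_z_hat

print(log_marginal(x, 256))
```

In VSMC the prior proposal above would be replaced by a parameterized proposal, and `log_z_hat` would be maximized with respect to those parameters as a surrogate ELBO.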
Hierarchical Variational Models
- R. Ranganath, Dustin Tran, D. Blei
- Computer Science · International Conference on Machine Learning
- 7 November 2015
This work develops hierarchical variational models (HVMs), which augment a variational approximation with a prior on its parameters, which allows it to capture complex structure for both discrete and continuous latent variables.
Unsupervised learning of hierarchical representations with convolutional deep belief networks
- Honglak Lee, R. Grosse, R. Ranganath, A. Ng
- Computer Science · Communications of the ACM
- 1 October 2011
The convolutional deep belief network is presented, a hierarchical generative model that scales to realistic image sizes, is translation-invariant, and supports efficient bottom-up and top-down probabilistic inference.
Automatic Variational Inference in Stan
- A. Kucukelbir, R. Ranganath, A. Gelman, D. Blei
- Computer Science · NIPS
- 10 June 2015
An automatic variational inference algorithm, automatic differentiation variational inference (ADVI), is presented; it is implemented in Stan, a probabilistic programming system, and can be used on any model the authors write in Stan.
Support and Invertibility in Domain-Invariant Representations
- Fredrik D. Johansson, D. Sontag, R. Ranganath
- Computer Science · International Conference on Artificial…
- 8 March 2019
This work gives generalization bounds for unsupervised domain adaptation that hold for any representation function by acknowledging the cost of non-invertibility and proposes a bound based on measuring the extent to which the support of the source domain covers the target domain.
Operator Variational Inference
- R. Ranganath, Dustin Tran, Jaan Altosaar, D. Blei
- Computer Science · NIPS
- 27 October 2016
A black box algorithm, operator variational inference (OPVI), is developed for optimizing any operator objective; operators can characterize different properties of variational objectives, such as objectives that admit data subsampling, allowing inference to scale to massive data, as well as objectives that admit variational programs, a rich class of posterior approximations that does not require a tractable density.
...