Corpus ID: 5652538

Stochastic variational inference

@article{Hoffman2013StochasticVI,
  title={Stochastic variational inference},
  author={Matthew D. Hoffman and David M. Blei and Chong Wang and John William Paisley},
  journal={ArXiv},
  year={2013},
  volume={abs/1206.7051}
}
We develop stochastic variational inference, a scalable algorithm for approximating posterior distributions. We develop this technique for a large class of probabilistic models and we demonstrate it with two probabilistic topic models, latent Dirichlet allocation and the hierarchical Dirichlet process topic model. Using stochastic variational inference, we analyze several large collections of documents: 300K articles from Nature, 1.8M articles from The New York Times, and 3.8M articles from…
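A minimal sketch of the SVI update described in the abstract, instantiated for LDA: sample a document, fit its local variational parameters (gamma, phi) by coordinate ascent, form an intermediate estimate of the global topic parameter lambda as if that document were replicated across the corpus, and blend it in with a decaying step size rho_t = (t + tau0)^(-kappa). The corpus sizes, hyperparameters, and the synthetic "document" below are illustrative assumptions, not the paper's experimental settings.

```python
# Sketch of stochastic variational inference (SVI) for LDA: sample one document,
# optimize its local variational parameters, and take a noisy natural-gradient
# step on the global topic parameter lambda. Sizes/hyperparameters are assumed.
import numpy as np
from scipy.special import digamma

rng = np.random.default_rng(0)
D, V, K = 1000, 50, 5            # assumed corpus size, vocabulary size, topic count
alpha, eta = 0.1, 0.01           # Dirichlet hyperparameters (assumed)
tau0, kappa = 1.0, 0.7           # learning-rate schedule rho_t = (t + tau0) ** -kappa

lam = rng.gamma(1.0, 1.0, size=(K, V))     # global variational parameter (topics)

def local_step(counts, lam, n_iters=50):
    """Coordinate ascent on one document's local variational parameters."""
    gamma = np.ones(K)                      # variational Dirichlet over topic proportions
    Elog_beta = digamma(lam) - digamma(lam.sum(axis=1, keepdims=True))
    for _ in range(n_iters):
        # phi[k, v] proportional to exp(E[log theta_k] + E[log beta_{k, v}])
        log_phi = (digamma(gamma) - digamma(gamma.sum()))[:, None] + Elog_beta
        phi = np.exp(log_phi - log_phi.max(axis=0))
        phi /= phi.sum(axis=0)
        gamma = alpha + phi @ counts
    return gamma, phi

for t in range(100):
    # Stand-in for sampling a document from the corpus: a random word-count vector.
    counts = rng.multinomial(80, rng.dirichlet(np.ones(V))).astype(float)
    gamma, phi = local_step(counts, lam)
    lam_hat = eta + D * (phi * counts)       # as if the sampled document appeared D times
    rho = (t + tau0) ** (-kappa)             # Robbins-Monro step size
    lam = (1.0 - rho) * lam + rho * lam_hat  # noisy natural-gradient update
```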
Reliable and Scalable Variational Inference for the Hierarchical Dirichlet Process
TLDR
This work defines full variational posteriors for all latent variables and optimizes parameters via a novel surrogate likelihood bound for hierarchical Dirichlet process admixture models, and shows that this approach has crucial advantages for data-driven learning of the number of topics.
Stochastic collapsed variational Bayesian inference for latent Dirichlet allocation
TLDR
A stochastic algorithm for collapsed variational Bayesian inference for LDA is proposed, which is simpler and more efficient than the state-of-the-art method and can learn coherent topics in seconds on small corpora, facilitating the use of topic models in interactive document analysis software.
Stochastic Collapsed Variational Inference for Sequential Data
Stochastic variational inference for collapsed models has recently been successfully applied to large scale topic modelling. In this paper, we propose a stochastic collapsed variational inference…
An Adaptive Learning Rate for Stochastic Variational Inference
TLDR
This work develops an adaptive learning rate for stochastic variational inference, which requires no tuning and is easily implemented with computations already made in the algorithm.
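As a rough, hedged illustration of such a rate: one plausible form sets rho to the ratio between the squared norm of a running average of the noisy gradient and a running average of the gradient's squared norm, tracked over a moving window. The function name, window update, and initialization below are assumptions for illustration rather than the cited paper's exact estimators.

```python
# Hedged sketch of an adaptive SVI step size: rho = |g_bar|^2 / h_bar, where g_bar
# tracks the mean noisy (natural) gradient and h_bar its mean squared norm over a
# moving window of effective length tau.
import numpy as np

def adaptive_rate(g, g_bar, h_bar, tau):
    g_bar = (1.0 - 1.0 / tau) * g_bar + (1.0 / tau) * g
    h_bar = (1.0 - 1.0 / tau) * h_bar + (1.0 / tau) * float(g @ g)
    rho = float(g_bar @ g_bar) / h_bar
    tau = tau * (1.0 - rho) + 1.0           # shrink the window after a large step
    return rho, g_bar, h_bar, tau

# Illustrative usage with random stand-in "gradients".
rng = np.random.default_rng(0)
g_bar, h_bar, tau = np.zeros(10), 1.0, 10.0
for _ in range(5):
    rho, g_bar, h_bar, tau = adaptive_rate(rng.normal(size=10), g_bar, h_bar, tau)
```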
Deterministic Annealing for Stochastic Variational Inference
TLDR
Deterministic annealing for SVI is introduced: a temperature parameter deterministically deforms the objective and is then reduced over the course of the optimization.
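One plausible way to write such a deformed objective, stated as a sketch rather than the cited paper's exact parameterization, is to scale the entropy term of the variational objective by a temperature T and anneal T down to 1:

\mathcal{L}_T(q) \;=\; \mathbb{E}_q[\log p(x, z)] \;+\; T\,\mathbb{H}[q(z)], \qquad T_0 \ge T \ge 1,

which recovers the standard evidence lower bound when T reaches 1; larger T flattens the objective and discourages early commitment to poor local optima.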
Truncation-free stochastic variational inference for Bayesian nonparametric models
We present a truncation-free stochastic variational inference algorithm for Bayesian nonparametric models. While traditional variational inference algorithms require truncations for the model or the…
Stochastic gradient variational Bayes for gamma approximating distributions
While stochastic variational inference is relatively well known for scaling inference in Bayesian probabilistic models, related methods also offer ways to circumnavigate the approximation of…
Streaming Variational Inference for Dirichlet Process Mixtures
TLDR
This paper presents two truncation-free variational algorithms, one for mixed-membership inference called TFVB (truncation-free variational Bayes) and the other for hard clustering inference called TFME, and further develops a streaming learning framework for the popular Dirichlet process mixture models.
Fast approximation of variational Bayes Dirichlet process mixture using the maximization-maximization algorithm
TLDR
Inspired by fast DPM algorithms, a fast approach to variational inference is proposed using MAP estimation of variational posteriors for approximating expectations, and it is observed that some of the analytical solutions obtained by the proposed method are very similar to those of variational inference.
Incremental Variational Inference for Latent Dirichlet Allocation
We introduce incremental variational inference and apply it to latent Dirichlet allocation (LDA). Incremental variational inference is inspired by incremental EM and provides an alternative to…

References

SHOWING 1-10 OF 137 REFERENCES
An Adaptive Learning Rate for Stochastic Variational Inference
TLDR
This work develops an adaptive learning rate for stochastic variational inference, which requires no tuning and is easily implemented with computations already made in the algorithm.
Variational Bayesian Inference with Stochastic Search
TLDR
This work presents an alternative algorithm based on stochastic optimization that allows for direct optimization of the variational lower bound and demonstrates the approach on two non-conjugate models: logistic regression and an approximation to the HDP.
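The key ingredient here is the standard score-function (Monte Carlo) estimate of the gradient of the variational lower bound; the cited work additionally reduces its variance with control variates. As a sketch:

\nabla_\lambda \mathcal{L} \;=\; \mathbb{E}_{q_\lambda}\!\big[\nabla_\lambda \log q_\lambda(z)\,\big(\log p(x, z) - \log q_\lambda(z)\big)\big] \;\approx\; \frac{1}{S}\sum_{s=1}^{S} \nabla_\lambda \log q_\lambda(z^{(s)})\,\big(\log p(x, z^{(s)}) - \log q_\lambda(z^{(s)})\big), \qquad z^{(s)} \sim q_\lambda,

which requires only that log q_\lambda and its gradient be computable, not that the model be conjugate.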
Online Variational Inference for the Hierarchical Dirichlet Process
TLDR
This work proposes an online variational inference algorithm for the HDP, an algorithm that is easily applicable to massive and streaming data, and lets us analyze much larger data sets.
Sparse stochastic inference for latent Dirichlet allocation
TLDR
A hybrid algorithm for Bayesian topic models is presented that combines the efficiency of sparse Gibbs sampling with the scalability of online stochastic inference, reducing the bias of variational inference and generalizing to many Bayesian hidden-variable models.
A Collapsed Variational Bayesian Inference Algorithm for Latent Dirichlet Allocation
TLDR
This paper proposes the collapsed variational Bayesian inference algorithm for LDA, and shows that it is computationally efficient, easy to implement, and significantly more accurate than standard variational Bayesian inference for LDA.
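For flavor, the zeroth-order simplification of the collapsed variational update for LDA (often called CVB0) is commonly written as below; note that the cited paper itself works with a second-order approximation to the collapsed bound, so this is only a sketch of the kind of update involved:

\gamma_{dnk} \;\propto\; \big(\mathbb{E}[n^{\neg dn}_{d,k}] + \alpha\big)\,\frac{\mathbb{E}[n^{\neg dn}_{k, w_{dn}}] + \eta}{\mathbb{E}[n^{\neg dn}_{k, \cdot}] + V\eta},

where \gamma_{dnk} is the variational probability that token n in document d takes topic k, n^{\neg dn}_{d,k} counts tokens in document d assigned to topic k, n^{\neg dn}_{k,w} counts corpus-wide assignments of word w to topic k (both excluding token (d, n)), and V is the vocabulary size.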
Propagation Algorithms for Variational Bayesian Learning
TLDR
It is demonstrated how the belief propagation and the junction tree algorithms can be used in the inference step of variational Bayesian learning to infer the hidden state dimensionality of the state-space model in a variety of synthetic problems and one real high-dimensional data set.
Nonparametric variational inference
TLDR
The efficacy of the nonparametric approximation is demonstrated with a hierarchical logistic regression model and a nonlinear matrix factorization model, obtaining predictive performance as good as or better than more specialized variational methods and MCMC approximations.
On Smoothing and Inference for Topic Models
TLDR
Using the insights gained from this comparative study, it is shown how accurate topic models can be learned in several seconds on text corpora with thousands of documents.
VIBES: A Variational Inference Engine for Bayesian Networks
TLDR
A general-purpose inference engine called VIBES ('Variational Inference for Bayesian Networks') is presented, which allows a wide variety of probabilistic models to be implemented and solved variationally without recourse to coding.
An Introduction to Variational Methods for Graphical Models
TLDR
This paper presents a tutorial introduction to the use of variational methods for inference and learning in graphical models (Bayesian networks and Markov random fields), and describes a general framework for generating variational transformations based on convex duality.
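The object that all of the methods above optimize is the variational lower bound on the log marginal likelihood obtained from Jensen's inequality, stated here as a sketch for completeness:

\log p(x) \;=\; \log \int p(x, z)\, dz \;\ge\; \mathbb{E}_{q}[\log p(x, z)] - \mathbb{E}_{q}[\log q(z)] \;=\; \mathcal{L}(q),

with equality when q(z) equals the exact posterior p(z \mid x); variational inference maximizes \mathcal{L}(q) over a tractable family, and stochastic variational inference follows noisy estimates of its (natural) gradient.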