Stochastic variational inference

@article{Hoffman2013StochasticVI,
  title={Stochastic variational inference},
  author={Matthew D. Hoffman and David M. Blei and Chong Wang and John William Paisley},
  journal={J. Mach. Learn. Res.},
  year={2013},
  volume={14},
  pages={1303--1347}
}
We develop stochastic variational inference, a scalable algorithm for approximating posterior distributions. We develop this technique for a large class of probabilistic models and we demonstrate it with two probabilistic topic models, latent Dirichlet allocation and the hierarchical Dirichlet process topic model. Using stochastic variational inference, we analyze several large collections of documents: 300K articles from Nature, 1.8M articles from The New York Times, and 3.8M articles from… 
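
For concreteness, the core of the algorithm can be sketched in a few lines: repeatedly sample a document, fit its local variational parameters by coordinate ascent, form a noisy natural-gradient estimate for the global topic parameters as if the corpus were made of copies of that document, and take a decaying Robbins-Monro step. The following Python sketch follows the online LDA updates described in the paper; the hyperparameter values and the toy corpus are illustrative, not from the paper.

import numpy as np
from scipy.special import digamma

def dirichlet_expectation(x):
    """E[log theta] for theta ~ Dirichlet(x); rows of a 2-D x are parameter vectors."""
    if x.ndim == 1:
        return digamma(x) - digamma(x.sum())
    return digamma(x) - digamma(x.sum(axis=1))[:, np.newaxis]

def svi_lda(docs, V, K, alpha=0.1, eta=0.01, tau0=1.0, kappa=0.7, n_steps=2000, seed=0):
    """SVI for LDA, one sampled document per step.

    docs: list of (word_ids, word_counts) pairs; word ids assumed unique per document.
    Step size rho_t = (t + tau0)^(-kappa) with kappa in (0.5, 1] for convergence.
    """
    rng = np.random.default_rng(seed)
    D = len(docs)
    lam = rng.gamma(100.0, 0.01, size=(K, V))      # global topic parameters
    for t in range(n_steps):
        ids, cts = docs[rng.integers(D)]           # sample one document
        Elogbeta_d = dirichlet_expectation(lam)[:, ids]
        gamma = np.ones(K)                         # local step: coordinate ascent
        for _ in range(100):
            log_phi = dirichlet_expectation(gamma)[:, np.newaxis] + Elogbeta_d
            log_phi -= log_phi.max(axis=0)         # numerical stability
            phi = np.exp(log_phi)
            phi /= phi.sum(axis=0)
            new_gamma = alpha + (phi * cts).sum(axis=1)
            if np.abs(new_gamma - gamma).mean() < 1e-4:
                gamma = new_gamma
                break
            gamma = new_gamma
        # Global step: intermediate estimate eta + D * n_w * phi, blended with the
        # current parameters at rate rho_t (a noisy natural-gradient step).
        rho = (t + tau0) ** (-kappa)
        lam = (1.0 - rho) * lam + rho * eta
        lam[:, ids] += rho * D * (phi * cts)
    return lam

# Toy usage with a synthetic corpus (illustrative only).
rng = np.random.default_rng(1)
V, K = 100, 5
docs = [(rng.choice(V, size=20, replace=False),
         rng.integers(1, 5, size=20).astype(float)) for _ in range(500)]
lam = svi_lda(docs, V, K)
top_words = np.argsort(-lam, axis=1)[:, :10]       # ten highest-weight words per topic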

Reliable and Scalable Variational Inference for the Hierarchical Dirichlet Process

TLDR
This work defines full variational posteriors for all latent variables of hierarchical Dirichlet process admixture models, optimizes their parameters via a novel surrogate likelihood bound, and shows that this approach has crucial advantages for data-driven learning of the number of topics.

Stochastic collapsed variational Bayesian inference for latent Dirichlet allocation

TLDR
A stochastic algorithm for collapsed variational Bayesian inference for LDA is proposed that is simpler and more efficient than the state-of-the-art method and can learn coherent topics in seconds on small corpora, facilitating the use of topic models in interactive document analysis software.
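
The zero-order collapsed update at the heart of this family of methods has a simple closed form. A sketch, in common notation (not necessarily the paper's): for token $i$ of document $j$ with word $w_{ij}$, the variational responsibility of topic $k$ is

$$\gamma_{ijk} \propto \frac{N^{\Phi}_{w_{ij}k} + \eta}{N^{Z}_{k} + V\eta}\,\bigl(N^{\Theta}_{jk} + \alpha\bigr),$$

where $N^{\Phi}$, $N^{Z}$, and $N^{\Theta}$ are expected word-topic, topic, and document-topic counts, $V$ is the vocabulary size, and $\alpha, \eta$ are Dirichlet hyperparameters; the stochastic variant maintains these counts as exponentially decaying averages updated one token (or minibatch) at a time.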

Stochastic Collapsed Variational Inference for Sequential Data

TLDR
This paper proposes a stochastic collapsed variational inference algorithm in the sequential data setting that is applicable to both finite hidden Markov models and hierarchical Dirichlet process hidden Markov models, and to any datasets generated by emission distributions in the exponential family.

An Adaptive Learning Rate for Stochastic Variational Inference

TLDR
This work develops an adaptive learning rate for stochastic variational inference, which requires no tuning and is easily implemented with computations already made in the algorithm.
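
One way to write such a rate, following the moving-average construction used in this line of work (a sketch; the paper's exact normalization may differ): with $g_t$ the sampled natural gradient, maintain

$$\bar g_t = (1 - \tau_t^{-1})\,\bar g_{t-1} + \tau_t^{-1} g_t, \qquad \bar h_t = (1 - \tau_t^{-1})\,\bar h_{t-1} + \tau_t^{-1} g_t^{\top} g_t,$$

and set the step size and averaging memory to

$$\rho_t^{*} = \frac{\bar g_t^{\top}\bar g_t}{\bar h_t}, \qquad \tau_{t+1} = \tau_t\,(1 - \rho_t^{*}) + 1,$$

so the rate shrinks automatically when the gradient estimates are noisy ($\bar h_t \gg \bar g_t^{\top}\bar g_t$) and grows when they agree.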

Deterministic Annealing for Stochastic Variational Inference

TLDR
Deterministic annealing for SVI is introduced: a temperature parameter deterministically deforms the objective and is gradually reduced over the course of the optimization.
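
A standard way to set this up (a sketch; the paper's exact parameterization may differ) is to scale the entropy term of the variational objective by a temperature $T$,

$$\mathcal{L}_{T}(q) = \mathbb{E}_{q}\bigl[\log p(x, z)\bigr] + T\,\mathbb{H}[q],$$

starting with $T_0 > 1$ so that the deformed objective is smoother and entropy-favoring, then lowering $T$ toward $1$ on a fixed schedule, at which point the original ELBO is recovered.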

Truncation-free stochastic variational inference for Bayesian nonparametric models

TLDR
This work presents a truncation-free stochastic variational inference algorithm for Bayesian nonparametric models that adapts model complexity on the fly and performs better than previous stochastic variational inference algorithms.

Stochastic gradient variational Bayes for gamma approximating distributions

TLDR
This paper enables straightforward "black box" variational inference in models where sparsity and non-negativity are appropriate, and outperforms generic sampling algorithms and the approach of using Gaussian variational distributions on transformed variables.

Streaming Variational Inference for Dirichlet Process Mixtures

TLDR
This paper presents two truncation-free variational algorithms, one for mixed-membership inference called TFVB (truncation-free variational Bayes) and the other for hard clustering inference called TFME (truncation-free maximization expectation), and further develops a streaming learning framework for the popular Dirichlet process mixture models.

Incremental Variational Inference for Latent Dirichlet Allocation

TLDR
A stochastic approximation of incremental variational inference is introduced which extends to the asynchronous distributed setting; the resulting distributed algorithm achieves performance comparable to single-host incremental variational inference, but with a significant speed-up.
...

References

Showing 1–10 of 120 references

An Adaptive Learning Rate for Stochastic Variational Inference

TLDR
This work develops an adaptive learning rate for stochastic variational inference, which requires no tuning and is easily implemented with computations already made in the algorithm.

Variational Bayesian Inference with Stochastic Search

TLDR
This work presents an alternative algorithm based on stochastic optimization that allows for direct optimization of the variational lower bound and demonstrates the approach on two non-conjugate models: logistic regression and an approximation to the HDP.

Online Variational Inference for the Hierarchical Dirichlet Process

TLDR
This work proposes an online variational inference algorithm for the HDP, an algorithm that is easily applicable to massive and streaming data, and lets us analyze much larger data sets.

Sparse stochastic inference for latent Dirichlet allocation

TLDR
A hybrid algorithm for Bayesian topic models is presented that combines the efficiency of sparse Gibbs sampling with the scalability of online stochastic inference, reducing the bias of variational inference and generalizing to many Bayesian hidden-variable models.

A Collapsed Variational Bayesian Inference Algorithm for Latent Dirichlet Allocation

TLDR
This paper proposes the collapsed variational Bayesian inference algorithm for LDA, and shows that it is computationally efficient, easy to implement, and significantly more accurate than standard variational Bayesian inference for LDA.

Nonparametric variational inference

TLDR
The efficacy of the nonparametric approximation is demonstrated with a hierarchical logistic regression model and a nonlinear matrix factorization model, obtaining predictive performance as good as or better than more specialized variational methods and MCMC approximations.

On Smoothing and Inference for Topic Models

TLDR
Using the insights gained from this comparative study, it is shown how accurate topic models can be learned in several seconds on text corpora with thousands of documents.

An Introduction to Variational Methods for Graphical Models

TLDR
This paper presents a tutorial introduction to the use of variational methods for inference and learning in graphical models (Bayesian networks and Markov random fields), and describes a general framework for generating variational transformations based on convex duality.

VIBES: A Variational Inference Engine for Bayesian Networks

TLDR
A general purpose inference engine called VIBES ('Variational Inference for Bayesian Networks') which allows a wide variety of probabilistic models to be implemented and solved variationally without recourse to coding.

The Discrete Infinite Logistic Normal Distribution

TLDR
A stochastic variational inference algorithm for DILN is developed and compared with similar algorithms for HDP and latent Dirichlet allocation on a collection of 350,000 articles from Nature.
...