• Corpus ID: 6072417

Automatic Variational Inference in Stan

@inproceedings{Kucukelbir2015AutomaticVI,
  title={Automatic Variational Inference in Stan},
  author={Alp Kucukelbir and Rajesh Ranganath and Andrew Gelman and David M. Blei},
  booktitle={NIPS},
  year={2015}
}
Variational inference is a scalable technique for approximate Bayesian inference. Deriving variational inference algorithms requires tedious model-specific calculations; this makes it difficult for non-experts to use. We propose an automatic variational inference algorithm, automatic differentiation variational inference (ADVI); we implement it in Stan (code available), a probabilistic programming system. In ADVI the user provides a Bayesian model and a dataset, nothing else. We make no… 
Automatic Differentiation Variational Inference
TLDR
Automatic differentiation variational inference (ADVI) is developed, where the scientist only provides a probabilistic model and a dataset, nothing else, and the algorithm automatically derives an efficient variational inference algorithm, freeing the scientist to refine and explore many models.
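As a concrete illustration of the "model plus dataset, nothing else" workflow, here is a minimal sketch of invoking ADVI from Python through CmdStanPy's variational() interface. The Stan file name, the eight-schools data, and the tuning choices are assumptions for illustration only, not the paper's own code.

    # Minimal sketch: run ADVI on a Stan program via CmdStanPy (assumed installed).
    # "eight_schools.stan" is a hypothetical file containing the usual eight-schools model.
    from cmdstanpy import CmdStanModel

    data = {
        "J": 8,
        "y": [28, 8, -3, 7, -1, 1, 18, 12],
        "sigma": [15, 10, 16, 11, 9, 11, 10, 18],
    }

    model = CmdStanModel(stan_file="eight_schools.stan")

    # The user supplies only the model and the data; the mean-field Gaussian
    # approximation in the unconstrained space and its gradients are derived
    # automatically by Stan.
    fit = model.variational(data=data, algorithm="meanfield", seed=1)
    print(fit.variational_params_dict)   # means of the fitted approximation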
Automatic Variational ABC
Approximate Bayesian Computation (ABC) is a framework for performing likelihood-free posterior inference for simulation models. Stochastic variational inference (SVI) is an appealing alternative to…
Practical Posterior Error Bounds from Variational Objectives
TLDR
This paper provides rigorous bounds on the error of posterior mean and uncertainty estimates that arise from full-distribution approximations, as in variational inference.
Validated Variational Inference via Practical Posterior Error Bounds
TLDR
This paper provides rigorous bounds on the error of posterior mean and uncertainty estimates that arise from full-distribution approximations, as in variational inference.
Boosting Variational Inference
TLDR
Boosting variational inference is developed, an algorithm that iteratively improves the current approximation by mixing it with a new component from the base distribution family and thereby yields progressively more accurate posterior approximations as more computing time is spent.
Variational inference and Gaussian mixtures
Modern Bayesian inference typically requires some form of posterior approximation, and mean-field variational inference (MFVI) is an increasingly popular choice due to its speed. But MFVI can be…
Unbiased Implicit Variational Inference
TLDR
UIVI considers an implicit variational distribution obtained in a hierarchical manner using a simple reparameterizable distribution whose variational parameters are defined by arbitrarily flexible deep neural networks and directly optimizes the evidence lower bound (ELBO).
Stochastic gradient variational Bayes for gamma approximating distributions
While stochastic variational inference is relatively well known for scaling inference in Bayesian probabilistic models, related methods also offer ways to circumnavigate the approximation of…
Proximity Variational Inference
TLDR
PVI is a new method for optimizing the variational objective that constrains subsequent iterates of the variational parameters to robustify the optimization path; it consistently finds better local optima and gives better predictive performance.
Local Expectation Gradients for Black Box Variational Inference
TLDR
This algorithm divides the problem of estimating the stochastic gradients over multiple variational parameters into smaller sub-tasks so that each sub-task explores wisely the most relevant part of the variational distribution.

References

Showing 1–10 of 36 references
Stochastic variational inference
TLDR
Stochastic variational inference lets us apply complex Bayesian models to massive data sets, and it is shown that the Bayesian nonparametric topic model outperforms its parametric counterpart.
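To make the scaling idea concrete, below is a small, self-contained sketch (not the paper's code) of the stochastic natural-gradient update on a toy conjugate model: a Gaussian mean with a Gaussian prior, where each step pretends a minibatch is the whole dataset and then blends the result into the current variational parameters.

    # Toy sketch of stochastic variational inference (SVI).
    # Model: x_i ~ Normal(mu, 1), prior mu ~ Normal(0, 1); all names are illustrative.
    import numpy as np

    rng = np.random.default_rng(0)
    N = 100_000
    x = rng.normal(loc=2.5, scale=1.0, size=N)        # synthetic "massive" dataset

    # Natural parameters of a Gaussian over mu: (m / v, -1 / (2 v)).
    prior = np.array([0.0, -0.5])                      # Normal(0, 1) prior
    lam = prior.copy()                                 # variational natural parameters

    batch_size, tau, kappa = 100, 1.0, 0.7
    for t in range(1, 2001):
        batch = rng.choice(x, size=batch_size, replace=False)
        # Intermediate global parameter: scale the minibatch statistics to the full data size.
        lam_hat = prior + (N / batch_size) * np.array([batch.sum(), -0.5 * batch_size])
        rho = (t + tau) ** (-kappa)                    # Robbins-Monro step size
        lam = (1.0 - rho) * lam + rho * lam_hat        # noisy natural-gradient step

    post_var = -1.0 / (2.0 * lam[1])
    post_mean = lam[0] * post_var
    print(post_mean, post_var)    # close to the exact conjugate posterior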
Black Box Variational Inference
TLDR
This paper presents a "black box" variational inference algorithm, one that can be quickly applied to many models with little additional derivation, based on a stochastic optimization of the variational objective where the noisy gradient is computed from Monte Carlo samples from the variational distribution.
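The estimator itself fits in a few lines: the ELBO gradient is written as an expectation of the score function grad_lambda log q(z; lambda) weighted by log p(x, z) - log q(z; lambda), and that expectation is estimated with samples from q. The toy conjugate model below is an illustration under those assumptions, not the paper's experiments.

    # Sketch of the score-function ("black box") ELBO gradient estimator.
    # Toy model: p(z) = Normal(0, 1), p(x | z) = Normal(z, 1), one observation x_obs.
    # Variational family: q(z; m, log_s) = Normal(m, exp(log_s)^2).
    import numpy as np

    x_obs = 1.5
    rng = np.random.default_rng(0)

    def log_joint(z):
        return -0.5 * z**2 - 0.5 * (x_obs - z)**2      # log p(z) + log p(x | z), up to constants

    def log_q(z, m, log_s):
        s = np.exp(log_s)
        return -0.5 * ((z - m) / s)**2 - log_s          # log Normal(z; m, s^2), up to constants

    def grad_log_q(z, m, log_s):
        s = np.exp(log_s)
        dm = (z - m) / s**2                             # d log q / d m
        dls = ((z - m) / s)**2 - 1.0                    # d log q / d log_s
        return np.stack([dm, dls], axis=-1)

    m, log_s, lr, S = 0.0, 0.0, 0.05, 64
    for step in range(2000):
        z = rng.normal(m, np.exp(log_s), size=S)        # Monte Carlo samples from q
        w = log_joint(z) - log_q(z, m, log_s)           # only needs evaluations of the model
        grad = (grad_log_q(z, m, log_s) * w[:, None]).mean(axis=0)
        m, log_s = m + lr * grad[0], log_s + lr * grad[1]

    print(m, np.exp(log_s))    # exact posterior is Normal(0.75, sqrt(0.5))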
Auto-Encoding Variational Bayes
TLDR
A stochastic variational inference and learning algorithm that scales to large datasets and, under some mild differentiability conditions, even works in the intractable case is introduced.
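For contrast with the score-function sketch above, here is an equally small sketch of the reparameterization-style gradient on the same toy model: samples are written as z = m + exp(log_s) * eps with eps ~ Normal(0, 1), so the gradient can flow through the sample. Again a toy under stated assumptions, not the authors' implementation.

    # Sketch of a reparameterized ("pathwise") ELBO gradient on the toy model
    # p(z) = Normal(0, 1), p(x | z) = Normal(z, 1), q(z; m, log_s) = Normal(m, exp(log_s)^2).
    import numpy as np

    x_obs = 1.5
    rng = np.random.default_rng(0)

    def dlogjoint_dz(z):
        return x_obs - 2.0 * z                           # d/dz of -0.5*z**2 - 0.5*(x_obs - z)**2

    m, log_s, lr, S = 0.0, 0.0, 0.05, 8
    for step in range(3000):
        eps = rng.normal(size=S)
        s = np.exp(log_s)
        z = m + s * eps                                  # reparameterized samples from q
        g = dlogjoint_dz(z)                              # pathwise gradient signal
        grad_m = g.mean()
        grad_log_s = (g * s * eps).mean() + 1.0          # +1 from the Gaussian entropy term
        m, log_s = m + lr * grad_m, log_s + lr * grad_log_s

    print(m, np.exp(log_s))    # exact posterior is Normal(0.75, sqrt(0.5))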
An Introduction to Variational Methods for Graphical Models
TLDR
This paper presents a tutorial introduction to the use of variational methods for inference and learning in graphical models (Bayesian networks and Markov random fields), and describes a general framework for generating variational transformations based on convex duality.
Doubly Stochastic Variational Bayes for non-Conjugate Inference
TLDR
A simple and effective variational inference algorithm based on stochastic optimisation is proposed; it can be widely applied to Bayesian non-conjugate inference in continuous parameter spaces and allows for efficient use of gradient information from the model joint density.
Variational Message Passing
TLDR
Variational Message Passing is introduced, a general-purpose algorithm for applying variational inference to Bayesian networks; it can be applied to a very general class of conjugate-exponential models because it uses a factorised variational approximation.
Automated Variational Inference in Probabilistic Programming
TLDR
A new algorithm for approximate inference in probabilistic programs, based on a stochastic gradient for variational programs, is presented; it is efficient without restrictions on the probabilistic program and improves inference efficiency over other algorithms.
On Using Control Variates with Stochastic Approximation for Variational Bayes and its Connection to Stochastic Linear Regression
Recently, we and several other authors have written about the possibilities of using stochastic approximation techniques for fitting variational approximations to intractable Bayesian posterior distributions…
Stochastic Backpropagation and Approximate Inference in Deep Generative Models
We marry ideas from deep neural networks and approximate Bayesian inference to derive a generalised class of deep, directed generative models, endowed with a new algorithm for scalable inference and learning.
Bayesian Inference for Nonnegative Matrix Factorisation Models
  • A. Cemgil
  • Comput. Intell. Neurosci., 2009
TLDR
This work describes nonnegative matrix factorisation with a Kullback-Leibler (KL) error measure in a statistical framework, with a hierarchical generative model consisting of an observation and a prior component, and develops full Bayesian inference via variational Bayes or Monte Carlo.