# Automatic Variational Inference in Stan

@inproceedings{Kucukelbir2015AutomaticVI, title={Automatic Variational Inference in Stan}, author={Alp Kucukelbir and Rajesh Ranganath and Andrew Gelman and David M. Blei}, booktitle={NIPS}, year={2015} }

Variational inference is a scalable technique for approximate Bayesian inference. Deriving variational inference algorithms requires tedious model-specific calculations, which makes these methods difficult for non-experts to use. We propose an automatic variational inference algorithm, automatic differentiation variational inference (ADVI); we implement it in Stan (code available), a probabilistic programming system. In ADVI the user provides a Bayesian model and a dataset, nothing else. We make no…
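
The core idea behind ADVI can be sketched in a few lines: place a Gaussian variational approximation on the (unconstrained) parameter, and follow noisy reparameterization-based gradients of the ELBO. The toy model, variable names, and optimizer settings below are my own illustration, not the paper's Stan implementation; the conjugate model is chosen so the variational optimum can be checked in closed form.

```python
import numpy as np

# Toy conjugate model (illustration only): y_i ~ N(theta, 1) with prior
# theta ~ N(0, 1). The exact posterior is N(sum(y)/(n+1), 1/(n+1)), so we
# know what the Gaussian variational approximation should converge to.
rng = np.random.default_rng(0)
y = rng.normal(1.0, 1.0, size=20)
n = len(y)

# Mean-field Gaussian q(theta) = N(mu, exp(omega)^2), optimized by
# stochastic gradient ascent on the ELBO via the reparameterization
# theta = mu + exp(omega) * eps with eps ~ N(0, 1).
mu, omega = 0.0, 0.0
lr, M = 0.01, 50  # step size and Monte Carlo samples per step
for _ in range(2000):
    eps = rng.normal(size=M)
    s = np.exp(omega)
    theta = mu + s * eps
    # d/dtheta log p(y, theta) = sum_i (y_i - theta) - theta
    dlogp = y.sum() - (n + 1) * theta
    grad_mu = dlogp.mean()
    grad_omega = (dlogp * s * eps).mean() + 1.0  # +1 from the entropy term
    mu += lr * grad_mu
    omega += lr * grad_omega

post_mean = y.sum() / (n + 1)       # exact posterior mean
post_sd = (1.0 / (n + 1)) ** 0.5    # exact posterior std dev
print(mu, np.exp(omega), post_mean, post_sd)
```

Because the model is conjugate, the fitted `mu` and `exp(omega)` should land close to the exact posterior mean and standard deviation; in a non-conjugate model the same loop applies unchanged, which is the point of the "automatic" in ADVI.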


## 184 Citations

Automatic Differentiation Variational Inference

- Computer Science, Mathematics · J. Mach. Learn. Res.
- 2017

Automatic differentiation variational inference (ADVI) is developed: the scientist provides only a probabilistic model and a dataset, nothing else, and the algorithm automatically derives an efficient variational inference algorithm, freeing the scientist to refine and explore many models.

Automatic Variational ABC

- Mathematics
- 2016

Approximate Bayesian Computation (ABC) is a framework for performing likelihood-free posterior inference for simulation models. Stochastic variational inference (SVI) is an appealing alternative to…

Practical Posterior Error Bounds from Variational Objectives

- Computer Science · ArXiv
- 2019

This paper provides rigorous bounds on the error of posterior mean and uncertainty estimates that arise from full-distribution approximations, as in variational inference.

Validated Variational Inference via Practical Posterior Error Bounds

- Computer Science, Mathematics · AISTATS
- 2020

This paper provides rigorous bounds on the error of posterior mean and uncertainty estimates that arise from full-distribution approximations, as in variational inference.

Boosting Variational Inference

- Mathematics, Computer Science · ArXiv
- 2016

Boosting variational inference is developed, an algorithm that iteratively improves the current approximation by mixing in a new component from the base distribution family, thereby yielding progressively more accurate posterior approximations as more computing time is spent.
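
The greedy mixture update described above can be illustrated in one dimension. The bimodal target, the single-mode starting approximation, and the grid search over the mixing weight below are my own toy construction, not the cited paper's algorithm: the point is only that mixing in a new component at the missed mode strictly lowers the KL divergence to the target.

```python
import numpy as np

# Boosting-style mixture update on a 1-D toy target (illustration only):
# the target p is a two-mode Gaussian mixture, the current approximation
# q0 covers one mode, and we greedily pick the weight gamma for a new
# component h that most reduces KL(q || p).
x = np.linspace(-10, 10, 4001)
dx = x[1] - x[0]

def normal_pdf(x, m, sd):
    return np.exp(-0.5 * ((x - m) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

p = 0.5 * normal_pdf(x, -2, 1) + 0.5 * normal_pdf(x, 2, 1)   # target
q0 = normal_pdf(x, -2, 1)                                     # current approx
h = normal_pdf(x, 2, 1)                                       # new component

def kl(q, p):
    mask = q > 1e-300  # avoid log(0) where q vanishes
    return np.sum(q[mask] * np.log(q[mask] / p[mask])) * dx

gammas = np.linspace(0, 1, 101)
kls = [kl((1 - g) * q0 + g * h, p) for g in gammas]
best = gammas[int(np.argmin(kls))]
print(best, min(kls), kls[0])  # mixing in the new mode lowers the KL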

Variational inference and Gaussian mixtures

- 2016

Modern Bayesian inference typically requires some form of posterior approximation, and mean-field variational inference (MFVI) is an increasingly popular choice due to its speed. But MFVI can be…

Unbiased Implicit Variational Inference

- Computer Science, Mathematics · AISTATS
- 2019

UIVI considers an implicit variational distribution obtained in a hierarchical manner using a simple reparameterizable distribution whose variational parameters are defined by arbitrarily flexible deep neural networks and directly optimizes the evidence lower bound (ELBO).

Stochastic gradient variational Bayes for gamma approximating distributions

- Mathematics
- 2015

While stochastic variational inference is relatively well known for scaling inference in Bayesian probabilistic models, related methods also offer ways to circumnavigate the approximation of…

Proximity Variational Inference

- Mathematics, Computer Science · AISTATS
- 2018

PVI is a new method for optimizing the variational objective that constrains subsequent iterates of the variational parameters to robustify the optimization path; it consistently finds better local optima and gives better predictive performance.

Local Expectation Gradients for Black Box Variational Inference

- Computer Science, Mathematics · NIPS
- 2015

This algorithm divides the problem of estimating the stochastic gradients over multiple variational parameters into smaller sub-tasks so that each sub-task wisely explores the most relevant part of the variational distribution.

## References

Showing 1-10 of 36 references

Stochastic variational inference

- Computer Science, Mathematics · J. Mach. Learn. Res.
- 2013

Stochastic variational inference lets us apply complex Bayesian models to massive data sets, and it is shown that the Bayesian nonparametric topic model outperforms its parametric counterpart.

Black Box Variational Inference

- Mathematics, Computer Science · AISTATS
- 2014

This paper presents a "black box" variational inference algorithm, one that can be quickly applied to many models with little additional derivation, based on stochastic optimization of the variational objective where the noisy gradient is computed from Monte Carlo samples from the variational distribution.
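
The "black box" gradient described above is the score-function estimator: it needs only the ability to evaluate log p(y, θ) and to sample and differentiate log q. The toy model and variational family below are my own illustration, chosen conjugate so the noisy estimate can be checked against the exact gradient.

```python
import numpy as np

# Score-function ("black box") estimate of the ELBO gradient for a toy
# model (illustration only): y_i ~ N(theta, 1), prior theta ~ N(0, 1),
# Gaussian variational family q(theta) = N(mu, s^2). No model-specific
# gradient derivation is needed -- only log p and grad log q.
rng = np.random.default_rng(1)
y = rng.normal(1.0, 1.0, size=20)
n = len(y)
mu, s = 0.0, 0.5

theta = rng.normal(mu, s, size=200_000)
log_p = -0.5 * ((y[:, None] - theta) ** 2).sum(axis=0) - 0.5 * theta ** 2
log_q = -0.5 * ((theta - mu) / s) ** 2 - np.log(s)
score_mu = (theta - mu) / s ** 2            # d/dmu log q(theta)
grad_est = (score_mu * (log_p - log_q)).mean()

# For this conjugate model the ELBO gradient w.r.t. mu has a closed form,
# so we can check the noisy Monte Carlo estimate against it.
grad_exact = y.sum() - (n + 1) * mu
print(grad_est, grad_exact)
```

The estimator is unbiased but noisy; the cited paper's practical contribution is variance reduction (Rao-Blackwellization and control variates) on top of exactly this quantity.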

Auto-Encoding Variational Bayes

- Mathematics, Computer Science · ICLR
- 2014

A stochastic variational inference and learning algorithm that scales to large datasets and, under some mild differentiability conditions, even works in the intractable case is introduced.
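
The key device in this line of work is the reparameterization ("pathwise") gradient estimator, which typically has far lower variance than the score-function estimator for the same gradient. A quick side-by-side check on a toy conjugate model (my own illustration, not the paper's experiment):

```python
import numpy as np

# Compare two unbiased estimators of the same ELBO gradient (w.r.t. mu)
# on a toy model: y_i ~ N(theta, 1), theta ~ N(0, 1), q = N(mu, s^2).
rng = np.random.default_rng(2)
y = rng.normal(1.0, 1.0, size=20)
n = len(y)
mu, s = 0.0, 0.5
eps = rng.normal(size=100_000)
theta = mu + s * eps                      # reparameterized draws from q

# Pathwise estimator: differentiate through theta = mu + s * eps.
pathwise = y.sum() - (n + 1) * theta      # d/dtheta log p(y, theta)

# Score-function estimator: grad log q times (log p - log q).
log_p = -0.5 * ((y[:, None] - theta) ** 2).sum(axis=0) - 0.5 * theta ** 2
log_q = -0.5 * eps ** 2 - np.log(s)
score = ((theta - mu) / s ** 2) * (log_p - log_q)

print(pathwise.mean(), score.mean())      # both target y.sum() - 21 * mu
print(pathwise.std(), score.std())        # pathwise is far less noisy
```

Both sample means agree (the estimators are unbiased for the same gradient), but the per-sample standard deviation of the pathwise estimator is several times smaller here, which is why reparameterization scales to large models.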

An Introduction to Variational Methods for Graphical Models

- Mathematics, Computer Science · Machine Learning
- 2004

This paper presents a tutorial introduction to the use of variational methods for inference and learning in graphical models (Bayesian networks and Markov random fields), and describes a general framework for generating variational transformations based on convex duality.

Doubly Stochastic Variational Bayes for non-Conjugate Inference

- Mathematics, Computer Science · ICML
- 2014

A simple and effective variational inference algorithm based on stochastic optimisation is proposed; it can be widely applied for Bayesian non-conjugate inference in continuous parameter spaces and allows for efficient use of gradient information from the model joint density.

Variational Message Passing

- Mathematics, Computer Science · J. Mach. Learn. Res.
- 2005

Variational Message Passing is introduced, a general-purpose algorithm for applying variational inference to Bayesian networks; it can be applied to a very general class of conjugate-exponential models because it uses a factorised variational approximation.

Automated Variational Inference in Probabilistic Programming

- Mathematics, Computer Science · ArXiv
- 2013

A new algorithm for approximate inference in probabilistic programs, based on a stochastic gradient for variational programs, is presented; it is efficient without restrictions on the probabilistic program and improves inference efficiency over other algorithms.

On Using Control Variates with Stochastic Approximation for Variational Bayes and its Connection to Stochastic Linear Regression

- Mathematics
- 2014

Recently, we and several other authors have written about the possibilities of using stochastic approximation techniques for fitting variational approximations to intractable Bayesian posterior…

Stochastic Backpropagation and Approximate Inference in Deep Generative Models

- Computer Science, Mathematics · ICML
- 2014

We marry ideas from deep neural networks and approximate Bayesian inference to derive a generalised class of deep, directed generative models, endowed with a new algorithm for scalable inference and…

Bayesian Inference for Nonnegative Matrix Factorisation Models

- Computer Science, Medicine · Comput. Intell. Neurosci.
- 2009

This work describes nonnegative matrix factorisation with a Kullback-Leibler (KL) error measure in a statistical framework, with a hierarchical generative model consisting of an observation and a prior component, and develops full Bayesian inference via variational Bayes or Monte Carlo.