Corpus ID: 17261541

Automated Variational Inference in Probabilistic Programming

@article{Wingate2013AutomatedVI,
  title={Automated Variational Inference in Probabilistic Programming},
  author={David Wingate and Th{\'e}ophane Weber},
  journal={ArXiv},
  year={2013},
  volume={abs/1301.1299}
}
We present a new algorithm for approximate inference in probabilistic programs, based on a stochastic gradient for variational programs. This method is efficient without restrictions on the probabilistic program; it is particularly practical for distributions which are not analytically tractable, including highly structured distributions that arise in probabilistic programs. We show how to automatically derive mean-field probabilistic programs and optimize them, and demonstrate that our…
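The abstract is truncated above, but the core technique it names, a stochastic (score-function) gradient of the ELBO for a mean-field variational program, can be sketched roughly as follows; the toy model, parameter names, and step size are illustrative assumptions, not the paper's implementation:

```python
# Hedged sketch: score-function (likelihood-ratio) stochastic gradient of the
# ELBO for a single mean-field Gaussian factor. The toy model (z ~ N(0,1),
# x | z ~ N(z,1)), parameter names, and step size are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def log_joint(z, x):
    # log p(x, z) up to additive constants for the toy model above.
    return -0.5 * z**2 - 0.5 * (x - z)**2

def log_q(z, mu, log_sigma):
    # Mean-field factor q(z) = N(mu, sigma^2).
    sigma = np.exp(log_sigma)
    return -0.5 * ((z - mu) / sigma)**2 - log_sigma - 0.5 * np.log(2 * np.pi)

def grad_log_q(z, mu, log_sigma):
    # Score function: gradient of log q(z) with respect to (mu, log_sigma).
    sigma = np.exp(log_sigma)
    return np.array([(z - mu) / sigma**2, ((z - mu) / sigma)**2 - 1.0])

def elbo_grad(mu, log_sigma, x, num_samples=100):
    # Unbiased estimate of grad ELBO: E_q[(log p(x,z) - log q(z)) * grad log q(z)].
    g = np.zeros(2)
    for _ in range(num_samples):
        z = rng.normal(mu, np.exp(log_sigma))
        g += (log_joint(z, x) - log_q(z, mu, log_sigma)) * grad_log_q(z, mu, log_sigma)
    return g / num_samples

mu, log_sigma, x = 0.0, 0.0, 2.5
for _ in range(2000):                      # stochastic gradient ascent on the ELBO
    g = elbo_grad(mu, log_sigma, x)
    mu, log_sigma = mu + 0.01 * g[0], log_sigma + 0.01 * g[1]

# Exact posterior for this toy model is N(1.25, 0.5), so mu ~ 1.25 and sigma ~ 0.71.
print(mu, np.exp(log_sigma))
```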

Citations

Effective Monte Carlo Variational Inference for Binary-Variable Probabilistic Programs
We propose a broadly applicable variational inference algorithm for probabilistic models with binary latent variables, using sampling to approximate the expectations required for coordinate ascent.
Symbolic Exact Inference for Discrete Probabilistic Programs
This work provides a semantic and algorithmic foundation for efficient exact inference on discrete-valued finite-domain imperative probabilistic programs and shows that the inference approach is competitive with inference procedures specialized for Bayesian networks, thereby expanding the class of probabilistic programs that can be practically analyzed.
Towards verified stochastic variational inference for probabilistic programs
This paper analyses one of the most fundamental and versatile variational inference algorithms, the score estimator (also known as REINFORCE), using tools from denotational semantics and program analysis; it formally expresses what the algorithm does on models denoted by programs and exposes the implicit assumptions the algorithm makes about those models.
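For orientation, the score-function (REINFORCE) gradient identity that this entry analyzes can be written in its standard form (not quoted from the paper):

```latex
% Score-function (REINFORCE) gradient estimator, standard form.
\nabla_\theta \, \mathbb{E}_{z \sim q_\theta}\!\left[ f(z) \right]
  = \mathbb{E}_{z \sim q_\theta}\!\left[ f(z)\, \nabla_\theta \log q_\theta(z) \right]
  \approx \frac{1}{N} \sum_{n=1}^{N} f\!\left(z^{(n)}\right) \nabla_\theta \log q_\theta\!\left(z^{(n)}\right),
  \qquad z^{(n)} \sim q_\theta .
```

For the ELBO one takes f(z) = log p(x, z) - log q_theta(z); the extra term from the theta-dependence of f vanishes in expectation because E_{q_theta}[grad_theta log q_theta(z)] = 0.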
Automatic Variational Inference in Stan
An automatic variational inference algorithm, automatic differentiation variational inference (ADVI), is implemented in Stan, a probabilistic programming system, and can be used on any model written in Stan.
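The entry above does not spell out the mechanism, but the core idea behind ADVI, transforming parameters to an unconstrained space and fitting a Gaussian by stochastic gradient ascent on the ELBO using reparameterized samples, can be sketched roughly as below; the toy target, learning rate, and variable names are assumptions for illustration, not Stan's implementation.

```python
# Hedged sketch of the reparameterization-gradient idea behind ADVI: fit a
# diagonal Gaussian q(z) = N(mu, diag(exp(omega))^2) to an unconstrained target
# by stochastic gradient ascent on the ELBO, writing z = mu + exp(omega) * eps.
# The toy target, learning rate, and names are assumptions, not Stan's code.
import numpy as np

rng = np.random.default_rng(1)

def grad_log_p(z):
    # Gradient of a toy unconstrained log target: standard normal shifted to mean 3.
    return -(z - 3.0)

dim, lr = 2, 0.05
mu = np.zeros(dim)
omega = np.zeros(dim)                     # log standard deviations

for _ in range(3000):
    eps = rng.standard_normal(dim)
    sigma = np.exp(omega)
    z = mu + sigma * eps                  # reparameterized draw from q
    g = grad_log_p(z)                     # pathwise gradient of the log target
    mu += lr * g                          # d ELBO / d mu
    omega += lr * (g * sigma * eps + 1.0) # d ELBO / d omega (+1 from the Gaussian entropy term)

print(mu, np.exp(omega))                  # should approach mean 3 and standard deviation 1
```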
Stochastically Differentiable Probabilistic Programs
This work presents a novel approach to running inference efficiently and robustly in probabilistic programs with mixed support using the stochastic gradient Markov chain Monte Carlo family of algorithms, and demonstrates that it outperforms existing composing inference baselines and works almost as well as inference in marginalized versions of the programs.
Deep Amortized Inference for Probabilistic Programs
A system for amortized inference in probabilistic programming languages (PPLs) is proposed in the form of a parameterized guide program; the work explores in detail the common machine learning pattern in which a 'local' model is specified by 'global' random values and used to generate independent observed data points, which gives rise to amortized local inference supporting global model learning.
Automatic variational inference with cascading flows
Cascading flows are introduced: a new family of variational programs that can be constructed automatically from an input probabilistic program, can also be amortized automatically, and achieve much higher performance than both normalizing flows and ASVI on a large set of structured inference problems.
Probabilistic Models with Deep Neural Networks
An overview of the main concepts, methods, and tools needed to use deep neural networks within a probabilistic modeling framework is provided.
Reinforced Variational Inference
The problem of learning approximate posterior distributions in variational inference (VI) is formally mapped onto the policy optimization problem in reinforcement learning (RL), explaining this connection at two levels.
Applications of Probabilistic Programming (Master's thesis, 2015)
This thesis describes work on the learning of probabilistic program code given specifications, in particular program code for one-dimensional samplers, and on the facilitation of sequential Monte Carlo inference with the help of data-driven proposals.

References

Showing 1-10 of 47 references.
Stochastic variational inference
Stochastic variational inference lets us apply complex Bayesian models to massive data sets, and it is shown that the Bayesian nonparametric topic model outperforms its parametric counterpart.
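As a rough illustration of the mechanism behind stochastic variational inference, noisy natural-gradient updates of global variational parameters computed from minibatches, here is a minimal sketch on a toy conjugate model; the model, step-size schedule, and all names are assumptions, not the paper's experiments.

```python
# Hedged sketch of stochastic variational inference (SVI) on a toy conjugate
# model: mu ~ N(0, 1), x_i | mu ~ N(mu, 1). Global variational parameters are
# updated with noisy natural gradients computed from minibatches. The model,
# step-size schedule, and names are assumptions, not the paper's experiments.
import numpy as np

rng = np.random.default_rng(3)

N = 10_000
data = 1.7 + rng.standard_normal(N)        # synthetic data with true mean 1.7

# Natural parameters of q(mu) = N(m, s^2): eta = (m / s^2, -1 / (2 s^2)).
eta = np.array([0.0, -0.5])                # initialize at the prior N(0, 1)
batch_size, tau, kappa = 100, 1.0, 0.7

for t in range(1, 501):
    batch = rng.choice(data, size=batch_size, replace=False)
    # Natural parameters of the batch-optimal q, rescaling the minibatch to the full data set.
    eta_hat = np.array([
        (N / batch_size) * batch.sum(),    # prior mean 0 contributes nothing here
        -0.5 - 0.5 * N,                    # prior precision 1 plus N unit-precision likelihood terms
    ])
    rho = (t + tau) ** (-kappa)            # Robbins-Monro step size
    eta = (1.0 - rho) * eta + rho * eta_hat

precision = -2.0 * eta[1]
# Posterior mean should be close to data.mean(), standard deviation to 1/sqrt(N + 1).
print(eta[0] / precision, 1.0 / np.sqrt(precision))
```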
A stochastic approximation method for inference in probabilistic graphical models
A new algorithmic framework for inference in probabilistic models is presented and applied to latent Dirichlet allocation (LDA); it offers a principled means to exchange the variance of an importance sampling estimate for the bias incurred through variational approximation.
Structured Variational Inference Procedures and their Realizations
We describe and prove the convergence of several algorithms for approximate structured variational inference. We discuss the computational cost of these algorithms and describe their relationship to…
Variational Program Inference
The guide program is used as a proposal distribution in importance sampling to statistically prove lower bounds on the probability of the evidence and on the probabilities of a hypothesis and the evidence.
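A minimal sketch of the underlying idea, using a guide distribution as an importance-sampling proposal to estimate (and, via Jensen's inequality, lower-bound in expectation) the log evidence; the toy model and guide below are assumptions, not the cited paper's code.

```python
# Hedged sketch: importance sampling of the evidence p(x) = E_q[ p(x, z) / q(z) ]
# with a guide/proposal q. Averaging importance weights gives an unbiased estimate
# of p(x); the log of that average is, in expectation, a lower bound on log p(x).
import numpy as np

rng = np.random.default_rng(2)
x = 2.5                                   # observed datum

def log_joint(z, x):
    # Toy model: z ~ N(0, 1), x | z ~ N(z, 1), including normalizing constants.
    return (-0.5 * z**2 - 0.5 * np.log(2 * np.pi)
            - 0.5 * (x - z)**2 - 0.5 * np.log(2 * np.pi))

# Guide q(z) = N(1.0, 1.0): a deliberately imperfect proposal.
q_mu, q_sigma = 1.0, 1.0
z = q_mu + q_sigma * rng.standard_normal(100_000)
log_q = -0.5 * ((z - q_mu) / q_sigma)**2 - np.log(q_sigma) - 0.5 * np.log(2 * np.pi)

log_w = log_joint(z, x) - log_q           # log importance weights
log_evidence_estimate = np.logaddexp.reduce(log_w) - np.log(len(z))

# Exact evidence for this toy model: x ~ N(0, 2).
exact = -0.5 * x**2 / 2.0 - 0.5 * np.log(2 * np.pi * 2.0)
print(log_evidence_estimate, exact)
```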
Natural Conjugate Gradient in Variational Inference
This work proposes using the geometry of the variational approximating distribution to speed up a conjugate gradient method for variational learning and inference, and shows significant speedups over alternative learning algorithms.
An Introduction to Variational Methods for Graphical Models
This paper presents a tutorial introduction to the use of variational methods for inference and learning in graphical models (Bayesian networks and Markov random fields), and describes a general framework for generating variational transformations based on convex duality.
Structured Variational Distributions in VIBES
This paper presents an extension of VIBES in which the variational posterior distribution corresponds to a sub-graph of the full probabilistic model, which can produce much closer approximations to the true posterior distribution.
Variational algorithms for approximate Bayesian inference
A unified variational Bayesian (VB) framework that approximates computations in models with latent variables using a lower bound on the marginal likelihood is presented and compared to other methods, including sampling, Cheeseman-Stutz, and asymptotic approximations such as BIC.
Hybrid Variational/Gibbs Collapsed Inference in Topic Models
This paper proposes a hybrid algorithm that combines the best of both worlds: it samples very small counts and applies variational updates to large counts, and is shown to significantly improve test-set perplexity relative to variational inference at no computational cost.
Report on the probabilistic language Scheme
Probabilistic Scheme is presented, an embedding of probabilistic computation into Scheme that gives programmers an expressive language for implementing modular probabilistic models that integrate naturally with the rest of Scheme.