Attention for Inference Compilation

@inproceedings{Harvey2019AttentionFI,
  title={Attention for Inference Compilation},
  author={William Harvey and Andreas Munk and At{\i}l{\i}m G{\"u}ne{\c s} Baydin and Alexander Bergholm and Frank D. Wood},
  booktitle={International Conference on Simulation and Modeling Methodologies, Technologies and Applications},
  year={2019}
}
We present a new approach to automatic amortized inference in universal probabilistic programs which improves performance compared to current methods. Our approach is a variation of inference compilation (IC) which leverages deep neural networks to approximate a posterior distribution over latent variables in a probabilistic program. A challenge with existing IC network architectures is that they can fail to model long-range dependencies between latent variables. To address this, we introduce… 
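The abstract above describes the core of inference compilation: a neural network is trained on samples from the generative model so that, at inference time, it maps an observation directly to the parameters of a proposal distribution over the latents. As a hedged illustration only (a linear "network" and a toy conjugate model, not the paper's architecture), the idea can be sketched as:

```python
# Minimal sketch of inference compilation (illustrative, not the paper's
# architecture): fit a regressor on prior samples so that, at test time,
# it maps an observation straight to proposal parameters for the latent.
import numpy as np

rng = np.random.default_rng(0)

# Generative model: mu ~ N(0, 1); x ~ N(mu, 0.5^2)
def simulate(n):
    mu = rng.normal(0.0, 1.0, size=n)
    x = rng.normal(mu, 0.5)
    return mu, x

# "Compilation" phase: train on joint samples (here, a linear least-squares
# fit stands in for the deep network).
mu_train, x_train = simulate(10_000)
X = np.stack([x_train, np.ones_like(x_train)], axis=1)
w, *_ = np.linalg.lstsq(X, mu_train, rcond=None)

# Amortized proposal q(mu | x): a Gaussian centred at the network's output.
def proposal_mean(x_obs):
    return w[0] * x_obs + w[1]

# For this conjugate model the analytic posterior mean is
# E[mu | x] = x / (1 + 0.5^2) = 0.8 x, so the learned map should be close.
print(proposal_mean(1.0))  # close to 0.8
```

The training cost is paid once, up front; every subsequent observation is handled with a single forward pass, which is what "amortized" means here.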

Accelerating Metropolis-Hastings with Lightweight Inference Compilation

Experimental results show Lightweight Inference Compilation can produce proposers which have fewer parameters, greater robustness to nuisance random variables, and improved posterior sampling in a Bayesian logistic regression and $n$-schools inference application.

Type-Preserving, Dependence-Aware Guide Generation for Sound, Effective Amortized Probabilistic Inference

Despite the control-flow expressiveness allowed by the universal PPL, the generated guides are guaranteed to satisfy a critical soundness condition and, moreover, consistently improve training and inference over state-of-the-art baselines on a suite of benchmarks.

Probabilistic surrogate networks for simulators with unbounded randomness

We present a framework for automatically structuring and training fast, approximate, deep neural surrogates of stochastic simulators. Unlike traditional approaches to surrogate modeling, our…

Spacecraft Collision Risk Assessment with Probabilistic Programming

It is shown that the probabilistic programming approach to conjunction assessment can help in making predictions and in finding the parameters that explain the observed data in conjunction data messages, thus shedding more light on key variables and orbital characteristics that more likely lead to conjunction events.

References

Showing 1–10 of 27 references

Inference Compilation and Universal Probabilistic Programming

We introduce a method for using deep neural networks to amortize the cost of inference in models from the family induced by universal probabilistic programming languages, establishing a framework…

Efficient Probabilistic Inference in the Quest for Physics Beyond the Standard Model

We present a novel probabilistic programming framework that couples directly to existing large-scale simulators through a cross-platform probabilistic execution protocol, which allows general-purpose…

Venture: a higher-order probabilistic programming platform with programmable inference

Stochastic regeneration is shown to achieve linear runtime scaling in cases where many previous approaches scaled quadratically, and stochastic regeneration and the SPI are used to implement general-purpose inference strategies such as Metropolis-Hastings, Gibbs sampling, and blocked proposals based on particle Markov chain Monte Carlo and mean-field variational inference techniques.

A New Approach to Probabilistic Programming Inference

A new approach to inference in expressive probabilistic programming languages based on particle Markov chain Monte Carlo is introduced, supporting accurate inference in models that make use of complex control flow, including stochastic recursion.

Probabilistic programming

This paper describes the connections that the research area called "probabilistic programming" has with programming languages and software engineering, including language design and the static and dynamic analysis of programs.

Lightweight Implementations of Probabilistic Programming Languages Via Transformational Compilation

This work describes a general method of transforming arbitrary programming languages into probabilistic programming languages with straightforward MCMC inference engines, and illustrates the technique on Bher, a compiled version of the Church language which eliminates interpretive overhead of the original MIT-Church implementation, and Stochastic Matlab, a new open-source language.
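The transformation described here turns each random choice in a program into a named entry in a trace, over which a straightforward MCMC engine can operate. A hedged sketch of that idea, with a toy one-latent model (the names and model are illustrative, not from the paper):

```python
# Illustrative sketch of the "lightweight" trace idea: random choices live in
# a name -> value trace, and Metropolis-Hastings re-proposes one choice, then
# re-scores the program. Model and names here are hypothetical examples.
import math, random

random.seed(0)

def log_normal_pdf(x, mu, sigma):
    return -0.5 * ((x - mu) / sigma) ** 2 - math.log(sigma * math.sqrt(2 * math.pi))

# Program: mu ~ N(0, 1); observe x_obs = 1.0 with x ~ N(mu, 0.5)
def log_joint(trace, x_obs=1.0):
    mu = trace["mu"]
    return log_normal_pdf(mu, 0.0, 1.0) + log_normal_pdf(x_obs, mu, 0.5)

def mh(n_iters=20_000):
    trace = {"mu": 0.0}
    samples = []
    for _ in range(n_iters):
        proposed = dict(trace)
        proposed["mu"] = trace["mu"] + random.gauss(0.0, 0.5)  # symmetric walk
        # Accept with probability min(1, p(proposed) / p(current))
        if math.log(random.random()) < log_joint(proposed) - log_joint(trace):
            trace = proposed
        samples.append(trace["mu"])
    return samples

samples = mh()
post_mean = sum(samples[5000:]) / len(samples[5000:])
# For this conjugate model the true posterior mean is 0.8
```

Because the trace is just a dictionary keyed by choice names, the same engine works for any program that records its choices this way, which is what makes the transformation "lightweight".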

Church: a language for generative models

This work introduces Church, a universal language for describing stochastic generative processes, based on the Lisp model of lambda calculus, containing a pure Lisp as its deterministic subset.

Picture: A probabilistic programming language for scene perception

Picture is presented, a probabilistic programming language for scene understanding that allows researchers to express complex generative vision models, while automatically solving them using fast general-purpose inference machinery.

Amortized Inference in Probabilistic Reasoning

It is argued that the brain operates in the setting of amortized inference, where numerous related queries must be answered (e.g., recognizing a scene from multiple viewpoints); in this setting, memoryless algorithms can be computationally wasteful.

An Introduction to Probabilistic Programming

This document starts with a discussion of model-based reasoning and explains why conditioning as a foundational computation is central to the fields of probabilistic machine learning and artificial intelligence, and introduces a simple first-order probabilistic programming language whose programs define static-computation-graph, finite-variable-cardinality models.