• Corpus ID: 204800956

Amortized Rejection Sampling in Universal Probabilistic Programming

Saeid Naderiparizi, Adam Ścibior, Andreas Munk, Mehrdad Ghadiri, Atilim Gunes Baydin, Bradley Gram-Hansen, Christian Schroeder de Witt, Robert Zinkov, Philip H. S. Torr, Tom Rainforth, Yee Whye Teh, Frank D. Wood. In International Conference on Artificial Intelligence and Statistics.
Existing approaches to amortized inference in probabilistic programs with unbounded loops can produce estimators with infinite variance. An instance of this is importance sampling inference in programs that explicitly include rejection sampling as part of the user-programmed generative procedure. In this paper we develop a new and efficient amortized importance sampling estimator. We prove finite variance of our estimator and empirically demonstrate our method's correctness and efficiency… 
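As a rough illustration of the failure mode the abstract describes (a sketch under assumed names and parameters, not the paper's estimator), the program below contains an explicit rejection loop: sample x ~ N(0, 1) until x > 1. Naive trace-based importance sampling must propose and weight every internal draw, rejected attempts included, so the weight is a product over an unbounded number of factors; with a mismatched proposal this product can inflate the weight variance. The function `weighted_trace` and its proposal parameters `q_mu`, `q_sigma` are illustrative choices:

```python
import math
import random

def normal_logpdf(x, mu, sigma):
    """Log density of N(mu, sigma^2) at x."""
    return -0.5 * math.log(2 * math.pi * sigma * sigma) \
           - (x - mu) ** 2 / (2 * sigma * sigma)

def weighted_trace(lo=1.0, q_mu=0.5, q_sigma=1.0):
    """Run a generative program with a user-programmed rejection loop
    (sample x ~ N(0, 1) until x > lo), proposing each internal draw from
    N(q_mu, q_sigma).  The importance weight multiplies a prior/proposal
    ratio for *every* attempt -- rejected ones included -- so the number
    of weight factors is unbounded, which is the source of the variance
    problem the paper addresses."""
    log_w = 0.0
    while True:
        x = random.gauss(q_mu, q_sigma)   # proposal draw for this attempt
        log_w += normal_logpdf(x, 0.0, 1.0) - normal_logpdf(x, q_mu, q_sigma)
        if x > lo:                        # acceptance test inside the model
            return x, math.exp(log_w)

if __name__ == "__main__":
    random.seed(0)
    traces = [weighted_trace() for _ in range(2000)]
    total = sum(w for _, w in traces)
    estimate = sum(x * w for x, w in traces) / total
    print(f"self-normalized estimate of E[x | x > 1]: {estimate:.3f}")
```

Every returned sample satisfies x > 1 by construction; the weights correct for proposing from N(0.5, 1) instead of the model's N(0, 1) at each attempt.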


Recursive Monte Carlo and Variational Inference with Auxiliary Variables

RAVI generalizes and unifies several existing methods for inference with expressive approximating families, shows that they correspond to specific choices of meta-inference algorithm, and provides new theory for analyzing their bias and variance.

Planning as Inference in Epidemiological Dynamics Models

This work demonstrates the use of a probabilistic programming language that automates inference in existing simulators and shows how such simulation-based models and inference automation tools applied in support of policy-making could lead to less economically damaging policy prescriptions, particularly during the current COVID-19 pandemic.

Simulation-Based Inference for Global Health Decisions

Recent breakthroughs in machine learning, specifically in simulation-based inference, are discussed, and its potential as a novel venue for model calibration to support the design and evaluation of public health interventions is explored.

Imagining The Road Ahead: Multi-Agent Trajectory Prediction via Differentiable Simulation

A deep generative model built on a fully differentiable simulator for multi-agent trajectory prediction, named ITRA for "Imagining the Road Ahead," produces realistic multi-modal predictions without any ad-hoc diversity-inducing losses.

Inference Compilation and Universal Probabilistic Programming

We introduce a method for using deep neural networks to amortize the cost of inference in models from the family induced by universal probabilistic programming languages, establishing a framework…

A New Approach to Probabilistic Programming Inference

A new approach to inference in expressive probabilistic programming languages, based on particle Markov chain Monte Carlo, is introduced; it supports accurate inference in models that make use of complex control flow, including stochastic recursion.

Etalumis: bringing probabilistic programming to scientific simulators at scale

A novel PPL framework that couples directly to existing scientific simulators through a cross-platform probabilistic execution protocol and provides Markov chain Monte Carlo (MCMC) and deep-learning-based inference compilation (IC) engines for tractable inference is presented.

Automated Variational Inference in Probabilistic Programming

A new algorithm for approximate inference in probabilistic programs, based on a stochastic gradient for variational programs, is presented; it is efficient without restrictions on the probabilistic program and improves inference efficiency over other algorithms.

Venture: a higher-order probabilistic programming platform with programmable inference

Stochastic regeneration is shown to achieve linear runtime scaling in cases where many previous approaches scaled quadratically, and it is shown how stochastic regeneration and the SPI can be used to implement general-purpose inference strategies such as Metropolis-Hastings, Gibbs sampling, and blocked proposals based on particle Markov chain Monte Carlo and mean-field variational inference techniques.

Efficient Probabilistic Inference in the Quest for Physics Beyond the Standard Model

We present a novel probabilistic programming framework that couples directly to existing large-scale simulators through a cross-platform probabilistic execution protocol, which allows general-purpose…

Denotational validation of higher-order Bayesian inference

A modular semantic account of Bayesian inference algorithms for probabilistic programming languages, as used in data science and machine learning, is presented and Kock's synthetic measure theory is used to emphasize the connection between the semantic manipulation and its traditional measure theoretic origins.

Church: a language for generative models

This work introduces Church, a universal language for describing stochastic generative processes, based on the Lisp model of lambda calculus, containing a pure Lisp as its deterministic subset.

Amortized Inference in Probabilistic Reasoning

It is argued that the brain operates in the setting of amortized inference, where numerous related queries must be answered (e.g., recognizing a scene from multiple viewpoints); in this setting, memoryless algorithms can be computationally wasteful.
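A toy rendering of this idea (purely illustrative; the model and function names are assumptions, not from the paper): pay a one-time cost to build a shared map from observation summaries to posterior parameters, then let many related queries reuse it instead of rerunning inference from scratch. A conjugate Beta-Bernoulli model makes the map exact, and memoization stands in for the amortization cost:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def posterior_params(heads: int, tails: int) -> tuple:
    """Map a data summary to the exact Beta posterior under a Beta(1, 1)
    prior on a Bernoulli coin.  The cache plays the role of the one-time
    cost an amortized system pays so repeated, related queries are cheap."""
    return 1 + heads, 1 + tails

def posterior_mean(heads: int, tails: int) -> float:
    """Answer a query (posterior mean of the coin bias) from the shared map."""
    a, b = posterior_params(heads, tails)
    return a / (a + b)

if __name__ == "__main__":
    # Repeated and related queries hit the same amortized computation.
    for h, t in [(3, 1), (3, 1), (7, 2)]:
        print(f"heads={h}, tails={t}: posterior mean={posterior_mean(h, t):.3f}")
```

A memoized closed-form posterior is only an analogy for the learned recognition models used in amortized inference, but it shows the trade the summary describes: shared up-front work versus memoryless recomputation per query.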

Lightweight Implementations of Probabilistic Programming Languages Via Transformational Compilation

This work describes a general method of transforming arbitrary programming languages into probabilistic programming languages with straightforward MCMC inference engines, and illustrates the technique on Bher, a compiled version of the Church language which eliminates interpretive overhead of the original MIT-Church implementation, and Stochastic Matlab, a new open-source language.