Deriving Probability Density Functions from Probabilistic Functional Programs

@article{Bhat2017DerivingPD,
  title={Deriving Probability Density Functions from Probabilistic Functional Programs},
  author={Sooraj Bhat and Johannes Borgstr{\"o}m and Andrew D. Gordon and Claudio V. Russo},
  journal={ArXiv},
  year={2017},
  volume={abs/1704.00917}
}
The probability density function of a probability distribution is a fundamental concept in probability theory and a key ingredient in various widely used machine learning methods. However, the necessary framework for compiling probabilistic functional programs to density functions has only recently been developed. In this work, we present a density compiler for a probabilistic language with failure and both discrete and continuous distributions, and provide a proof of its soundness. The …
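
As a rough illustration of what such a compiler does, here is a minimal Haskell sketch under assumptions of my own: the expression type and its constructors (Expr, Uniform, Gaussian, Scale, Shift) and the function density are illustrative names, not the paper's language, typing rules, or compilation scheme.

-- Minimal sketch only: expressions denote distributions over the reals, and
-- `density` maps an expression to a probability density function.
data Expr
  = Uniform                 -- uniform distribution on [0,1]
  | Gaussian Double Double  -- mean and standard deviation
  | Scale Double Expr       -- multiply a sample by a non-zero constant
  | Shift Double Expr       -- add a constant to a sample

density :: Expr -> Double -> Double
density Uniform        x = if 0 <= x && x <= 1 then 1 else 0
density (Gaussian m s) x = exp (- (x - m) ^ 2 / (2 * s ^ 2)) / (s * sqrt (2 * pi))
density (Scale c e)    x = density e (x / c) / abs c   -- change of variables
density (Shift c e)    x = density e (x - c)           -- translation

main :: IO ()
main = print (density (Shift 1 (Scale 2 Uniform)) 1.5)  -- uniform on [1,3]: 0.5

The paper's compiler additionally handles failure and discrete distributions, which this sketch omits.
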
Deriving a probability density calculator (functional pearl)
It turns out that a compositional procedure for finding a density can be derived, by equational reasoning about integrals, starting with the mathematical specification of what a density is.
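
In ordinary notation (not the pearl's own formalization), the specification such a derivation starts from is that $f$ is a density of a measure $\mu$ on the reals when
\[
  \mu(A) = \int_A f(x)\,\mathrm{d}x \quad \text{for every measurable } A,
\]
and equational reasoning about such integrals yields compositional rules, for example the change-of-variables rule for an invertible, differentiable $g$:
\[
  f_{g(X)}(y) = f_X\bigl(g^{-1}(y)\bigr)\,\Bigl|\tfrac{\mathrm{d}}{\mathrm{d}y}\,g^{-1}(y)\Bigr|.
\]
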
Practical probabilistic programming with monads
This work uses a GADT as the underlying representation of a probability distribution, applies Sequential Monte Carlo-based methods to achieve efficient inference, and defines a formal semantics via measure theory.
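
To make the representation idea concrete, here is a minimal Haskell sketch of a GADT-backed probability monad, interpreted by a naive forward sampler rather than the Sequential Monte Carlo machinery the paper describes. The names (Dist, Unif, sample, twoCoins) are mine, not the library's API, and the sketch assumes the random package.

{-# LANGUAGE GADTs #-}
-- Sketch only: a distribution is represented syntactically as a GADT and then
-- interpreted; here the only interpreter is forward sampling in IO.
import System.Random (randomRIO)

data Dist a where
  Return :: a -> Dist a
  Bind   :: Dist b -> (b -> Dist a) -> Dist a
  Unif   :: Dist Double                      -- primitive: uniform on [0,1]

instance Functor Dist where fmap f d = Bind d (Return . f)
instance Applicative Dist where
  pure = Return
  df <*> dx = Bind df (\f -> fmap f dx)
instance Monad Dist where (>>=) = Bind

-- One interpreter of the representation: forward sampling.
sample :: Dist a -> IO a
sample (Return x) = pure x
sample (Bind d k) = sample d >>= sample . k
sample Unif       = randomRIO (0, 1)

-- A model written against the monadic interface.
twoCoins :: Dist Int
twoCoins = do
  a <- fmap (< 0.5) Unif
  b <- fmap (< 0.5) Unif
  pure (fromEnum a + fromEnum b)

main :: IO ()
main = sample twoCoins >>= print
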
Exact Symbolic Inference in Probabilistic Programs via Sum-Product Representations
This work formalizes SPPL in terms of a novel translation strategy from probabilistic programs to a semantic domain of sum-product representations, presents new algorithms for exactly conditioning on and computing probabilities of queries, and proves their soundness under the semantics.
A lambda-calculus foundation for universal probabilistic programming
This work adapts the classic operational semantics of the λ-calculus to a continuous setting by creating a measure space on terms and defining step-indexed approximations, and proves the equivalence of the big-step and small-step formulations of this distribution-based semantics.
Towards verified stochastic variational inference for probabilistic programs
This paper analyses one of the most fundamental and versatile variational inference algorithms, the score estimator (REINFORCE), using tools from denotational semantics and program analysis; it formally expresses what the algorithm does on models denoted by programs and exposes the implicit assumptions the algorithm makes about those models.
Compiling Markov chain Monte Carlo algorithms for probabilistic modeling
A compiler is described that transforms a probabilistic model written in a restricted modeling language, together with a query for posterior samples given observed data, into a Markov chain Monte Carlo (MCMC) inference algorithm that implements the query.
Efficient synthesis of probabilistic programs
The idea of "sketching" from the synthesis of deterministic programs is borrowed, and an efficient Markov chain Monte Carlo (MCMC) based synthesis algorithm is designed to instantiate the holes in the sketch with program fragments, synthesizing the probabilistic program that is most consistent with the data.
Symbolic Bayesian Inference by Lazy Partial Evaluation
Bayesian inference, of posterior knowledge based on prior knowledge and observed evidence, is typically implemented by applying Bayes's theorem, solving an equation in which the posterior multiplied …
Composing Inference Algorithms as Program Transformations
This work makes inference code generation modular by decomposing inference algorithms into reusable program-to-program transformations that perform exact inference as well as generate probabilistic programs that compute expectations, densities, and MCMC samples.
Symbolic Disintegration with a Variety of Base Measures
This work presents the first disintegrator that handles variable base measures, including discrete-continuous mixtures, dependent products, and disjoint sums; the disintegrator is derived and proved sound by equational reasoning from semantic specifications.

References

Showing 1-10 of 36 references
Deriving Probability Density Functions from Probabilistic Functional Programs
This work presents a density compiler for a probabilistic language with discrete and continuous distributions, and discrete observations, and provides a proof of its soundness; the compiler greatly reduces the development effort of domain experts.
A type theory for probability density functions
This work formalizes the first probabilistic language that exhibits continuous probability distributions, the ability to naturally express custom probabilistic models, and probability density functions (PDFs), and it serves as a foundational framework for extending the ideas to more general languages.
A Verified Compiler for Probability Density Functions
This work implements, within the theorem prover Isabelle, an inductive compiler that computes density functions for probability spaces described by programs in a probabilistic functional language, and gives a formal proof of its soundness.
Stochastic lambda calculus and monads of probability distributions
A translation of the stochastic lambda calculus into measure terms is given; measure terms can not only denote discrete probability distributions but also support the best known modeling techniques.
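
For comparison, the standard finite-support probability monad (a sketch of the textbook construction, not the paper's measure-term translation; the names P, expect, coin are mine) looks like this in Haskell, with expectation as the basic observation:

-- Sketch only: a distribution is a finite list of weighted outcomes, and
-- `expect` integrates a function against it.
newtype P a = P { runP :: [(a, Double)] }

instance Functor P where
  fmap f (P xs) = P [ (f x, w) | (x, w) <- xs ]

instance Applicative P where
  pure x = P [(x, 1)]
  pf <*> px = P [ (f x, v * w) | (f, v) <- runP pf, (x, w) <- runP px ]

instance Monad P where
  P xs >>= k = P [ (y, w * v) | (x, w) <- xs, (y, v) <- runP (k x) ]

expect :: (a -> Double) -> P a -> Double
expect f (P xs) = sum [ w * f x | (x, w) <- xs ]

coin :: P Bool
coin = P [(True, 0.5), (False, 0.5)]

main :: IO ()
main = print (expect (\b -> if b then 1 else 0) twoHeads)  -- prints 0.25
  where twoHeads = do { a <- coin; b <- coin; pure (a && b) }
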
The Category of Markov Kernels
  • P. Panangaden
  • Electron. Notes Theor. Comput. Sci., 1999
It is shown that the category of Markov kernels has a partially-additive structure and, as such, supports basic constructs like iteration, allowing one to give a probabilistic semantics for a language with while loops in the manner of Kozen.
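
For context, composition in this category is the usual one (standard notation, not the paper's presentation): a Markov kernel $k : X \rightsquigarrow Y$ assigns to each point $x$ a probability measure $k(x, -)$ on $Y$, and composition integrates out the intermediate point,
\[
  (k_2 \circ k_1)(x, A) = \int_Y k_2(y, A)\, k_1(x, \mathrm{d}y),
\]
which makes measurable spaces and kernels into a category, with point-mass kernels as identities.
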
Embedded Probabilistic Programming
This work uses delimited continuations to reify probabilistic programs as lazy search trees, which inference algorithms may traverse without imposing any interpretive overhead on deterministic parts of a model.
Church: a language for generative models
This work introduces Church, a universal language for describing stochastic generative processes, based on the Lisp model of lambda calculus, containing a pure Lisp as its deterministic subset.
Markov Logic
Markov logic attaches weights to first-order formulas and views them as templates for features of Markov networks, thereby combining first-order logic with probabilistic graphical models; it is the basis of the open-source Alchemy system.
Lightweight Implementations of Probabilistic Programming Languages Via Transformational Compilation
This work describes a general method of transforming arbitrary programming languages into probabilistic programming languages with straightforward MCMC inference engines, and illustrates the technique on Bher, a compiled version of the Church language that eliminates the interpretive overhead of the original MIT-Church implementation, and on Stochastic Matlab, a new open-source language.
A Language and Program for Complex Bayesian Modelling
This work describes general-purpose software currently being developed for implementing Gibbs sampling: BUGS (Bayesian inference Using Gibbs Sampling), which is written in Modula-2 and runs under both DOS and UNIX.