Corpus ID: 1617294

Church: a language for generative models

@article{Goodman2008ChurchAL,
  title={Church: a language for generative models},
  author={Noah D. Goodman and Vikash K. Mansinghka and Daniel M. Roy and Keith Bonawitz and Joshua B. Tenenbaum},
  journal={ArXiv},
  year={2008},
  volume={abs/1206.3255}
}
Formal languages for probabilistic modeling enable re-use, modularity, and descriptive clarity, and can foster generic inference techniques. We introduce Church, a universal language for describing stochastic generative processes. Church is based on the Lisp model of lambda calculus, containing a pure Lisp as its deterministic subset. The semantics of Church is defined in terms of evaluation histories and conditional distributions on such histories. Church also includes a novel language… 
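The abstract describes conditioning a stochastic generative process, which is the core idea behind Church-style queries. As a minimal illustrative sketch, in Python rather than Church's Scheme-like syntax (the model and the `flip`/`rejection_query` names are illustrative, not from the paper), rejection sampling realizes a conditional distribution over a generative program's outputs:

```python
import random

def flip(p=0.5):
    """A stochastic primitive: a weighted coin flip."""
    return random.random() < p

def model():
    """A tiny generative process: two coins and their disjunction."""
    a = flip(0.3)
    b = flip(0.3)
    return {"a": a, "b": b, "a_or_b": a or b}

def rejection_query(model, condition, n=10000):
    """Sample from the model conditioned on `condition` holding,
    by rerunning the generative process until it succeeds."""
    samples = []
    while len(samples) < n:
        trace = model()
        if condition(trace):
            samples.append(trace)
    return samples

samples = rejection_query(model, lambda t: t["a_or_b"])
p_a = sum(t["a"] for t in samples) / len(samples)
# Analytically, P(a | a or b) = 0.3 / (1 - 0.7 * 0.7) ≈ 0.588
```

Rejection is the simplest conditioning strategy; it is exact but only practical when the condition is not too rare, which is why the citing papers below focus on MCMC and other inference engines.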


Reduced Traces and JITing in Church
TLDR
Argues that an extremely general language can still support very fast inference, surveys the relevant implementation tradeoffs, and introduces theoretical aspects of a Church-like language that allow more detailed specification of inference.
Venture: a higher-order probabilistic programming platform with programmable inference
TLDR
Shows that stochastic regeneration achieves linear runtime scaling in cases where many previous approaches scaled quadratically, and shows how to use stochastic regeneration and the SPI to implement general-purpose inference strategies such as Metropolis-Hastings, Gibbs sampling, and blocked proposals based on particle Markov chain Monte Carlo and mean-field variational techniques.
A domain theory for statistical probabilistic programming
TLDR
Quasi-Borel predomains form both a model of Fiore's axiomatic domain theory and a model of Kock's synthetic measure theory, giving an adequate denotational semantics for languages with recursive higher-order types, continuous probability distributions, and soft constraints.
Trace types and denotational semantics for sound programmable inference in probabilistic languages
TLDR
This work presents a denotational semantics for programmable inference in higher-order probabilistic programming languages, along with a type system that ensures that well-typed inference programs are sound by construction.
A lambda-calculus foundation for universal probabilistic programming
TLDR
This work adapts the classic operational semantics of the λ-calculus to a continuous setting by creating a measure space on terms and defining step-indexed approximations, and proves the equivalence of the big-step and small-step formulations of this distribution-based semantics.
Inducing Probabilistic Programs by Bayesian Program Merging
TLDR
This report outlines an approach to learning generative models from data that expresses models as probabilistic programs, which allows them to capture abstract patterns within the examples, and considers two types of transformation: abstraction and deargumentation, which simplifies functions by reducing the number of arguments.
Automated learning with a probabilistic programming language: Birch
Divide, Conquer, and Combine: a New Inference Strategy for Probabilistic Programs with Stochastic Support
TLDR
This work introduces a new inference framework: Divide, Conquer, and Combine, which remains efficient for models where the support varies between executions, and shows how it can be implemented as an automated and generic PPS inference engine.
Lightweight Implementations of Probabilistic Programming Languages Via Transformational Compilation
TLDR
This work describes a general method of transforming arbitrary programming languages into probabilistic programming languages with straightforward MCMC inference engines, and illustrates the technique on Bher, a compiled version of the Church language which eliminates interpretive overhead of the original MIT-Church implementation, and Stochastic Matlab, a new open-source language.
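The transformational-compilation idea summarized above names each random choice in a program so that a generic single-site MCMC engine can re-propose one choice at a time. A highly simplified Python sketch, under the assumption of a uniform prior over named Boolean choices and a symmetric flip proposal (the model and the `score`/`single_site_mh` names are illustrative, not Bher's actual implementation):

```python
import random

def score(trace):
    """Unnormalized joint density: uniform prior over two coins times a
    soft observation that favors both coins coming up heads."""
    return 0.9 if (trace["a"] and trace["b"]) else 0.1

def single_site_mh(score, names, steps=20000):
    """Single-site Metropolis-Hastings over a database of named Boolean
    random choices: each step re-proposes one choice and accepts or
    rejects based on the score ratio (the flip proposal is symmetric,
    so the Hastings correction cancels)."""
    trace = {n: random.random() < 0.5 for n in names}
    samples = []
    for _ in range(steps):
        name = random.choice(names)          # pick one random choice
        proposal = dict(trace)
        proposal[name] = not proposal[name]  # symmetric flip proposal
        if random.random() < score(proposal) / score(trace):
            trace = proposal
        samples.append(dict(trace))
    return samples

samples = single_site_mh(score, ["a", "b"])
p_both = sum(t["a"] and t["b"] for t in samples) / len(samples)
# Stationary distribution gives P(a and b) = 0.9 / (0.9 + 3 * 0.1) = 0.75
```

Real lightweight implementations derive the choice names and the score automatically from the source program via compilation; the sketch hard-codes both to keep the MCMC kernel itself visible.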
Formal verification of higher-order probabilistic programs: reasoning about approximation, convergence, Bayesian inference, and optimization
TLDR
A suite of logics, collectively named PPV, is presented for proving properties of programs written in an expressive probabilistic higher-order language with continuous sampling operations and primitives for conditioning distributions; expressiveness is shown by giving sound embeddings of existing logics.

References

Showing 1-10 of 28 references
Random-World Semantics and Syntactic Independence for Expressive Languages
TLDR
This work proposes a syntactic independence criterion that holds for a broad class of highly expressive logics under random-world semantics and explores various examples including Bayesian networks, probabilistic context-free grammars, and an example from Mendelian genetics.
Adaptor Grammars: A Framework for Specifying Compositional Nonparametric Bayesian Models
TLDR
This paper presents a general-purpose inference algorithm for adaptor grammars, making it easy to define and use such models, and illustrates how several existing nonparametric Bayesian models can be expressed within this framework.
WinBUGS - A Bayesian modelling framework: Concepts, structure, and extensibility
TLDR
Discusses how and why various modern computing concepts, such as object-orientation and run-time linking, feature in the software's design, and how the framework may be extended.
IBAL: A Probabilistic Rational Programming Language
TLDR
A detailed account of the syntax and semantics of IBAL, a rational programming language for probabilistic and decision-theoretic agents, as well as an overview of the implementation are presented.
Markov logic networks
TLDR
Experiments with a real-world database and knowledge base in a university domain illustrate the promise of this approach to combining first-order logic and probabilistic graphical models in a single representation.
PRISM: A Language for Symbolic-Statistical Modeling
TLDR
It is shown by examples, together with learning results, that the most popular probabilistic modeling formalisms, such as the hidden Markov model and Bayesian networks, can be described by PRISM programs.
Stochastic Logic Programs
TLDR
Stochastic logic programs are introduced as a means of providing a structured definition of such a probability distribution, and it is shown that the probabilities can be computed directly for fail-free logic programs and by normalisation for arbitrary logic programs.
Probabilistic models with unknown objects
TLDR
This thesis introduces Bayesian logic (BLOG), a first-order probabilistic modeling language that specifies probability distributions over possible worlds with varying sets of objects, and defines a general framework for inference on BLOG models using Markov chain Monte Carlo algorithms.
Report on the probabilistic language scheme
TLDR
Presents Probabilistic Scheme, an embedding of probabilistic computation into Scheme that gives programmers an expressive language for implementing modular probabilistic models that integrate naturally with the rest of Scheme.
The Infinite PCFG Using Hierarchical Dirichlet Processes
TLDR
This work presents a nonparametric Bayesian model of tree structures based on the hierarchical Dirichlet process (HDP) and develops an efficient variational inference procedure that can be applied to full-scale parsing applications.