Corpus ID: 16754821

Symbolic Bayesian Inference by Lazy Partial Evaluation
Chung-chieh Shan and Norman Ramsey
Bayesian inference, of posterior knowledge based on prior knowledge and observed evidence, is typically implemented by applying Bayes’s theorem, solving an equation in which the posterior multiplied by the probability of an observation equals a joint probability. But when we observe a value of a continuous variable, the observation usually has probability zero, and Bayes’s theorem says only that zero times the unknown is zero. To infer a posterior distribution from a zero-probability… 
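The standard workaround for this zero-probability observation is to condition on the density of the observation rather than its probability, weighting prior samples by that density. A minimal Python sketch under an assumed conjugate model (the model, function names, and numbers are illustrative, not from the paper):

```python
import math
import random

def normal_pdf(x, mean, sd):
    """Density of Normal(mean, sd) evaluated at x."""
    return math.exp(-((x - mean) ** 2) / (2 * sd ** 2)) / (sd * math.sqrt(2 * math.pi))

def posterior_mean(observed_y, n=100_000, seed=0):
    """Estimate E[mu | y = observed_y] for mu ~ Normal(0, 1), y ~ Normal(mu, 1)
    by weighting prior samples with the *density* of the observation, which is
    nonzero even though the observation itself has probability zero."""
    rng = random.Random(seed)
    total_w = total_wx = 0.0
    for _ in range(n):
        mu = rng.gauss(0.0, 1.0)             # sample from the prior
        w = normal_pdf(observed_y, mu, 1.0)  # likelihood weight: a density, not a probability
        total_w += w
        total_wx += w * mu
    return total_wx / total_w
```

For this conjugate model the exact posterior mean is `observed_y / 2`, so the estimate can be checked against a closed form.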

Observation Propagation for Importance Sampling with Likelihood Weighting
This work presents a big-step semantics of importance sampling with likelihood weighting for a core universal probabilistic programming language with observation propagation, and explores the interaction between observation and the language's other computational features.
Building blocks for exact and approximate inference
A handful of inference building blocks are presented that not only constitute black-box inference methods themselves but also generate a search space of inference strategies that includes combinations of exact and approximate inference methods.
All You Need is the Monad… What Monad Was That Again?
Probability enjoys a monadic structure (Lawvere 1962; Giry 1981; Ramsey and Pfeffer 2002). A monadic computation represents a probability distribution, and the unit operation return a creates the…
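The monadic structure mentioned here can be sketched concretely over finite distributions. This `Dist` class is my own illustration of the unit/bind pattern, not code from the cited papers:

```python
class Dist:
    """A finite probability distribution as a list of (value, probability) pairs."""
    def __init__(self, pairs):
        self.pairs = pairs

    @staticmethod
    def unit(x):
        """The monadic unit (return): a point-mass distribution at x."""
        return Dist([(x, 1.0)])

    def bind(self, f):
        """The monadic bind (>>=): run f at each outcome, scaling by its probability."""
        out = {}
        for x, p in self.pairs:
            for y, q in f(x).pairs:
                out[y] = out.get(y, 0.0) + p * q
        return Dist(sorted(out.items()))

# Example: the distribution of the sum of two fair coins (each 0 or 1).
coin = Dist([(0, 0.5), (1, 0.5)])
two = coin.bind(lambda a: coin.bind(lambda b: Dist.unit(a + b)))
```

Sequencing with `bind` marginalizes automatically, which is exactly the structure the monadic treatment of probability exploits.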
Semantics for probabilistic programming: higher-order functions, continuous distributions, and soft constraints
A metalanguage is defined (an idealised version of Anglican) for probabilistic computation with the above features, both operational and denotational semantics are developed, and soundness, adequacy, and termination are proved.
Composing Inference Algorithms as Program Transformations
This work makes this code generation modular by decomposing inference algorithms into reusable program-to-program transformations that perform exact inference as well as generate probabilistic programs that compute expectations, densities, and MCMC samples.
Probabilistic Inference by Program Transformation in Hakaru (System Description)
We present Hakaru, a new probabilistic programming system that allows composable reuse of distributions, queries, and inference algorithms, all expressed in a single language of measures. The system…


R2: An Efficient MCMC Sampler for Probabilistic Programs
R2 is shown to produce results of quality similar to those of the CHURCH and STAN probabilistic programming tools in much shorter execution time, and the correctness of R2 is rigorously proved.
Stochastic lambda calculus and monads of probability distributions
A translation of stochastic lambda calculus into measure terms is given, which can not only denote discrete probability distributions but can also support the best known modeling techniques.
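One concrete way to read "measure terms": represent a measure by the functional that maps an integrand to its expectation. The following sketch is my own illustration of that idea, not the paper's actual translation:

```python
from functools import reduce

def point(x):
    """Dirac (point-mass) measure at x: the expectation of f is just f(x)."""
    return lambda f: f(x)

def scale(c, m):
    """The measure m scaled by the constant c."""
    return lambda f: c * m(f)

def add(m1, m2):
    """The sum of two measures."""
    return lambda f: m1(f) + m2(f)

# A fair six-sided die as a sum of scaled point masses.
die = reduce(add, [scale(1 / 6, point(k)) for k in range(1, 7)])

# Querying the measure: the expectation of the identity function is the mean.
mean = die(lambda x: x)
```

Under this reading, a discrete distribution is a finite sum of scaled point masses, and integrating against a measure term is just function application.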
Calcul des Probabilités
  • 1889
“EVERYBODY makes errors in Probabilities at times, and big ones,” writes De Morgan to Sir William Hamilton. M. Bertrand appears to form an exception to this dictum, or at least to its severer…
First-order probabilistic inference
  • D. Poole
  • 2003
This paper presents an algorithm to reason about multiple individuals, where the authors may know particular facts about some of them, but want to treat the others as a group.
Proofs of randomized algorithms in Coq
A Categorical Approach to Probability Theory
This work shows that the category ID of D-posets of fuzzy sets and sequentially continuous D-homomorphisms makes it possible to characterize the passage from classical to fuzzy events as the minimal generalization having nontrivial quantum character.
A type theory for probability density functions
This work formalizes the first probabilistic language that exhibits continuous probability distributions, the ability to naturally express custom probabilistic models, and probability density functions (PDFs), and serves as a foundational framework for extending these ideas to more general languages.
The semantic foundations of concurrent constraint programming
The basic ideas involved in giving a coherent semantic account of concurrent constraint programming are developed, including a simple and general formulation of the notion that a constraint system is a system of partial information.
Éléments de la Théorie des Probabilités
Like all Prof. Borel's works, this is a very pleasant book to read. It is in three parts, dealing respectively with discontinuous problems, continuous problems, and those in which a priori…
A User's Guide to Measure-Theoretic Probability
The authors’ theory of estimation is based on a geometric approach and seems closely related to fiducial intervals as developed in several articles by Neyman, and may be looked upon as a first step in reconciling classical statistics with Bayes statistics.