# Symbolic Bayesian Inference by Lazy Partial Evaluation

```bibtex
@inproceedings{Shan2015SymbolicBI,
  title  = {Symbolic Bayesian Inference by Lazy Partial Evaluation},
  author = {Chung-chieh Shan and Norman Ramsey},
  year   = {2015}
}
```

Bayesian inference, of posterior knowledge based on prior knowledge and observed evidence, is typically implemented by applying Bayes’s theorem, solving an equation in which the posterior multiplied by the probability of an observation equals a joint probability. But when we observe a value of a continuous variable, the observation usually has probability zero, and Bayes’s theorem says only that zero times the unknown is zero. To infer a posterior distribution from a zero-probability…
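As a standard illustration (not taken from the paper itself), the equation the abstract alludes to, and its density-based analogue for continuous observations:

```latex
% Bayes's theorem as stated in the abstract: the posterior times the
% probability of the observation equals the joint probability.
\Pr(\theta \mid x)\,\Pr(x) = \Pr(\theta, x)

% When x is continuous, Pr(x) = 0 and the equation above says only
% that zero times the unknown is zero. One therefore works with
% densities: the posterior density is the prior times the likelihood,
% normalized by the marginal density of the observation.
p(\theta \mid x) \;=\; \frac{p(\theta)\, p(x \mid \theta)}
                            {\int p(\theta')\, p(x \mid \theta')\, \mathrm{d}\theta'}
```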

## 6 Citations

Observation Propagation for Importance Sampling with Likelihood Weighting

- Computer Science
- 2015

This work presents a big-step semantics of importance sampling with likelihood weighting for a core universal probabilistic programming language with observation propagation and explores the interaction between observation and the other computational features of the language.

Building blocks for exact and approximate inference

- Computer Science
- 2015

A handful of inference building blocks are presented that not only constitute black-box inference methods themselves but also generate a search space of inference strategies that includes combinations of exact and approximate inference methods.

All You Need is the Monad . . . What Monad Was That Again?

- Mathematics
- 2015

Probability enjoys a monadic structure (Lawvere 1962; Giry 1981; Ramsey and Pfeffer 2002). A monadic computation represents a probability distribution, and the unit operation `return a` creates the…
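A minimal sketch of this monadic structure (my own illustration, not code from the cited work): a discrete distribution is a list of (value, probability) pairs, `unit` builds a point mass, and `bind` sequences a distribution with a probabilistic continuation by summing over intermediate outcomes.

```python
def unit(a):
    """The monadic return: a point-mass distribution on `a`."""
    return [(a, 1.0)]

def bind(dist, k):
    """Sequence `dist` with a probabilistic continuation `k`:
    weight each outcome of k(a) by the probability of reaching a."""
    return [(b, p * q) for (a, p) in dist for (b, q) in k(a)]

# Example: flip a fair coin, then roll a die whose size depends on the flip.
coin = [("heads", 0.5), ("tails", 0.5)]

def die(flip):
    n = 4 if flip == "heads" else 6
    return [(i, 1.0 / n) for i in range(1, n + 1)]

rolls = bind(coin, die)
prob_one = sum(p for (v, p) in rolls if v == 1)  # 0.5/4 + 0.5/6
```

The monad laws hold up to reordering and merging of duplicate outcomes; real systems use a more efficient representation, but the unit/bind structure is the same.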

Semantics for probabilistic programming: higher-order functions, continuous distributions, and soft constraints

- Computer Science, 2016 31st Annual ACM/IEEE Symposium on Logic in Computer Science (LICS)
- 2016

A metalanguage is defined (an idealised version of Anglican) for probabilistic computation with the above features, both operational and denotational semantics are developed, and soundness, adequacy, and termination are proved.

Composing Inference Algorithms as Program Transformations

- Computer Science, UAI
- 2017

This work makes code generation for inference modular by decomposing inference algorithms into reusable program-to-program transformations that perform exact inference as well as generate probabilistic programs that compute expectations, densities, and MCMC samples.

Probabilistic Inference by Program Transformation in Hakaru (System Description)

- Computer Science, FLOPS
- 2016

We present Hakaru, a new probabilistic programming system that allows composable reuse of distributions, queries, and inference algorithms, all expressed in a single language of measures. The system…

## References

Showing 1–10 of 48 references

R2: An Efficient MCMC Sampler for Probabilistic Programs

- Computer Science, AAAI
- 2014

It is shown that R2 produces results of quality similar to the CHURCH and STAN probabilistic programming tools with much shorter execution time, and the correctness of R2 is rigorously proved.

Stochastic lambda calculus and monads of probability distributions

- Computer Science, POPL '02
- 2002

A translation of stochastic lambda calculus into measure terms is given, which can not only denote discrete probability distributions but can also support the best known modeling techniques.

Calcul des Probabilités

- Philosophy, Nature
- 1889

Abstract“EVERYBODY makes errors in Probabilities at times, and big ones,” writes De Morgan to Sir William Hamilton. M. Bertrand appears to form an exception to this dictum, or at least to its severer…

First-order probabilistic inference

- Philosophy, Computer Science, IJCAI
- 2003

This paper presents an algorithm to reason about multiple individuals, where the authors may know particular facts about some of them, but want to treat the others as a group.

A Categorical Approach to Probability Theory

- Computer Science, Mathematics, Studia Logica
- 2010

This work shows that the category ID of D-posets of fuzzy sets and sequentially continuous D-homomorphisms makes it possible to characterize the passage from classical to fuzzy events as the minimal generalization having nontrivial quantum character.

A type theory for probability density functions

- Computer Science, POPL '12
- 2012

This work formalizes the first probabilistic language that exhibits continuous probability distributions, the ability to naturally express custom probabilistic models, and probability density functions (PDFs), and serves as a foundational framework for extending the ideas to more general languages.

The semantic foundations of concurrent constraint programming

- Computer Science, POPL '91
- 1991

The basic ideas involved in giving a coherent semantic account of concurrent constraint programming are developed, including a simple and general formulation of the notion that a constraint system is a system of partial information.

Éléments de la Théorie des Probabilités

- Mathematics, Nature
- 1909

LIKE all Prof. Borel's works, this is a very pleasant book to read. It is in three parts, dealing respectively with discontinuous problems, continuous problems, and those in which a priori…

A User's Guide to Measure-Theoretic Probability

- Computer Science
- 2003

The authors’ theory of estimation is based on a geometric approach and seems closely related to fiducial intervals as developed in several articles by Neyman, and may be looked upon as a first step in reconciling classical statistics with Bayesian statistics.