# Symbolic Bayesian Inference by Lazy Partial Evaluation

```bibtex
@inproceedings{Shan2015SymbolicBI,
  title  = {Symbolic Bayesian Inference by Lazy Partial Evaluation},
  author = {Chung-chieh Shan and Norman Ramsey},
  year   = {2015}
}
```

Bayesian inference, of posterior knowledge based on prior knowledge and observed evidence, is typically implemented by applying Bayes’s theorem, solving an equation in which the posterior multiplied by the probability of an observation equals a joint probability. But when we observe a value of a continuous variable, the observation usually has probability zero, and Bayes’s theorem says only that zero times the unknown is zero. To infer a posterior distribution from a zero-probability…
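The equation the abstract alludes to can be written out. This is the standard textbook presentation of Bayes's theorem and its density-based counterpart, not notation taken from the paper itself:

```latex
% Bayes's theorem as the abstract states it:
% posterior times probability of the observation equals the joint.
\Pr(\theta \mid x)\,\Pr(x) \;=\; \Pr(\theta, x)

% For a continuous observation x, \Pr(X = x) = 0, so both sides vanish
% and the equation no longer determines the posterior. The usual remedy
% is to condition through densities instead of probabilities:
p(\theta \mid x) \;=\; \frac{f(x \mid \theta)\, p(\theta)}
                            {\int f(x \mid \theta')\, p(\theta')\, d\theta'}
```

The second formula is well defined even when the event $X = x$ has probability zero, provided the likelihood $f(x \mid \theta)$ is a density; making this step rigorous in general is the job of disintegration, which the paper's reference list points to.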
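To make the zero-probability problem concrete, here is a minimal sketch of density-based conditioning via likelihood weighting, using an illustrative conjugate Gaussian model (the function name, parameters, and model are my own choices, not the paper's):

```python
import math
import random

def normal_pdf(x, mu, sigma):
    # Density of N(mu, sigma^2) evaluated at x.
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def posterior_mean_by_likelihood_weighting(obs, n=100_000, seed=0):
    # Model: theta ~ N(0, 1) (prior), x | theta ~ N(theta, 1) (likelihood).
    # The event {x = obs} has probability zero, so instead of conditioning
    # on it directly we weight each prior sample by the likelihood *density*
    # f(obs | theta) and self-normalize.
    rng = random.Random(seed)
    total_w = 0.0
    total_wt = 0.0
    for _ in range(n):
        theta = rng.gauss(0.0, 1.0)       # draw from the prior
        w = normal_pdf(obs, theta, 1.0)   # density weight, not a probability
        total_w += w
        total_wt += w * theta
    return total_wt / total_w

# For this conjugate model the exact posterior is N(obs/2, 1/2),
# so the estimate should be close to obs/2.
est = posterior_mean_by_likelihood_weighting(2.0)
```

Weighting by a density rather than a probability is exactly the move that Bayes's theorem alone does not justify, and which the paper addresses symbolically.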

#### Citations

##### Publications citing this paper (showing 6 of 6)

- **Observation Propagation for Importance Sampling with Likelihood Weighting** (cites methods)
- **Building blocks for exact and approximate inference** (cites background)
- **All You Need is the Monad... What Monad Was That Again?** (cites background)
- **Semantics for probabilistic programming: higher-order functions, continuous distributions, and soft constraints** (cites background)
- **Probabilistic Inference by Program Transformation in Hakaru (System Description)** (cites methods)

#### References

##### Publications referenced by this paper (showing 10 of 46)

- **Measure Transformer Semantics for Bayesian Machine Learning**
- **R2: An Efficient MCMC Sampler for Probabilistic Programs**
- **Conditioning as disintegration**
- **A type theory for probability density functions**