Corpus ID: 16754821

Symbolic Bayesian Inference by Lazy Partial Evaluation

@inproceedings{Shan2015SymbolicBI,
  title={Symbolic Bayesian Inference by Lazy Partial Evaluation},
  author={Chung-chieh Shan and Norman Ramsey},
  year={2015}
}
Bayesian inference, of posterior knowledge based on prior knowledge and observed evidence, is typically implemented by applying Bayes’s theorem, solving an equation in which the posterior multiplied by the probability of an observation equals a joint probability. But when we observe a value of a continuous variable, the observation usually has probability zero, and Bayes’s theorem says only that zero times the unknown is zero. To infer a posterior distribution from a zero-probability…
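
For concreteness, the equation the abstract alludes to can be written out in standard notation (a textbook statement of Bayes’s theorem, not text reproduced from the paper):

  \Pr(A \mid B)\,\Pr(B) = \Pr(A, B)
  \qquad\Longrightarrow\qquad
  \Pr(A \mid B) = \frac{\Pr(A, B)}{\Pr(B)} \quad \text{when } \Pr(B) > 0.

When B is continuous and the observation is B = b, typically \Pr(B = b) = 0; the first equation then reads \Pr(A \mid B = b) \cdot 0 = 0 and leaves the posterior unconstrained. Where joint and marginal densities exist, the classical repair is to condition with densities instead:

  p(a \mid b) = \frac{p(a, b)}{p(b)},
  \qquad
  p(b) = \int p(a, b)\, da,

which is the kind of zero-probability conditioning the paper sets out to carry out symbolically.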

Citations

Building blocks for exact and approximate inference
All You Need is the Monad… What Monad Was That Again?
Semantics for probabilistic programming: higher-order functions, continuous distributions, and soft constraints
Composing Inference Algorithms as Program Transformations
