Church: a language for generative models
TLDR
This work introduces Church, a universal language for describing stochastic generative processes, based on the Lisp model of lambda calculus, containing a pure Lisp as its deterministic subset.
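The core idea above is that a probabilistic program is an ordinary functional program extended with random primitives, so running it samples from the distribution it denotes. A minimal sketch of that idea in Python (a hypothetical analogue, not Church itself, which is Scheme-based):

```python
import random

def flip(p=0.5):
    """Elementary random primitive: a biased coin."""
    return random.random() < p

def geometric(p):
    """Stochastic recursion: number of flips until the first success.

    Code that calls no random primitive is the deterministic subset.
    """
    return 1 if flip(p) else 1 + geometric(p)

# Each run of the program is one sample from the distribution it defines.
samples = [geometric(0.5) for _ in range(10_000)]
print(sum(samples) / len(samples))  # close to 1/p = 2
```

The recursive definition of `geometric` shows why such languages are called universal: the program itself, including its control flow, defines the model.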
A New Approach to Probabilistic Programming Inference
TLDR
A new approach to inference in expressive probabilistic programming languages is introduced, based on particle Markov chain Monte Carlo; it supports accurate inference in models that use complex control flow, including stochastic recursion.
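Particle Markov chain Monte Carlo builds MCMC moves on top of a particle filter. A minimal bootstrap particle filter, the building block of that approach, sketched in Python on a 1-D Gaussian random-walk model (the model and its parameters are illustrative assumptions, not from the paper):

```python
import math
import random

def particle_filter(obs, n=1000, step=1.0, noise=0.5):
    """Bootstrap particle filter: propagate, weight, resample."""
    particles = [0.0] * n
    for y in obs:
        # propagate each particle through the generative program
        particles = [x + random.gauss(0, step) for x in particles]
        # weight each particle by the likelihood of the observation
        w = [math.exp(-(y - x) ** 2 / (2 * noise ** 2)) for x in particles]
        # resample particles in proportion to their weights
        particles = random.choices(particles, weights=w, k=n)
    return particles

obs = [0.2, 0.5, 1.1, 1.4]
post = particle_filter(obs)
print(sum(post) / len(post))  # posterior mean tracks the observations
```

Because the filter only ever runs the model forward, the same machinery applies when the model's control flow branches or recurses stochastically.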
Venture: a higher-order probabilistic programming platform with programmable inference
TLDR
Stochastic regeneration is shown to achieve linear runtime scaling in cases where many previous approaches scaled quadratically, and stochastic regeneration and the SPI are shown to support general-purpose inference strategies such as Metropolis-Hastings, Gibbs sampling, and blocked proposals based on particle Markov chain Monte Carlo and mean-field variational inference techniques.
Reconciling intuitive physics and Newtonian mechanics for colliding objects.
TLDR
A range of effects in mass judgments that have been taken as strong evidence for heuristic use is investigated, and these effects are shown to be well explained by the interplay between Newtonian constraints and sensory uncertainty.
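The interplay described above can be sketched as Bayesian inference: exact Newtonian collision dynamics generate predicted velocities, sensory noise corrupts them, and mass judgments follow from the posterior. A hypothetical illustration in Python (the noise model, candidate mass ratios, and all numbers are assumptions for illustration):

```python
import math

def elastic_outgoing(m1, m2, u1, u2=0.0):
    """Exact Newtonian 1-D elastic collision: outgoing velocities."""
    v1 = ((m1 - m2) * u1 + 2 * m2 * u2) / (m1 + m2)
    v2 = ((m2 - m1) * u2 + 2 * m1 * u1) / (m1 + m2)
    return v1, v2

def posterior_mass_ratio(obs_v1, obs_v2, u1=1.0, sigma=0.2,
                         ratios=(0.5, 1.0, 2.0)):
    """Grid posterior over m1/m2 given noisily perceived outgoing speeds."""
    def loglik(r):
        v1, v2 = elastic_outgoing(r, 1.0, u1)
        return -((obs_v1 - v1) ** 2 + (obs_v2 - v2) ** 2) / (2 * sigma ** 2)
    w = {r: math.exp(loglik(r)) for r in ratios}
    z = sum(w.values())
    return {r: w[r] / z for r in ratios}

# True ratio 2.0 predicts outgoing velocities (1/3, 4/3); percepts are nearby.
post = posterior_mass_ratio(obs_v1=0.30, obs_v2=1.25)
print(max(post, key=post.get))  # → 2.0
```

Biases that look like heuristics can emerge from this kind of model purely because the likelihood flattens as sensory noise grows.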
Gen: a general-purpose probabilistic programming system with programmable inference
TLDR
It is shown that Gen outperforms state-of-the-art probabilistic programming systems, sometimes by multiple orders of magnitude, on diverse problems including object tracking, estimating 3D body pose from a depth image, and inferring the structure of a time series.
Intuitive Theories of Mind: A Rational Approach to False Belief
We propose a rational analysis of children’s false belief reasoning. Our analysis realizes a continuous, evidence-driven transition between two causal Bayesian models of false belief. Both models
Structured Priors for Structure Learning
TLDR
This work presents a nonparametric generative model for directed acyclic graphs as a prior for Bayes net structure learning that assumes that variables come in one or more classes and that the prior probability of an edge existing between two variables is a function only of their classes.
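The prior described above can be sketched directly: assign each variable a class, and let the probability of an edge between two variables depend only on their class pair. A hypothetical Python illustration (the paper's model is nonparametric; here the number of classes and the edge probabilities are fixed constants for simplicity):

```python
import random

def sample_dag(n_vars=6, n_classes=2, seed=0):
    """Sample a DAG whose edge probabilities depend only on class pairs."""
    rng = random.Random(seed)
    cls = [rng.randrange(n_classes) for _ in range(n_vars)]
    # one edge probability per ordered (class, class) pair
    p = {(a, b): rng.random()
         for a in range(n_classes) for b in range(n_classes)}
    # restricting edges to i < j in a fixed order guarantees acyclicity
    edges = [(i, j) for i in range(n_vars) for j in range(i + 1, n_vars)
             if rng.random() < p[(cls[i], cls[j])]]
    return cls, edges

cls, edges = sample_dag()
print(cls, edges)
```

Structure learning then scores candidate graphs under this prior, favoring graphs whose edge pattern is consistent with some class assignment.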
BayesDB: A probabilistic programming system for querying the probable implications of data
TLDR
This paper describes BayesDB, a probabilistic programming platform that aims to enable users to query the probable implications of their data as directly as SQL databases enable them to query the data itself.
Picture: A probabilistic programming language for scene perception
TLDR
Picture is presented, a probabilistic programming language for scene understanding that allows researchers to express complex generative vision models, while automatically solving them using fast general-purpose inference machinery.
Approximate Bayesian Image Interpretation using Generative Probabilistic Graphics Programs
TLDR
It is shown that it is possible to write short, simple probabilistic graphics programs that define flexible generative models and to automatically invert them to interpret real-world images, yielding accurate, approximately Bayesian inferences about real-world images.
...