Formal languages for probabilistic modeling enable re-use, modularity, and descriptive clarity, and can foster generic inference techniques. We introduce Church, a universal language for describing stochastic generative processes. Church is based on the Lisp model of lambda calculus, containing a pure Lisp as its deterministic subset. The semantics of …
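Church itself is a Lisp dialect, but the core idea — generative processes written as ordinary compositional functions built from random primitives — can be sketched in Python. This is a hypothetical illustration, not Church syntax; `flip` and `geometric` are names chosen here for the example:

```python
import random

def flip(p=0.5):
    # Bernoulli draw: the basic stochastic primitive.
    return random.random() < p

def geometric(p):
    # A recursive stochastic process: count flips up to and
    # including the first success. Composing flip with ordinary
    # recursion yields a Geometric(p) distribution over integers.
    return 1 if flip(p) else 1 + geometric(p)

samples = [geometric(0.5) for _ in range(10000)]
```

Running the process many times approximates the distribution it denotes: with `p = 0.5` the sample mean should be close to 2.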
We propose a causal Bayesian model of false belief reasoning in children. This model realizes theory of mind as the rational use of intuitive theories and supports causal prediction, explanation, and theory revision. The model undergoes an experience-driven false belief transition. We investigate the relationship between prediction, explanation, and …
People have strong intuitions about the influence objects exert upon one another when they collide. Because people's judgments appear to deviate from Newtonian mechanics, psychologists have suggested that people depend on a variety of task-specific heuristics. This leaves open the question of how these heuristics could be chosen, and how to integrate them …
The Dirichlet process (DP) is a fundamental mathematical tool for Bayesian non-parametric modeling, and is widely used in tasks such as density estimation, natural language processing, and time series modeling. In most applications, however, the Dirichlet process requires approximate inference to be performed with variational methods or Markov chain Monte Carlo …
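One standard way to work with the DP concretely is the stick-breaking construction, which represents a draw from the process as an infinite sequence of weights; truncating it gives a simple finite approximation. A minimal sketch (the function name and truncation level are illustrative choices, not from the paper):

```python
import random

def stick_breaking(alpha, num_sticks):
    # Truncated stick-breaking construction of DP mixture weights.
    # Each step breaks off a Beta(1, alpha) fraction of the
    # remaining stick; the weights sum to (almost) one.
    weights, remaining = [], 1.0
    for _ in range(num_sticks):
        b = random.betavariate(1.0, alpha)
        weights.append(remaining * b)
        remaining *= 1.0 - b
    return weights

w = stick_breaking(alpha=1.0, num_sticks=50)
```

Smaller `alpha` concentrates mass on the first few weights; larger `alpha` spreads it over many components, which is why `alpha` is called the concentration parameter.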
We describe Venture, an interactive virtual machine for probabilistic programming that aims to be sufficiently expressive, extensible, and efficient for general-purpose use. Like Church, probabilistic models and inference problems in Venture are specified via a Turing-complete, higher-order probabilistic language descended from Lisp. Unlike Church, Venture …
Most natural domains can be represented in multiple ways: animals may be thought of in terms of their taxonomic groupings or their ecological niches, and foods may be thought of in terms of their nutritional content or social role. We present a computational framework that discovers multiple systems of categories given information about a domain of objects …
Traditional approaches to Bayes net structure learning typically assume little regularity in graph structure other than sparseness. However, in many cases, we expect more systematicity: variables in real-world systems often group into classes that predict the kinds of probabilistic dependencies they participate in. Here we capture this form of prior …
The idea of computer vision as the Bayesian inverse problem to computer graphics has a long history and an appealing elegance, but it has proved difficult to directly implement. Instead, most vision tasks are approached via complex bottom-up processing pipelines. Here we show that it is possible to write short, simple probabilistic graphics programs that …
Recent progress on probabilistic modeling and statistical learning, coupled with the availability of large training datasets, has led to remarkable progress in computer vision. Generative probabilistic models, or “analysis-by-synthesis” approaches, can capture rich scene structure but have been less widely applied than their discriminative …