Reasoning about reasoning by nested conditioning: Modeling theory of mind with probabilistic programs

@article{Stuhlmuller2014ReasoningAR,
  title={Reasoning about reasoning by nested conditioning: Modeling theory of mind with probabilistic programs},
  author={Andreas Stuhlm{\"u}ller and Noah D. Goodman},
  journal={Cognitive Systems Research},
  year={2014},
  volume={28},
  pages={80-99}
}
A wide range of human reasoning patterns can be explained as conditioning in probabilistic models; however, conditioning has traditionally been viewed as an operation applied to such models, not represented in such models. [...] Much of human reasoning is about the beliefs, desires, and intentions of other people; we use probabilistic programs to formalize these inferences in a way that captures the flexibility and inherent uncertainty of reasoning about other agents.
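To make the nested-conditioning idea concrete, here is a minimal Python sketch (not the paper's Church formulation; the coin biases, the probability-matching report rule, and all names are illustrative assumptions): an observer conditions on an agent's report, and the agent's report is itself defined by an inner conditioning step.

```python
THETAS = (0.3, 0.5, 0.7)  # hypothetical coin biases with a uniform prior

def agent_posterior(observation):
    """Inner conditioning: the agent updates its uniform prior on one observed flip."""
    weights = {t: (t if observation == "heads" else 1 - t) for t in THETAS}
    z = sum(weights.values())
    return {t: w / z for t, w in weights.items()}

def agent_report_dist(observation):
    """Assumed report rule: the agent reports a bias with probability equal to its posterior."""
    return agent_posterior(observation)

def observer_posterior(reported_bias):
    """Outer conditioning: the observer infers what the agent saw from its report,
    which requires running the agent's nested inference inside the observer's model."""
    prior = {"heads": 0.5, "tails": 0.5}  # observer's prior over the agent's observation
    weights = {o: p * agent_report_dist(o)[reported_bias] for o, p in prior.items()}
    z = sum(weights.values())
    return {o: w / z for o, w in weights.items()}

print(observer_posterior(0.7))  # {'heads': 0.7, 'tails': 0.3} under these assumptions
```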
Modeling Cognition with Probabilistic Programs: Representations and Algorithms
This thesis develops probabilistic programming as a productive metaphor for understanding cognition, both with respect to mental representations and the manipulation of such representations.
Nested Reasoning About Autonomous Agents Using Probabilistic Programs
This work develops a planning-as-inference framework in which agents perform nested simulation to reason about the behavior of other agents in an online manner, and uses probabilistic programs to model a high-uncertainty variant of pursuit-evasion games.
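A minimal sketch of planning as inference, far simpler than the pursuit-evasion models discussed here and set in an invented one-dimensional world: actions are drawn from a uniform prior and the program conditions on the goal being reached, so the posterior over the first action acts as a policy.

```python
from itertools import product
from collections import Counter

ACTIONS = (-1, +1)              # hypothetical step-left / step-right actions
START, GOAL, HORIZON = 0, 1, 3  # invented one-dimensional world

def reaches_goal(plan):
    return START + sum(plan) == GOAL

def policy_by_inference():
    """Enumerate action sequences from a uniform prior, condition on reaching the goal,
    and read off the marginal over the first action as the inferred policy."""
    first_action = Counter()
    for plan in product(ACTIONS, repeat=HORIZON):
        if reaches_goal(plan):          # the conditioning step
            first_action[plan[0]] += 1  # uniform prior, so equal weights
    total = sum(first_action.values())
    return {a: c / total for a, c in first_action.items()}

print(policy_by_inference())  # roughly {1: 0.67, -1: 0.33}: stepping right first is more likely
```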
Modeling Theory of Mind for Autonomous Agents with Probabilistic Programs
This paper demonstrates that using a planning-as-inference formulation based on nested importance sampling results in agents simultaneously reasoning about other agents' plans and crafting counter-plans, and that probabilistic programming is a natural way to describe models in which each agent uses complex primitives such as path planners to make decisions.
On the Consistency of Approximate Multi-agent Probability Theory
  • M. W. Madsen
  • Computer Science
  • KI - Künstliche Intelligenz
  • 2015
This paper translates Andreas Stuhlmüller and Noah Goodman’s proposal into a more conventional probabilistic language, comparing it to an alternative system which models subjective probabilities as random variables.
Towards common-sense reasoning via conditional simulation: legacies of Turing in Artificial Intelligence
This work describes a computational formalism centered around a probabilistic Turing machine called QUERY, which captures the operation of probabilistic conditioning via conditional simulation, and demonstrates how the QUERY abstraction can be used to cast common-sense reasoning as probabilistic inference in a statistical model of observations and the uncertain structure of the world that generated that experience.
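The conditional-simulation idea behind QUERY can be approximated by rejection sampling; the sketch below is a generic illustration under that assumption, not the paper's formal machine model, and the toy two-coin model is invented for the example.

```python
import random

def query(simulate, predicate, num_accepted=10_000):
    """Conditional simulation by rejection: keep only runs where the predicate holds,
    approximating the conditional distribution of the simulation given the predicate."""
    accepted = []
    while len(accepted) < num_accepted:
        outcome = simulate()
        if predicate(outcome):
            accepted.append(outcome)
    return accepted

# Toy model: two fair coin flips, conditioned on at least one coming up heads.
def two_flips():
    return (random.random() < 0.5, random.random() < 0.5)

samples = query(two_flips, lambda flips: flips[0] or flips[1])
p_both = sum(a and b for a, b in samples) / len(samples)
print(round(p_both, 2))  # close to 1/3: P(both heads | at least one head)
```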
Probabilistic Programming for Theory of Mind for Autonomous Decision Making
A nested self-normalized importance sampling inference algorithm is proposed for probabilistic programs, and it is demonstrated that it can be used with planning-as-inference to simultaneously reason about other agents’ plans and craft counter-plans, and that nested modeling manifests a wide variety of rational agent behavior.
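As a rough illustration of nested self-normalized importance sampling (not the algorithm proposed in this paper), the sketch below weights outer samples by an inner Monte Carlo estimate; the model, sample sizes, and the conditioning event are all assumed for the example.

```python
import random

def inner_estimate(theta, n_inner=200):
    """Inner Monte Carlo estimate of P(at least 2 heads in 3 flips | theta)."""
    hits = sum(sum(random.random() < theta for _ in range(3)) >= 2 for _ in range(n_inner))
    return hits / n_inner

def nested_snis_posterior_mean(n_outer=2000):
    """Self-normalized importance sampling over theta ~ Uniform(0, 1), where each
    weight is itself a nested Monte Carlo estimate of the conditioning event."""
    total_w = total_wx = 0.0
    for _ in range(n_outer):
        theta = random.random()        # outer proposal = the prior
        w = inner_estimate(theta)      # nested estimate used as the weight
        total_w += w
        total_wx += w * theta
    return total_wx / total_w          # posterior mean of theta given the event

print(round(nested_snis_posterior_mean(), 2))  # about 0.7 under these assumptions
```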
Probabilistic Semantics and Pragmatics: Uncertainty in Language and Thought
This chapter synthesizes several of these modeling advances, exploring a formal model of interpretation grounded, via lexical semantics and pragmatic inference, in conceptual structure, which takes on a Boolean value for each imagined world.
Nesting Probabilistic Programs
This work formalizes the notion of nesting probabilistic programming queries and introduces a new online nested Monte Carlo estimator that makes it substantially easier to ensure conditions required for convergence are met, thereby providing a simple framework for designing statistically correct inference engines.
On the nature and origin of intuitive theories: learning, physics and psychology
This thesis develops formal computational models of intuitive theories, in particular intuitive physics and intuitive psychology, which form the basis of commonsense reasoning.
Modeling Human Plan Recognition Using Bayesian Theory of Mind
The human brain is the most powerful plan-recognition system we know. Central to the brain’s remarkable plan-recognition capacity is a theory of mind (ToM): our intuitive conception of other agents’ mental states.

References

Showing 1-10 of 45 references
IBAL: A Probabilistic Rational Programming Language
A detailed account of the syntax and semantics of IBAL, a rational programming language for probabilistic and decision-theoretic agents, as well as an overview of the implementation are presented.
Bayesian models of human action understanding
A Bayesian framework is presented for explaining how people reason about and predict the actions of an intentional agent, based on observing its behavior, and how this model can be used to infer the goal of an agent and predict how the agent will act in novel situations or when environmental constraints change.
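A toy inverse-planning sketch in the spirit of this framework, with an assumed softmax-rational action likelihood and invented goals, actions, and utilities: observing an action, the observer applies Bayes' rule over candidate goals.

```python
import math

GOALS = ("left", "right")             # hypothetical candidate goals, uniform prior
ACTIONS = ("step_left", "step_right")

def utility(action, goal):
    """Illustrative utility: moving toward the goal is worth 1, away is worth 0."""
    return 1.0 if action.endswith(goal) else 0.0

def action_likelihood(action, goal, beta=3.0):
    """Softmax-rational agent: P(action | goal) proportional to exp(beta * utility)."""
    z = sum(math.exp(beta * utility(a, goal)) for a in ACTIONS)
    return math.exp(beta * utility(action, goal)) / z

def infer_goal(observed_action):
    """Bayesian action understanding: invert the agent model to get P(goal | action)."""
    weights = {g: action_likelihood(observed_action, g) for g in GOALS}
    z = sum(weights.values())
    return {g: w / z for g, w in weights.items()}

print(infer_goal("step_right"))  # puts most mass on the "right" goal
```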
Church: a language for generative models
This work introduces Church, a universal language for describing stochastic generative processes, based on the Lisp model of lambda calculus, containing a pure Lisp as its deterministic subset.
How to Grow a Mind: Statistics, Structure, and Abstraction
This review describes recent approaches to reverse-engineering human learning and cognitive development and, in parallel, engineering more humanlike machine learning systems.
Bayesian Theory of Mind: Modeling Joint Belief-Desire Attribution
This work presents a computational framework for understanding Theory of Mind (ToM): the human capacity for reasoning about agents’ mental states such as beliefs and desires, and expresses the predictive model of belief- and desire-dependent action at the heart of ToM as a partially observable Markov decision process (POMDP), and reconstructs an agent’s joint belief state and reward state using Bayesian inference.
Effective Bayesian Inference for Stochastic Programs
This paper proposes a stochastic version of a general purpose functional programming language that contains random choices, conditional statements, structured values, defined functions, and recursion, and provides an exact algorithm for computing conditional probabilities of the form Pr(P(x) | Q(x)), where x is chosen randomly from the distribution defined by such a program.
Predicting Pragmatic Reasoning in Language Games
This model provides a close, parameter-free fit to human judgments, suggesting that the use of information-theoretic tools to predict pragmatic reasoning may lead to more effective formal models of communication.
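The reference-game setting can be sketched as a small chain of literal listener, speaker, and pragmatic listener; the lexicon, uniform priors, and proportional speaker rule below are illustrative assumptions, not the paper's experimental materials or exact model.

```python
OBJECTS = ("blue_square", "blue_circle", "green_square")  # toy reference game
UTTERANCES = ("blue", "green", "square", "circle")

def is_true(utterance, obj):
    return utterance in obj.split("_")

def literal_listener(utterance):
    """L0: uniform over the objects the utterance is true of."""
    consistent = [o for o in OBJECTS if is_true(utterance, o)]
    return {o: (1 / len(consistent) if o in consistent else 0.0) for o in OBJECTS}

def speaker(obj):
    """S1: chooses utterances in proportion to how well L0 recovers the object."""
    scores = {u: literal_listener(u)[obj] for u in UTTERANCES}
    z = sum(scores.values())
    return {u: s / z for u, s in scores.items()}

def pragmatic_listener(utterance):
    """L1: Bayes over objects with the speaker model as likelihood (uniform prior)."""
    weights = {o: speaker(o)[utterance] for o in OBJECTS}
    z = sum(weights.values())
    return {o: w / z for o, w in weights.items()}

print(pragmatic_listener("blue"))  # favors blue_square (0.6) over blue_circle (0.4)
```

Hearing "blue", the pragmatic listener shifts toward the blue square, since a speaker referring to the blue circle would more likely have said the unambiguous "circle".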
Probabilistic inference for solving (PO)MDPs
The approach is based on an equivalence between maximization of the expected future return in the time-unlimited MDP and likelihood maximization in a related mixture of finite-time MDPs, which makes it possible to use expectation maximization (EM) for computing optimal policies, using arbitrary inference techniques in the E-step.
A Dynamic Programming Algorithm for Inference in Recursive Probabilistic Programs
A dynamic programming algorithm that takes a functional interpreter for an arbitrary probabilistic programming language and turns it into an efficient marginalizer, and builds a graph of dependencies between sub-distributions that corresponds to a system of equations for the marginal distribution.
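As a minimal illustration of marginalizing a recursive probabilistic program by reusing sub-distributions (an assumed toy coin-counting program, not the paper's general algorithm), the sketch below computes each sub-distribution once via memoization instead of enumerating exponentially many execution paths.

```python
from functools import lru_cache

P_HEADS = 0.5  # illustrative coin weight

@lru_cache(maxsize=None)
def marginal_count(n):
    """Marginal distribution of a recursive probabilistic program
    `count(n) = 0 if n == 0 else flip() + count(n - 1)`,
    computed by dynamic programming over sub-distributions."""
    if n == 0:
        return ((0, 1.0),)                      # tuple of (value, prob) pairs, so it is cacheable
    dist = {}
    for value, prob in marginal_count(n - 1):   # reuse the memoized sub-distribution once
        for flip, p_flip in ((1, P_HEADS), (0, 1 - P_HEADS)):
            dist[value + flip] = dist.get(value + flip, 0.0) + prob * p_flip
    return tuple(sorted(dist.items()))

print(dict(marginal_count(3)))  # binomial(3, 0.5): {0: 0.125, 1: 0.375, 2: 0.375, 3: 0.125}
```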
Intuitive Theories of Mind: A Rational Approach to False Belief
We propose a rational analysis of children’s false belief reasoning. Our analysis realizes a continuous, evidence-driven transition between two causal Bayesian models of false belief.