Bayesian comparison of explicit and implicit causal inference strategies in multisensory heading perception

Luigi Acerbi, Kalpana Dokka, Dora E. Angelaki, and Wei Ji Ma. PLoS Computational Biology.
The precision of multisensory heading perception improves when visual and vestibular cues arising from the same cause, namely motion of the observer through a stationary environment, are integrated. Thus, in order to determine how the cues should be processed, the brain must infer the causal relationship underlying the multisensory cues. In heading perception, however, it is unclear whether observers follow the Bayesian strategy, a simpler non-Bayesian heuristic, or even perform causal… 
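The causal inference computation described above can be sketched concretely. The following is a minimal illustration of the standard Gaussian causal-inference observer (in the spirit of Körding et al.'s ideal-observer model, not the authors' exact implementation): it computes the posterior probability that visual and vestibular measurements share a common cause and returns a model-averaged heading estimate. All function and parameter names are illustrative, and the default prior width and common-cause prior are assumptions.

```python
import math

def causal_inference_estimate(x_vis, x_vest, sigma_vis, sigma_vest,
                              sigma_prior=90.0, p_common=0.5):
    """Illustrative Gaussian causal-inference observer for heading.

    x_vis, x_vest : noisy visual / vestibular heading measurements (deg)
    sigma_vis, sigma_vest : cue noise standard deviations (deg)
    sigma_prior : std of the zero-mean Gaussian prior over heading (assumed)
    p_common : prior probability that both cues share one cause (assumed)
    """
    v1, v2, vp = sigma_vis**2, sigma_vest**2, sigma_prior**2

    # Likelihood under a common cause (C = 1): both measurements are
    # generated by a single heading s ~ N(0, vp), then s is integrated out.
    var_c1 = v1 * v2 + v1 * vp + v2 * vp
    like_c1 = math.exp(-0.5 * ((x_vis - x_vest)**2 * vp
                               + x_vis**2 * v2
                               + x_vest**2 * v1) / var_c1) \
        / (2 * math.pi * math.sqrt(var_c1))

    # Likelihood under independent causes (C = 2): each cue has its own
    # source drawn from the same prior, so the measurements factorize.
    like_c2 = (math.exp(-0.5 * x_vis**2 / (v1 + vp))
               / math.sqrt(2 * math.pi * (v1 + vp))) \
        * (math.exp(-0.5 * x_vest**2 / (v2 + vp))
           / math.sqrt(2 * math.pi * (v2 + vp)))

    # Posterior probability of a common cause (Bayes' rule over C).
    post_c1 = (like_c1 * p_common
               / (like_c1 * p_common + like_c2 * (1 - p_common)))

    # Reliability-weighted estimates under each causal structure.
    s_c1 = (x_vis / v1 + x_vest / v2) / (1 / v1 + 1 / v2 + 1 / vp)
    s_c2 = (x_vest / v2) / (1 / v2 + 1 / vp)  # vestibular-defined heading

    # Model averaging: weigh the two estimates by the causal posterior.
    return post_c1 * s_c1 + (1 - post_c1) * s_c2, post_c1
```

With nearly coincident cues the posterior favors a common cause and the estimate is pulled toward the reliability-weighted average; with widely discrepant cues the posterior favors separate causes and the cues are largely segregated. Model averaging is only one of the decision strategies the paper compares; probability matching or model selection would replace the final weighted sum.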


Causal inference for spatial constancy across whole-body motion

The results suggest that the brain implicitly represents the posterior probability that the internally updated estimate and the visual feedback come from a common cause and uses this probability to weigh the two sources of information in mediating spatial constancy across whole body motion.

Causal inference accounts for heading perception in the presence of object motion

It is demonstrated that perception of object motion systematically influences heading judgments, and the findings suggest that the brain interprets object motion and self-motion using a causal inference framework.

Causal Inference in the Perception of Verticality

It is concluded that the notion of a vector sum does not provide a comprehensive explanation of the perception of the upright, and that CI offers a better alternative.

How multisensory neurons solve causal inference

This work trains a neural network to solve causal inference by either combining or separating visual and vestibular inputs in order to estimate self- and scene motion, and finds that the network recapitulates key neurophysiological and behavioral properties of biological systems.

Cognitive, Systems, and Computational Neurosciences of the Self in Motion.

The crux of this review focuses on the human and theoretical approaches that have outlined a normative account of cue combination in behavior and neurons, as well as on the systems neuroscience efforts that are searching for its neural implementation.

Neural dynamics of causal inference in the macaque frontoparietal circuit

A dynamic loop of frontal-parietal interactions in the causal inference framework may provide the neural mechanism to answer long-standing questions regarding how neural circuits represent hidden structures for body-awareness and agency.

Multisensory correlation computations in the human brain identified by a time-resolved encoding model

Neural mechanisms that arbitrate between integrating and segregating multisensory information are essential for complex scene analysis and for the resolution of the multisensory correspondence problem.

Bayesian causal inference in visuotactile integration in children and adults

The results suggest that already from a young age the brain implicitly infers the probability that a tactile and a visual cue share the same cause and uses this probability as a weighting factor in visuotactile localization.

Comparing Bayesian and non-Bayesian accounts of human confidence reports

It is found that subjects do take sensory uncertainty into account when reporting confidence, suggesting that brain areas involved in reporting confidence can access low-level representations of sensory uncertainty, a prerequisite of Bayesian inference.

Cortical Hierarchies Perform Bayesian Causal Inference in Multisensory Perception

Characterizing the computational operations of signal interactions reveals the hierarchical nature of multisensory perception in human neocortex and unravels how the brain accomplishes Bayesian Causal Inference, a statistical computation fundamental for perception and cognition.

Multisensory Oddity Detection as Bayesian Inference

The successful application of structure inference models to the new 'oddity detection' paradigm, and the resultant unified explanation of across- and within-modality cases, provide further evidence that structure inference may be a commonly evolved principle for combining perceptual information in the brain.

Sensory reliability shapes perceptual inference via two mechanisms.

This psychophysics study presented participants with spatially congruent and discrepant audiovisual signals at four levels of visual reliability and demonstrated that Bayesian CI is fundamental for integrating signals of variable reliabilities.

Causal Inference in Multisensory Perception

An ideal-observer model is formulated that infers whether two sensory cues originate from the same location and that also estimates their location(s) and accurately predicts the nonlinear integration of cues by human subjects in two auditory-visual localization tasks.

Causal Inference in Multisensory Heading Estimation

The results support the hypothesis that judgments of signal causality are included in the heading estimation process and suggest a decreasing tolerance for discrepancies and an increasing reliance on visual cues for longer duration motions.

A Bayesian observer model constrained by efficient coding can explain 'anti-Bayesian' percepts

A new model formulation based on efficient coding that is fully specified for any given natural stimulus distribution is proposed that predicts that perception is often biased away from an observer's prior beliefs and that stimulus uncertainty differentially affects perceptual bias depending on whether the uncertainty is induced by internal or external noise.

On the Origins of Suboptimality in Human Probabilistic Inference

This work probes the sources of suboptimality in probabilistic inference using a novel estimation task in which subjects are exposed to an explicitly provided distribution, thereby removing the need to remember the prior, and rejects several models of stochastic behavior, including probability matching and sample-averaging strategies.

Learning to integrate contradictory multisensory self-motion cue pairings.

The results show that human subjects combine and optimally integrate vestibular and visual information, each signaling self-motion around a different rotation axis, suggesting that the experience of two temporally co-occurring but spatially unrelated self-motion cues leads to inferring a common cause for these two initially unrelated sources of information about self-motion.

Computational Characterization of Visually Induced Auditory Spatial Adaptation

This study quantitatively characterizes the change in auditory spatial perception induced by repeated auditory–visual spatial conflict, known as the ventriloquist aftereffect, and finds that the shift in the perceived locations after exposure was associated with a shift in the mean of the auditory likelihood functions in the direction of the experienced visual offset.

What’s Up: an assessment of Causal Inference in the Perception of Verticality

The notion of a vector sum does not provide a comprehensive explanation of the perception of the upright; to show this, a novel Alternative-Reality system is developed that manipulates visual and physical tilt independently.