Merging the senses into a robust percept

@article{Ernst2004MergingTS,
  title={Merging the senses into a robust percept},
  author={Marc O. Ernst and Heinrich H. B{\"u}lthoff},
  journal={Trends in Cognitive Sciences},
  year={2004},
  volume={8},
  pages={162-169}
}
To perceive the external environment our brain uses multiple sources of sensory information derived from several different modalities, including vision, touch and audition. All these different sources of information have to be efficiently merged to form a coherent and robust percept. Here we highlight some of the mechanisms that underlie this merging of the senses in the brain. We show that, depending on the type of information, different combination and integration strategies are used and that…

Citations

A Bayesian view on multimodal cue integration
This chapter reviews a model that, in the statistical sense, describes an optimal mechanism for integrating sensory information, and points out how this integration scheme can be incorporated into a larger framework using Bayesian decision theory (BDT).
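As a concrete illustration of the framework this chapter names, the sketch below shows how BDT adds a loss-minimizing decision stage on top of cue combination. It is not taken from the chapter itself: the Gaussian cue parameters, the flat prior, and both loss functions are invented for illustration.

```python
import numpy as np

# Grid of candidate stimulus values (arbitrary units).
s = np.linspace(0.0, 10.0, 1001)

def gauss(x, mu, var):
    # Unnormalized Gaussian likelihood.
    return np.exp(-(x - mu) ** 2 / (2.0 * var))

# Posterior over the stimulus from two independent cues and a flat prior.
posterior = gauss(s, 5.0, 0.5) * gauss(s, 6.0, 2.0)
posterior /= posterior.sum()

# BDT stage: report the value that minimizes posterior-expected loss.
# loss[i, j] = cost of reporting s[i] when the true value is s[j].
sq_loss = (s[:, None] - s[None, :]) ** 2
asym_loss = np.where(s[:, None] > s[None, :], 2.0, 1.0) * np.abs(s[:, None] - s[None, :])

best_sq = s[np.argmin(sq_loss @ posterior)]      # squared loss -> posterior mean (~5.2)
best_asym = s[np.argmin(asym_loss @ posterior)]  # overestimates cost double -> shifted down (~4.9)
print(best_sq, best_asym)
```

The point of the decision stage: the same posterior yields different "optimal" reports under different loss functions, which is what distinguishes BDT from pure cue integration.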
Integration of Sensory Information Within Touch and Across Modalities
Reviews a model (the MLE model) that describes a statistically optimal integration mechanism in the human brain, demonstrates the integration of force and position cues to shape within haptic perception, and highlights multimodal perception.
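A minimal sketch of the inverse-variance (MLE) weighting rule these reviews describe, assuming independent Gaussian noise on each cue; the cue values and variances below are invented for illustration.

```python
import numpy as np

def mle_fuse(estimates, variances):
    """Fuse independent Gaussian cue estimates by inverse-variance weighting.

    Each cue i gets weight w_i = (1/var_i) / sum_j (1/var_j), so the fused
    estimate is the maximum-likelihood combination, and its variance is
    never larger than that of the most reliable single cue.
    """
    precisions = 1.0 / np.asarray(variances, dtype=float)
    weights = precisions / precisions.sum()
    fused = float(np.dot(weights, np.asarray(estimates, dtype=float)))
    fused_var = 1.0 / precisions.sum()
    return fused, fused_var

# Hypothetical visual and haptic size estimates (arbitrary units):
# vision is the more reliable cue here, so it dominates the fused percept.
size, var = mle_fuse(estimates=[5.0, 6.0], variances=[0.5, 2.0])
print(size, var)  # 5.2, 0.4 (fused variance below the best single cue's 0.5)
```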
The integrated development of sensory organization.
Recent advances in the understanding of the development of sensory integration and organization are explored, and implications of these advances for the care and management of the preterm infant are discussed.
Combining visual and auditory information.
Recent evidence from laboratories investigating how information from auditory and visual modalities is combined supports the notion of Bayesian combination, and it is shown that although visual and auditory information is combined to maximize efficiency, attentional resources for the two modalities are largely independent.
Cross-modal facilitation of visual and tactile motion
Robust and versatile perception of the world is augmented considerably when information from our five separate sensory systems is combined. Much recent evidence has demonstrated near-optimal…
Feature integration across multimodal perception and action: a review.
Recent empirical and theoretical developments in addressing how this distributed information is integrated into coherent representations (the so-called binding problem) are discussed, with an emphasis on the principles and constraints underlying the integration of multiple features across different sensory modalities and across perception and action planning.
Why Seeing Is Believing: Merging Auditory and Visual Worlds
This review discusses recent experiments on audiovisual integration that support the hypothesis that visual information tends to be more reliable than other sources of spatial information and that the central nervous system integrates information in a statistically optimal fashion.
The "puzzle" of sensory perception: putting together multisensory information
TLDR
Recently it was shown that touch can teach the visual modality how to interpret its signals and theirreliabilities, and this suggests that maximum-likelihood-estimation is an effective and widely used strategy exploited by the perceptual system. Expand
Attention controls multisensory perception via 2 distinct mechanisms at different levels of the cortical hierarchy.
It is demonstrated that the brain moulds multisensory inference via two distinct mechanisms: prestimulus attention to vision enhances the reliability and influence of visual inputs on spatial representations in visual and posterior parietal cortices, and distinct neural mechanisms control how signals are combined for perceptual inference at different levels of the cortical hierarchy.
Multisensory integration of redundant and complementary cues
The present thesis assesses the effects of cue properties on multisensory processing and reports a series of experiments demonstrating that the nature of the cue, defined by the task of the observer, influences whether the cues compete for representation as a result of interacting, or whether instead multisensory information produces an optimal increase in reliability of the event estimate.

References

Showing 1–10 of 71 references
Combining Sensory Information: Mandatory Fusion Within, but Not Between, Senses
It is reported that single-cue information is indeed lost when cues from within the same sensory modality (disparity and texture gradients in vision) are combined, but not when cues from different modalities (vision and haptics) are combined.
Multisensory integration, perception and ecological validity
Experimental results generalize to real life only when they reflect automatic perceptual processes, and not response strategies adopted to satisfy the particular demands of laboratory tasks.
Bayesian models of object perception
Advances in Bayesian models of computer vision and in the measurement and modeling of natural image statistics are providing the tools to test and constrain theories of human object perception, and these theories are having an impact on the interpretation of cortical function.
Illusions: What you see is what you hear
It is shown that auditory information can qualitatively alter the perception of an unambiguous visual stimulus to create a striking visual illusion, indicating that visual perception can be manipulated by other sensory modalities.
Multisensory Integration: Maintaining the Perception of Synchrony
A 'moveable window' for multisensory integration and a 'temporal ventriloquism' effect are suggested to explain how perceived synchrony is maintained despite differences in the arrival time of inputs to each of the senses.
Vision and Touch: An Experimentally Created Conflict between the Two Senses
The results reveal that vision is strongly dominant, often without the observer's being aware of a conflict.
Shape-from-X: psychophysics and computation
The Bayesian approach to vision provides a fruitful theoretical framework for integrating different depth modules. In this formulation depth can be represented by one or more surfaces. Prior…
Touch can change visual slant perception
It is reported that haptic feedback (active touch) increases the weight of a consistent surface-slant signal relative to inconsistent signals, and that the appearance of a subsequently viewed surface is changed: the surface appears slanted in the direction specified by the haptically reinforced signal.
Using Bayes' Rule to Model Multisensory Enhancement in the Superior Colliculus
These neurophysiological findings support the hypothesis that deep SC neurons use their sensory inputs to compute the probability that a target is present, and suggest that inverse effectiveness arises because the increase in target probability due to the integration of multisensory inputs is larger when the unimodal responses are weaker.
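A hedged sketch of the Bayes'-rule computation this summary describes, assuming independent Poisson-distributed inputs per modality; the rates, counts, and prior below are illustrative choices, not the paper's fitted values.

```python
import math

def posterior_target(counts, rate_present=4.0, rate_absent=1.0, prior=0.1):
    """P(target present | inputs) via Bayes' rule, assuming the inputs are
    independent Poisson counts with a higher mean when a target is present."""
    def poisson(k, lam):
        return math.exp(-lam) * lam ** k / math.factorial(k)

    like_present = math.prod(poisson(k, rate_present) for k in counts)
    like_absent = math.prod(poisson(k, rate_absent) for k in counts)
    return prior * like_present / (
        prior * like_present + (1.0 - prior) * like_absent)

# Inverse effectiveness: adding a second (say, auditory) input to a weak
# visual input multiplies the target probability far more than adding it
# to a strong one.
print(posterior_target([3, 3]) / posterior_target([3]))  # ~2.0 gain for weak inputs
print(posterior_target([8, 8]) / posterior_target([8]))  # ~1.0 gain for strong inputs
```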
Integration of proprioceptive and visual position-information: An experimentally supported model.
The proposed model can explain the unexpectedly small variable errors in the localization of a seen hand that were reported earlier, and implies that the CNS has knowledge of the direction-dependent precision of proprioceptive and visual information.
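A small sketch of how such direction-dependent precision can enter the fusion rule, generalizing inverse-variance weighting to 2-D position estimates with full covariance matrices; the means and covariances below are invented for illustration.

```python
import numpy as np

def fuse_2d(mean_a, cov_a, mean_b, cov_b):
    """Precision-weighted fusion of two 2-D position estimates.

    Each estimate contributes according to its 2x2 precision matrix, so a
    cue can dominate along one axis while being discounted along the other.
    """
    prec_a = np.linalg.inv(cov_a)
    prec_b = np.linalg.inv(cov_b)
    fused_cov = np.linalg.inv(prec_a + prec_b)
    fused_mean = fused_cov @ (prec_a @ mean_a + prec_b @ mean_b)
    return fused_mean, fused_cov

# Illustrative hand-localization setup: vision precise in azimuth (x) but
# coarse in depth (y); proprioception the reverse. The fused position takes
# x mainly from vision and y mainly from proprioception.
vis_mean, vis_cov = np.array([0.0, 1.0]), np.diag([0.1, 4.0])
prop_mean, prop_cov = np.array([1.0, 0.0]), np.diag([4.0, 0.1])
print(fuse_2d(vis_mean, vis_cov, prop_mean, prop_cov))
```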