Humans integrate visual and haptic information in a statistically optimal fashion

@article{Ernst2002HumansIV,
  title={Humans integrate visual and haptic information in a statistically optimal fashion},
  author={Marc O. Ernst and Martin S. Banks},
  journal={Nature},
  year={2002},
  volume={415},
  pages={429--433}
}
When a person looks at an object while exploring it with their hand, vision and touch both provide information for estimating the properties of the object. Vision frequently dominates the integrated visual–haptic percept, for example when judging size, shape or position, but in some circumstances the percept is clearly affected by haptics. Here we propose that a general principle, which minimizes variance in the final estimate, determines the degree to which vision or haptics dominates. This…
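The variance-minimizing principle in the abstract is the standard maximum-likelihood cue-combination rule: each cue is weighted by its inverse variance (its reliability). A minimal sketch of that rule — not the authors' code; the function name and numbers are purely illustrative:

```python
def integrate(s_v, var_v, s_h, var_h):
    """Maximum-likelihood combination of a visual estimate s_v (variance
    var_v) and a haptic estimate s_h (variance var_h)."""
    # Each weight is the cue's inverse variance, normalized to sum to 1.
    w_v = (1 / var_v) / (1 / var_v + 1 / var_h)
    w_h = 1 - w_v
    s = w_v * s_v + w_h * s_h
    # Combined variance is always at most the smaller single-cue variance.
    var = (var_v * var_h) / (var_v + var_h)
    return s, var

# Illustrative numbers: vision is the more reliable cue, so it dominates
# the combined percept, yet the combined estimate beats either cue alone.
s, var = integrate(s_v=55.0, var_v=1.0, s_h=53.0, var_h=4.0)
# s = 0.8*55 + 0.2*53 = 54.6;  var = 4/5 = 0.8 < min(1.0, 4.0)
```

This is why vision "dominates" size and shape judgments under normal viewing (its variance is lower) while haptics gains weight when visual reliability drops, as in the blurred-vision study cited below.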
Citations

Viewing Geometry Determines How Vision and Haptics Combine in Size Perception
Combined size estimates are finer than is possible with either vision or haptics alone; indeed, they approach statistical optimality.
The integration of vision and haptic sensing: a computational and neural perspective
It is concluded that much progress has been made to provide a computational framework that can formalize and explain the results of behavioral and psychophysical studies on visuo-haptic integration, yet there still exists a gap between the computationally driven studies and the results derived from brain imaging studies.
Optimal integration of shape information from vision and touch
It is suggested that observers integrate visual and haptic shape information of real 3D objects in a statistically optimal fashion, and knowledge that multisensory signals arise from the same object seems to promote integration.
Combination and Integration in the Perception of Visual-Haptic Compliance Information
The compliance of a material can be conveyed through mechanical interactions in a virtual environment and perceived through both visual and haptic cues, supported by an integration process that constitutes a weighted summation of two random variables defined by the single-modality estimates.
Optimal visual–haptic integration with articulated tools
Sensory correspondence was determined correctly (trial-by-trial), taking tool configuration into account, revealing highly flexible multisensory integration underlying tool use, consistent with the brain constructing internal models of tools' properties.
Integration of haptic and visual size cues in perception and action revealed through cross-modal conflict
Although haptics represents a less certain source of information, haptic processing follows similar principles to vision, and its contribution to perception and action becomes evident only when cross-modal information is incongruent.
Integration of vision and haptics during tool use.
The brain appears to combine visual and haptic information not based on the spatial proximity of sensory stimuli, but based on the proximity of the distal causes of stimuli, taking into account the dynamics and geometry of tools.
Young Children Do Not Integrate Visual and Haptic Form Information
It is suggested that during development, perceptual systems require constant recalibration, for which cross-sensory comparison is important, and using one sense to calibrate the other precludes useful combination of the two sources.
Visual-haptic cue integration with spatial and temporal disparity during pointing movements
This work studied the combination of visual and haptic information in the context of human arm movement control by implementing a pointing task, measuring pointing accuracy as a function of haptic and visual cue onset, and comparing pointing performance to the predictions of a multisensory decision model.
Visual-Haptic Size Estimation in Peripersonal Space
This work reports on an experiment in which participants compared the size of a visual sphere to a haptic sphere belonging to the same object in a VE, and discusses the current findings in the framework of adaptation-level theory for haptic size reference.

References

Showing 1–10 of 68 references
Touch can change visual slant perception
It is reported that haptic feedback (active touch) increases the weight of a consistent surface-slant signal relative to inconsistent signals, and the appearance of a subsequently viewed surface is changed: the surface appears slanted in the direction specified by the haptically reinforced signal.
The Dominance of Touch by Vision: Sometimes Incomplete
It is concluded (i) that repeated observations do not destroy the effect, (ii) that considerable discrepancies between the visual and haptic images do not destroy the effect, but (iii) that feedback from a bare hand can reduce the effect when it is of considerable magnitude.
Visual capture of haptically judged depth
Using a graduated scale, S was required to match with his left hand the depth of three objects of equal physical depth (or thickness) held with the right hand, thus demonstrating visual capture of haptic depth.
Texture perception: studies of intersensory organization using a discrepancy paradigm, and visual versus tactual psychophysics.
  • S. Lederman, S. Abbott
  • Journal of Experimental Psychology: Human Perception and Performance
  • 1981
The comparability of the two senses in texture-related tasks may underlie the relatively equal compromise between discrepant sources of texture information demonstrated in Experiment (modality superiority interpretation).
Integration of proprioceptive and visual position-information: An experimentally supported model.
The proposed model can explain the unexpectedly small sizes of the variable errors in the localization of a seen hand that were reported earlier, and implies that the CNS has knowledge about the direction-dependent precision of the proprioceptive and visual information.
Haptic Dominance in Form Perception with Blurred Vision
Three experiments are reported in which subjects were exposed to discrepant visual and haptic form information; it is proposed that touch may be dominant in form perception when vision is peripheral and blurry.
Visual Capture of Touch: Out-of-the-Body Experiences With Rubber Gloves
It is shown that vision can capture tactile localization: participants were more likely to report the illusion of feeling touch at the rubber hands when the hands were spatially aligned with the participant's own hands.
Visual and tactual texture perception: Intersensory cooperation
  • M. Heller
  • Perception & Psychophysics
  • 1982
In three experiments, subjects were required to make texture judgments about abrasive surfaces; it is suggested that touch may preempt vision when both sources of texture information are simultaneously available.
Shape from texture: ideal observers and human psychophysics
We describe an ideal observer model for estimating "shape from texture" which is derived from the principles of statistical information. For a given family of surface shapes, measures of statistical…
Computational models of sensorimotor integration
The sensorimotor integration system can be viewed as an observer attempting to estimate its own state and the state of the environment by integrating multiple sources of information. We…