The nervous system seems to combine visual and haptic information in a fashion similar to a maximum-likelihood integrator, and a model of such an integrator closely matched human performance in a visual–haptic task.
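The reliability-weighted averaging that a maximum-likelihood integrator performs can be sketched as follows. This is a minimal illustration assuming independent Gaussian noise on each cue; the function name and the example numbers are hypothetical, not taken from the study itself.

```python
def mle_integrate(mu_v, var_v, mu_h, var_h):
    """Fuse a visual and a haptic estimate by inverse-variance weighting.

    Under independent Gaussian noise, the maximum-likelihood estimate
    weights each cue by its reliability (1/variance), and the fused
    variance is never larger than either single-cue variance.
    """
    w_v = (1 / var_v) / (1 / var_v + 1 / var_h)
    w_h = 1 - w_v
    mu = w_v * mu_v + w_h * mu_h
    var = 1 / (1 / var_v + 1 / var_h)
    return mu, var

# Vision is more reliable here, so the fused estimate lies closer to it.
mu, var = mle_integrate(mu_v=10.0, var_v=1.0, mu_h=14.0, var_h=4.0)
# mu = 10.8, var = 0.8
```

Note that the fused variance (0.8) is smaller than either single-cue variance, which is the signature benefit of integration that these studies test for.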
It is concluded that different mechanisms of multisensory integration are responsible for proprioceptive drift and the feeling of ownership; the latter is inhibited by asynchronous stroking, the most common control condition in Rubber Hand Illusion experiments.
Human behavior was compared with that of a Kalman filter to determine how humans take into account the statistical properties of errors and the reliability with which those errors can be measured, and how biological systems remain responsive to changes in environmental statistics.
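A scalar Kalman filter of the kind such comparisons use can be sketched in a few lines. This is a generic predict-then-correct step under a random-walk state model, not the specific filter fit in the study; the function name and noise parameters are illustrative.

```python
def kalman_step(x, p, z, q, r):
    """One scalar Kalman-filter step.

    x, p : current state estimate and its variance
    z    : new noisy measurement
    q, r : process-noise and measurement-noise variances
    """
    # Predict: a random-walk model leaves the state unchanged but
    # inflates its uncertainty by the process noise.
    p_pred = p + q
    # Correct: the gain trades off prior uncertainty against
    # measurement noise, mirroring reliability-weighted integration.
    k = p_pred / (p_pred + r)
    x_new = x + k * (z - x)
    p_new = (1 - k) * p_pred
    return x_new, p_new
```

Raising `q` keeps the gain high and the filter responsive to changing statistics; lowering it makes estimates smoother but slower to adapt, which is the trade-off the comparison with human behavior probes.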
This chapter reviews a model that describes a statistically optimal mechanism for integrating sensory information and points out how this integration scheme can be incorporated into the larger framework of Bayesian decision theory (BDT).
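One way the larger Bayesian framework extends pure cue fusion is by folding in a prior over the scene. A minimal conjugate-Gaussian sketch, with hypothetical names and values, assuming a Gaussian prior and likelihood:

```python
def posterior_gaussian(mu_prior, var_prior, mu_like, var_like):
    """Combine a Gaussian prior with a Gaussian sensory likelihood.

    Precisions (1/variance) add, and the posterior mean is the
    precision-weighted average of prior and likelihood means.
    """
    prec = 1 / var_prior + 1 / var_like
    mu = (mu_prior / var_prior + mu_like / var_like) / prec
    return mu, 1 / prec

# Under squared-error loss, the Bayesian decision is the posterior mean.
mu_post, var_post = posterior_gaussian(0.0, 2.0, 3.0, 1.0)
```

Choosing a loss function (squared error, 0/1, etc.) turns this posterior into a decision, which is the step BDT adds on top of optimal integration.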
It is reported that single-cue information is indeed lost when cues from within the same sensory modality (disparity and texture gradients in vision) are combined, but not when different modalities (vision and haptics) are combined.
A new mapping between two arbitrary sensory signals from vision and touch can be learned from their statistical co-occurrence such that they become integrated, and the prediction is that integration makes discrimination worse for stimuli that are incongruent with the newly learned mapping.
It is shown that the 'light-from-above' prior, used to extract information about shape from shading, is modified in response to active experience with the scene, demonstrating that priors are constantly adapted by interactive experience with the environment.
The results of both experiments show that focus cues can contribute to estimates of 3-D scene parameters and have an indirect effect on perceived slant via the distance estimate used in disparity scaling.
It is suggested that observers integrate visual and haptic shape information of real 3D objects in a statistically optimal fashion, and knowledge that multisensory signals arise from the same object seems to promote integration.