When a person looks at an object while exploring it with their hand, vision and touch both provide information for estimating the properties of the object. Vision frequently dominates the integrated visual-haptic percept, for example when judging size, shape or position, but in some circumstances the percept is clearly affected by haptics. Here we propose …
To perceive the external environment our brain uses multiple sources of sensory information derived from several different modalities, including vision, touch and audition. All these different sources of information have to be efficiently merged to form a coherent and robust percept. Here we highlight some of the mechanisms that underlie this merging of the …
Rapid reaching to a target is generally accurate but also contains random and systematic error. Random errors result from noise in visual measurement, motor planning, and reach execution. Systematic error results from systematic changes in the mapping between the visual estimate of target location and the motor command necessary to reach the target (e.g., …)
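The random/systematic distinction drawn here can be made concrete with a simple decomposition of reach endpoints: the mean offset from the target is the systematic error, the trial-to-trial spread is the random error. The sketch below is illustrative only; the target position and noise values are hypothetical, not taken from the study.

```python
import numpy as np

# Minimal sketch (not the paper's analysis): decompose 1-D reach endpoint
# errors into a systematic component (mean offset from the target) and a
# random component (trial-to-trial variability). All numbers are assumptions.
target = 10.0                                    # target position (cm), hypothetical
endpoints = target + 0.8 + np.random.normal(0.0, 0.5, size=200)
#                    ^ systematic mapping error   ^ random measurement/planning/execution noise

errors = endpoints - target
systematic_error = errors.mean()                 # constant (systematic) error
random_error = errors.std(ddof=1)                # variable (random) error
print(f"systematic: {systematic_error:.2f} cm, random SD: {random_error:.2f} cm")
```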
In the Rubber Hand Illusion, the feeling of ownership of a rubber hand displaced from a participant's real occluded hand is evoked by synchronously stroking both hands with paintbrushes. A change of perceived finger location towards the rubber hand (proprioceptive drift) has been reported to correlate with this illusion. To measure the time course of …
When different perceptual signals of the same physical property are integrated, for example an object's size, which can be seen and felt, they form a more reliable sensory estimate (e.g., M. O. Ernst & M. S. Banks, 2002). This, however, implies that the sensory system already knows which signals belong together and how they relate. In other words, the …
The visual system uses several signals to deduce the three-dimensional structure of the environment, including binocular disparity, texture gradients, shading and motion parallax. Although each of these sources of information is independently insufficient to yield reliable three-dimensional structure from everyday scenes, the visual system combines them by …
Many tasks can be carried out by using several sources of information. For example, an object's size and shape can be judged based on visual as well as haptic cues. It has been shown recently that human observers integrate visual and haptic size information in a statistically optimal fashion, in the sense that the integrated estimate is most reliable (Ernst …
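The statistically optimal rule referred to here is the standard reliability-weighted average of maximum-likelihood cue integration (Ernst & Banks, 2002). Below is a minimal sketch assuming independent Gaussian noise on the visual and haptic size estimates; the particular estimates and noise levels are hypothetical. It also shows why vision often "dominates", as described above: the cue with the lower noise receives the larger weight.

```python
import numpy as np

# Minimal sketch of reliability-weighted (maximum-likelihood) cue integration,
# assuming independent Gaussian noise on each single-cue estimate.
# The specific estimates and noise SDs below are hypothetical.
s_v, sigma_v = 5.2, 0.4     # visual size estimate and its noise SD (cm)
s_h, sigma_h = 5.8, 0.8     # haptic size estimate and its noise SD (cm)

w_v = (1 / sigma_v**2) / (1 / sigma_v**2 + 1 / sigma_h**2)   # visual weight
w_h = 1 - w_v                                                # haptic weight
s_vh = w_v * s_v + w_h * s_h                                 # combined estimate
sigma_vh = np.sqrt(1 / (1 / sigma_v**2 + 1 / sigma_h**2))    # combined noise SD

# sigma_vh is never larger than the smaller single-cue SD: the combined
# estimate is the most reliable, and the more reliable cue (here vision)
# gets the larger weight in the average.
print(f"combined estimate: {s_vh:.2f} cm, combined SD: {sigma_vh:.2f} cm")
```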
To interpret complex and ambiguous input, the human visual system uses prior knowledge or assumptions about the world. We show that the 'light-from-above' prior, used to extract information about shape from shading, is modified in response to active experience with the scene. The resultant adaptation is not specific to the learned scene but generalizes to a …
The nervous system often combines visual and haptic information about object properties such that the combined estimate is more precise than with vision or haptics alone. We examined how the system determines when to combine the signals. Presumably, signals should not be combined when they come from different objects. The likelihood that signals come from …
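One way to make this "when to combine" question concrete is a simple gating rule: fuse the two estimates only when their discrepancy is small relative to the measurement noise, since a large conflict makes it unlikely that both signals arose from the same object. The sketch below is an illustration of that idea under assumed Gaussian noise, not the model reported in the paper; the criterion value and all numbers are hypothetical.

```python
import numpy as np

# Hypothetical gating rule (an illustration, not the paper's model): integrate
# the two estimates only if their discrepancy is plausible given measurement
# noise; otherwise keep them separate, on the assumption that a large conflict
# indicates two different objects.
def combine_or_segregate(s_v, sigma_v, s_h, sigma_h, criterion=2.0):
    discrepancy = abs(s_v - s_h)
    conflict_sd = np.sqrt(sigma_v**2 + sigma_h**2)   # SD of the discrepancy under a common source
    if discrepancy < criterion * conflict_sd:        # plausibly one object: fuse
        w_v = sigma_h**2 / (sigma_v**2 + sigma_h**2) # reliability-weighted average
        return w_v * s_v + (1 - w_v) * s_h
    return None                                      # likely two objects: do not fuse

print(combine_or_segregate(5.2, 0.4, 5.8, 0.8))      # small conflict -> fused estimate
print(combine_or_segregate(5.2, 0.4, 9.0, 0.8))      # large conflict -> None (kept separate)
```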