Publications
Humans integrate visual and haptic information in a statistically optimal fashion
When a person looks at an object while exploring it with their hand, vision and touch both provide information for estimating the properties of the object. Vision frequently dominates the integrated… (see the sketch of the reliability-weighted combination rule below).
  • Citations: 3,569 · Highly influential: 263 · PDF
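For context, the combination rule tested in this paper is the standard maximum-likelihood (reliability-weighted) scheme: each cue is weighted by its inverse variance, and the fused estimate is more reliable than either cue alone. The Python sketch below only illustrates that rule; the function name and the visual/haptic numbers are made up for the example, not taken from the paper.

    import numpy as np

    def mle_combine(estimates, variances):
        """Reliability-weighted (maximum-likelihood) cue combination.

        Each cue contributes with weight proportional to its reliability
        (1 / variance); the fused estimate has a variance no larger than
        that of the most reliable single cue.
        """
        estimates = np.asarray(estimates, dtype=float)
        variances = np.asarray(variances, dtype=float)
        reliabilities = 1.0 / variances
        weights = reliabilities / reliabilities.sum()
        fused_estimate = float(np.dot(weights, estimates))
        fused_variance = 1.0 / reliabilities.sum()
        return fused_estimate, fused_variance

    # Illustrative numbers only: a visual and a haptic size estimate (mm)
    # with different noise levels.
    size_hat, var_hat = mle_combine(estimates=[55.0, 52.0],   # vision, haptics
                                    variances=[1.0, 4.0])
    print(size_hat)  # 54.4, pulled toward the more reliable visual cue
    print(var_hat)   # 0.8, smaller than either single-cue variance

With these made-up numbers the fused estimate lies closer to the visual cue because vision is four times more reliable here, and the fused variance (0.8) falls below both single-cue variances, which is the signature of statistically optimal integration.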
Merging the senses into a robust percept
To perceive the external environment our brain uses multiple sources of sensory information derived from several different modalities, including vision, touch and audition. All these different…
  • Citations: 1,439 · Highly influential: 96 · PDF
The Rubber Hand Illusion: Feeling of Ownership and Proprioceptive Drift Do Not Go Hand in Hand
In the Rubber Hand Illusion, the feeling of ownership of a rubber hand displaced from a participant's real occluded hand is evoked by synchronously stroking both hands with paintbrushes. A change of…
  • Citations: 324 · Highly influential: 33 · PDF
The statistical determinants of adaptation rate in human reaching.
Rapid reaching to a target is generally accurate but also contains random and systematic error. Random errors result from noise in visual measurement, motor planning, and reach execution. Systematic… (see the adaptation-rate sketch below).
  • Citations: 366 · Highly influential: 22 · PDF
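The snippet above is truncated before the model is described, so the following is only a minimal sketch, assuming a scalar Kalman-filter account of trial-by-trial adaptation: the reach is corrected by an estimate of the imposed disturbance, and the per-trial correction (the Kalman gain) is set by the relative sizes of measurement noise and assumed disturbance drift. Variable names and noise values are illustrative, not the paper's.

    import numpy as np

    rng = np.random.default_rng(0)

    true_disturbance = 2.0   # cm, imposed visuomotor shift (illustrative)
    sigma_measure = 1.0      # sd of combined visual/motor measurement noise
    sigma_drift = 0.2        # assumed random-walk drift of the disturbance

    d_hat, p = 0.0, 1.0      # disturbance estimate and its variance
    for _ in range(30):
        p += sigma_drift ** 2                     # predict: disturbance may drift
        error = (true_disturbance - d_hat) \
                + rng.normal(0.0, sigma_measure)  # observed reach error this trial
        gain = p / (p + sigma_measure ** 2)       # Kalman gain = adaptation rate
        d_hat += gain * error                     # correct the estimate
        p *= 1.0 - gain

    print(round(d_hat, 2), round(gain, 2))
    # The estimate ends near the 2 cm disturbance, and for these noise values
    # the gain settles around 0.18 per trial; noisier measurements lower the
    # gain (slower adaptation), while larger assumed drift raises it.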
A Bayesian view on multimodal cue integration
TLDR: We perceive our own body and the world surrounding us via multiple sources of sensory information derived from several modalities, including vision, touch and audition. (See the Gaussian sketch below for how a prior enters this kind of Bayesian fusion.)
  • Citations: 192 · Highly influential: 21 · PDF
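As a companion to the maximum-likelihood sketch above, here is a minimal Gaussian illustration of the Bayesian version of the same idea: with a Gaussian prior and independent Gaussian likelihoods, the posterior mean is the precision-weighted average of the cues and the prior, so the prior behaves like one more cue whose influence grows as the sensory signals get noisier. The function name and numbers are illustrative only, not drawn from the paper.

    import numpy as np

    def gaussian_posterior(cue_means, cue_vars, prior_mean, prior_var):
        """Posterior for a Gaussian prior and independent Gaussian likelihoods."""
        precisions = np.append(1.0 / np.asarray(cue_vars, dtype=float),
                               1.0 / prior_var)
        means = np.append(np.asarray(cue_means, dtype=float), prior_mean)
        post_var = 1.0 / precisions.sum()
        post_mean = post_var * float(np.dot(precisions, means))
        return post_mean, post_var

    # Illustrative numbers: visual and haptic size estimates plus a broad prior.
    mean, var = gaussian_posterior(cue_means=[55.0, 52.0], cue_vars=[1.0, 4.0],
                                   prior_mean=50.0, prior_var=25.0)
    print(round(mean, 2), round(var, 3))
    # Prints 54.26 0.775: the broad prior barely moves the cue-only estimate
    # of 54.4, but a sharper prior would pull the posterior toward 50.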
Learning to integrate arbitrary signals from vision and touch.
  • M. Ernst · Psychology, Medicine · Journal of Vision · 2 March 2007
When different perceptual signals of the same physical property are integrated, for example an object's size, which can be seen and felt, they form a more reliable sensory estimate (e.g., M. O. …
  • Citations: 266 · Highly influential: 20 · PDF
Combining Sensory Information: Mandatory Fusion Within, but Not Between, Senses
Humans use multiple sources of sensory information to estimate environmental properties. For example, the eyes and hands both provide relevant information about an object's shape. The eyes estimate…
  • Citations: 428 · Highly influential: 19 · PDF
Experience can change the 'light-from-above' prior
To interpret complex and ambiguous input, the human visual system uses prior knowledge or assumptions about the world. We show that the 'light-from-above' prior, used to extract information about…
  • Citations: 309 · Highly influential: 16 · PDF
Optimal integration of shape information from vision and touch
Many tasks can be carried out by using several sources of information. For example, an object's size and shape can be judged based on visual as well as haptic cues. It has been shown recently that…
  • Citations: 191 · Highly influential: 14 · PDF
Touch can change visual slant perception
TLDR: We report that haptic feedback (active touch) increases the weight of a consistent surface-slant signal relative to inconsistent signals.
  • Citations: 227 · Highly influential: 13 · PDF