Across three experiments, participants made speeded elevation discrimination responses to vibrotactile targets presented to the thumb (held in a lower position) or the index finger (upper position) of either hand, while simultaneously trying to ignore visual distractors presented independently from either the same or a different elevation. Performance on…
In a study that builds on recent cognitive neuroscience research on body perception and social psychology research on social relations, we tested the hypothesis that synchronous multisensory stimulation leads to self-other merging. We brushed the cheek of each study participant as he or she watched a stranger's cheek being brushed in the same way, either in…
In order to determine precisely the location of a tactile stimulus presented to the hand, it is necessary to know not only which part of the body has been stimulated, but also where that part of the body lies in space. This involves the multisensory integration of visual, tactile, proprioceptive, and even auditory cues regarding limb position. In recent…
Shadows in visual scenes can have profound effects on visual perception. Here we have found that visual distractors distant from the body interfere with human spatial discrimination of tactile targets at the hand, particularly when the shadow of the stimulated hand stretches toward them in extrapersonal space. These findings suggest that shadows cast by a…
BACKGROUND: While the sense of bodily ownership has now been widely investigated through the rubber hand illusion (RHI), very little is known about the sense of disownership. It has been hypothesized that the RHI also affects the ownership feelings towards the participant's own hand, as if the rubber hand replaced the participant's actual hand. Somatosensory…
Cross-modal spatial integration between auditory and visual stimuli is a common phenomenon in space perception. The principles underlying such integration have been outlined by neurophysiological and behavioral studies in animals (Stein & Meredith, 1993), but little evidence exists that similar principles also operate in humans. In the present study,…
When a hand (either real or fake) is stimulated in synchrony with our own hand concealed from view, the felt position of our own hand can be biased toward the location of the seen hand. This intriguing phenomenon relies on the brain's ability to detect statistical correlations in the multisensory inputs (i.e., visual, tactile, and proprioceptive), but it is…
The portion of space that closely surrounds our body parts is termed peripersonal space, and it has been shown to be represented in the brain through multisensory processing systems. Here, we tested whether voluntary actions, such as grasping an object, may remap such multisensory spatial representation. Participants discriminated touches on the hand they…
Individuals with profound deafness rely critically on vision to interact with their environment. Improvement of visual performance as a consequence of auditory deprivation is assumed to result from cross-modal changes occurring in late stages of visual processing. Here we measured reaction times and event-related potentials (ERPs) in profoundly deaf adults…
In human adults, visual dominance emerges in several multisensory tasks. In children, auditory dominance has been reported up to 4 years of age. To establish when sensory dominance changes during development, 41 children (6-7, 9-10, and 11-12 years) were tested on the Colavita task (Experiment 1) and 32 children (6-7, 9-10, and 11-12 years) were tested on…