When the apparent visual location of a body part conflicts with its veridical location, vision can dominate proprioception and kinesthesia. In this article, we show that vision can capture tactile localization. Participants discriminated the location of vibrotactile stimuli (upper, at the index finger, vs. lower, at the thumb), while ignoring distractor …
Across three experiments, participants made speeded elevation discrimination responses to vibrotactile targets presented to the thumb (held in a lower position) or the index finger (upper position) of either hand, while simultaneously trying to ignore visual distractors presented independently from either the same or a different elevation. Performance on …
The authors report a series of 6 experiments investigating crossmodal links between vision and touch in covert endogenous spatial attention. When participants were informed that visual and tactile targets were more likely on 1 side than the other, speeded discrimination responses (continuous vs. pulsed, Experiments 1 and 2; or up vs. down, Experiment 3) for …
In the present study, we investigated the effects of the Titchener circles illusion in perception and action. In this illusion, two identical discs can be perceived as being different in size when one is surrounded by an annulus of smaller circles and the other is surrounded by an annulus of larger circles. This classic size-contrast illusion, known as …
In order to determine precisely the location of a tactile stimulus presented to the hand, it is necessary to know not only which part of the body has been stimulated, but also where that part of the body lies in space. This involves the multisensory integration of visual, tactile, proprioceptive, and even auditory cues regarding limb position. In recent …
Perception of movement in acoustic space depends on comparison of the sound waveforms reaching the two ears (binaural cues) as well as spectrotemporal analysis of the waveform at each ear (monaural cues). The relative importance of these two cues is different for perception of vertical or horizontal motion, with spectrotemporal analysis likely to be more …
In close analogy with neurophysiological findings in monkeys, neuropsychological studies have shown that the human brain constructs visual maps of space surrounding different body parts. In right-brain-damaged patients with tactile extinction, the existence of a visual peripersonal space centred on the hand has been demonstrated by showing that cross-modal …
When a hand (either real or fake) is stimulated in synchrony with our own hand concealed from view, the felt position of our own hand can be biased toward the location of the seen hand. This intriguing phenomenon relies on the brain's ability to detect statistical correlations in the multisensory inputs (i.e., visual, tactile, and proprioceptive), but it is …
In a study that builds on recent cognitive neuroscience research on body perception and social psychology research on social relations, we tested the hypothesis that synchronous multisensory stimulation leads to self-other merging. We brushed the cheek of each study participant as he or she watched a stranger's cheek being brushed in the same way, either in …
Cross-modal spatial integration between auditory and visual stimuli is a common phenomenon in space perception. The principles underlying such integration have been outlined by neurophysiological and behavioral studies in animals (Stein & Meredith, 1993), but little evidence exists that similar principles also operate in humans. In the present study, …