The perception of emotions by ear and by eye
Emotions are expressed in the voice as well as on the face. As a first step to explore the question of their integration, we used a bimodal perception situation modelled after the McGurk paradigm, in …
Neural Correlates of Multisensory Integration of Ecologically Valid Audiovisual Events
A question that has emerged over recent years is whether audiovisual (AV) speech perception is a special case of multisensory perception.
Recalibration of temporal order perception by exposure to audio-visual asynchrony.
The perception of simultaneity between auditory and visual information is of crucial importance for maintaining a coordinated representation of a multisensory event. Here we show that the perceptual …
Visual Recalibration of Auditory Speech Identification
The kinds of aftereffects, indicative of cross-modal recalibration, that are observed after exposure to spatially incongruent inputs from different sensory modalities have not been demonstrated so …
Visual Anticipatory Information Modulates Multisensory Interactions of Artificial Audiovisual Stimuli
The neural activity of speech sound processing (the N1 component of the auditory ERP) can be suppressed if a speech sound is accompanied by concordant lip movements and the temporal occurrence of this audiovisual event is made predictable.
Electrophysiological evidence for speech-specific audiovisual integration
Lip-read speech is integrated with heard speech at various neural levels. Here, we investigated the extent to which lip-read induced modulations of the auditory N1 and P2 (measured with EEG) are …
The combined perception of emotion from voice and face: early interaction revealed by human electric brain responses
Judgement of the emotional tone of a spoken utterance is influenced by a simultaneously presented facial expression. The time course of this integration was investigated by measuring the mismatch …
Auditory Cortex Encodes the Perceptual Interpretation of Ambiguous Sound
The confounding of physical stimulus characteristics and perceptual interpretations of stimuli poses a problem for most neuroscientific studies of perception. In the auditory domain, this pertains to …
The time‐course of intermodal binding between seeing and hearing affective information
Intermodal binding between affective information that is seen as well as heard triggers a mandatory process of audiovisual integration. In order to track the time course of this audiovisual binding, …
Audio-visual integration in schizophrenia
Integration of information provided simultaneously by audition and vision was studied in a group of 18 schizophrenic patients. They were compared to a control group, consisting of 12 normal adults of …