Visually challenged individuals often compensate for their handicap by developing supra-normal abilities in their remaining sensory systems. Here, we examined the scalp distribution of components N1 and P3 of auditory evoked potentials during a sound localization task in four totally blind subjects who had previously shown better performance than sighted…
Viewing a speaker's articulatory movements substantially improves a listener's ability to understand spoken words, especially under noisy environmental conditions. It has been claimed that this gain is most pronounced when auditory input is weakest, an effect that has been related to a well-known principle of multisensory integration--"inverse…
Although emotions are usually recognized by combining facial and vocal expressions, the multisensory nature of affect perception has scarcely been investigated. In the present study, we show results of three experiments on multisensory perception of emotions using newly validated sets of dynamic visual and non-linguistic vocal clips of…
BACKGROUND Viewing a speaker's articulatory movements substantially improves a listener's ability to understand spoken words, especially under noisy environmental conditions. In this study we investigated the ability of patients with schizophrenia to integrate visual and auditory speech. Our objective was to determine to what extent they experience benefit…
A major determinant of multisensory integration, derived from single-neuron studies in animals, is the principle of inverse effectiveness (IE), which describes the phenomenon whereby maximal multisensory response enhancements occur when the constituent unisensory stimuli are minimally effective in evoking responses. Human behavioral studies, which have…
Seeing a speaker's facial articulatory gestures powerfully affects speech perception, helping us overcome noisy acoustical environments. One particularly dramatic illustration of visual influences on speech perception is the "McGurk illusion", where dubbing an auditory phoneme onto video of an incongruent articulatory movement can often lead to illusory…
In everyday life, we continuously and effortlessly integrate the multiple sensory inputs from objects in motion. For instance, the sound and the visual percept of vehicles in traffic provide us with complementary information about the location and motion of vehicles. Here, we used high-density electrical mapping and local auto-regressive average (LAURA)…
BACKGROUND Both executive function deficits and slower processing speed are characteristic of children with fetal alcohol exposure, but the temporal dynamics of neural activity underlying cognitive processing deficits in fetal alcohol spectrum disorder have rarely been studied. To this end, event-related potentials (ERPs) were used to examine the nature of…
The Inuit of Nunavik (Northern Québec) are among the most highly exposed populations to environmental contaminants in North America, mainly due to the bioaccumulation of contaminants in the fish and marine mammals that constitute an important part of their diet. This follow-up study aimed to assess the impact of exposure to contaminants on visual brain…
Methylmercury (MeHg) and polychlorinated biphenyls (PCBs) are seafood contaminants known for their adverse effects on neurodevelopment. This study examines the relation of developmental exposure to these contaminants to information processing assessed with event-related potentials (ERPs) in school-aged Inuit children from Nunavik (Arctic Québec). In a…