Visually challenged individuals often compensate for their handicap by developing supra-normal abilities in their remaining sensory systems. Here, we examined the scalp distribution of components N1 and P3 of auditory evoked potentials during a sound localization task in four totally blind subjects who had previously shown better performance than sighted…
Viewing a speaker's articulatory movements substantially improves a listener's ability to understand spoken words, especially under noisy environmental conditions. It has been claimed that this gain is most pronounced when auditory input is weakest, an effect that has been related to a well-known principle of multisensory integration, "inverse effectiveness"…
Although emotions are usually recognized by combining facial and vocal expressions, the multisensory nature of affect perception has scarcely been investigated. In the present study, we present results of three experiments on multisensory perception of emotions using newly validated sets of dynamic visual and non-linguistic vocal clips of…
BACKGROUND Viewing a speaker's articulatory movements substantially improves a listener's ability to understand spoken words, especially under noisy environmental conditions. In this study we investigated the ability of patients with schizophrenia to integrate visual and auditory speech. Our objective was to determine to what extent they experience benefit…
A major determinant of multisensory integration, derived from single-neuron studies in animals, is the principle of inverse effectiveness (IE), which describes the phenomenon whereby maximal multisensory response enhancements occur when the constituent unisensory stimuli are minimally effective in evoking responses. Human behavioral studies, which have…
Seeing a speaker's facial articulatory gestures powerfully affects speech perception, helping us overcome noisy acoustical environments. One particularly dramatic illustration of visual influences on speech perception is the "McGurk illusion", where dubbing an auditory phoneme onto video of an incongruent articulatory movement can often lead to illusory…
In everyday life, we continuously and effortlessly integrate the multiple sensory inputs from objects in motion. For instance, the sound and the visual percept of vehicles in traffic provide us with complementary information about the location and motion of vehicles. Here, we used high-density electrical mapping and local auto-regressive average (LAURA)…
Methylmercury (MeHg) and polychlorinated biphenyls (PCBs) are seafood contaminants known for their adverse effects on neurodevelopment. This study examines the relation of developmental exposure to these contaminants to information processing assessed with event-related potentials (ERPs) in school-aged Inuit children from Nunavik (Arctic Québec). In a…
BACKGROUND Lead (Pb) and polychlorinated biphenyls (PCBs) are neurotoxic contaminants that have been related to impairment in response inhibition. OBJECTIVES In this study we examined the neurophysiological correlates of the response inhibition deficits associated with these exposures, using event-related potentials (ERPs) in a sample of school-age Inuit…
BACKGROUND We assessed central and peripheral visual field processing in children with epilepsy who were exposed to vigabatrin during infancy. METHODS Steady-state visual evoked potentials and pattern electroretinograms to field-specific radial checkerboards flickering at two cycle frequencies (7.5 and 6 Hz for central and peripheral stimulations, respectively)…