Visually challenged individuals often compensate for their handicap by developing supra-normal abilities in their remaining sensory systems. Here, we examined the scalp distribution of components N1 and P3 of auditory evoked potentials during a sound localization task in four totally blind subjects who had previously shown better performance than sighted …
Viewing a speaker's articulatory movements substantially improves a listener's ability to understand spoken words, especially under noisy environmental conditions. It has been claimed that this gain is most pronounced when auditory input is weakest, an effect that has been related to a well-known principle of multisensory integration--"inverse …
Although emotions are usually recognized by combining facial and vocal expressions, the multisensory nature of affect perception has scarcely been investigated. In the present study, we report the results of three experiments on the multisensory perception of emotions using newly validated sets of dynamic visual and non-linguistic vocal clips of …
In everyday life, we continuously and effortlessly integrate the multiple sensory inputs from objects in motion. For instance, the sound and the visual percept of vehicles in traffic provide us with complementary information about the location and motion of vehicles. Here, we used high-density electrical mapping and local auto-regressive average (LAURA) …
BACKGROUND The beneficial effects of prenatal and early postnatal intakes of omega-3 (n-3) polyunsaturated fatty acids (PUFAs) on cognitive development during infancy are well recognized. However, few studies have examined the extent to which these benefits continue to be evident in childhood. OBJECTIVE The aim of this study was to examine the relation of …
BACKGROUND Viewing a speaker's articulatory movements substantially improves a listener's ability to understand spoken words, especially under noisy environmental conditions. In this study we investigated the ability of patients with schizophrenia to integrate visual and auditory speech. Our objective was to determine to what extent they experience benefit …
Seeing a speaker's facial articulatory gestures powerfully affects speech perception, helping us overcome noisy acoustical environments. One particularly dramatic illustration of visual influences on speech perception is the "McGurk illusion", where dubbing an auditory phoneme onto video of an incongruent articulatory movement can often lead to illusory …
A major determinant of multisensory integration, derived from single-neuron studies in animals, is the principle of inverse effectiveness (IE), which describes the phenomenon whereby maximal multisensory response enhancements occur when the constituent unisensory stimuli are minimally effective in evoking responses. Human behavioral studies, which have …
BACKGROUND Both executive function deficits and slower processing speed are characteristic of children with fetal alcohol exposure, but the temporal dynamics of neural activity underlying cognitive processing deficits in fetal alcohol spectrum disorder have rarely been studied. To this end, event-related potentials (ERPs) were used to examine the nature of …
Under noisy listening conditions, visualizing a speaker's articulations substantially improves speech intelligibility. This multisensory speech integration ability is crucial to effective communication, and the appropriate development of this capacity greatly impacts a child's ability to successfully navigate educational and social settings. Research shows …