Eric Tardif

Interaural intensity and time differences (IID and ITD) are two binaural auditory cues for localizing sounds in space. This study investigated the spatio-temporal brain mechanisms for processing and integrating IID and ITD cues in humans. Auditory-evoked potentials were recorded, while subjects passively listened to noise bursts lateralized with IID, ITD or …
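As a rough illustration of how such cues can be imposed on a stimulus (a minimal Python sketch with an assumed sampling rate and assumed cue magnitudes, not the stimulus code used in the study), a white-noise burst can be lateralized with an ITD or an IID as follows:

```python
import numpy as np

fs = 44100                                 # sampling rate in Hz (assumed)
noise = np.random.randn(int(fs * 0.1))     # 100 ms white-noise burst

def apply_itd(burst, itd_s, fs):
    """Delay the right ear by itd_s seconds so the left ear leads."""
    delay = int(round(itd_s * fs))
    left = np.concatenate([burst, np.zeros(delay)])
    right = np.concatenate([np.zeros(delay), burst])
    return np.stack([left, right])

def apply_iid(burst, iid_db):
    """Attenuate the right ear by iid_db decibels so the left ear is louder."""
    gain = 10 ** (-iid_db / 20.0)
    return np.stack([burst, burst * gain])

stereo_itd = apply_itd(noise, itd_s=500e-6, fs=fs)   # 500 microsecond ITD (assumed value)
stereo_iid = apply_iid(noise, iid_db=10.0)           # 10 dB IID (assumed value)
```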
We investigated how synaptic plasticity is related to the neurodegeneration process in the human dorsolateral prefrontal cortex. Pre- and postsynaptic proteins of Brodmann's area 9 from patients with Alzheimer's disease (AD) and age-matched controls were quantified by immunohistochemical methods and Western blots. The main finding was a significant increase …
Partially segregated neuronal pathways ("what" and "where" pathways, respectively) are thought to mediate sound recognition and localization. Less studied are interactions between these pathways. In two experiments, we investigated whether near-threshold pitch discrimination sensitivity (d') is altered by supra-threshold task-irrelevant position differences …
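For reference, d' is the standard signal-detection sensitivity index, z(hit rate) − z(false-alarm rate); a minimal sketch with made-up response counts (not data from this study):

```python
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    hit_rate = hits / (hits + misses)
    fa_rate = false_alarms / (false_alarms + correct_rejections)
    # d' = z(hit rate) - z(false-alarm rate)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# Hypothetical counts for illustration only
print(d_prime(hits=35, misses=15, false_alarms=15, correct_rejections=35))  # ~1.05
```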
Auditory spatial representations are likely encoded at a population level within human auditory cortices. We investigated learning-induced plasticity of spatial discrimination in healthy subjects using auditory-evoked potentials (AEPs) and electrical neuroimaging analyses. Stimuli were 100 ms white-noise bursts lateralized with varying interaural time …
The human primary auditory cortex (AI) is surrounded by several other auditory areas, which can be identified by cyto-, myelo- and chemoarchitectonic criteria. We report here on the pattern of calcium-binding protein immunoreactivity within these areas. The supratemporal regions of four normal human brains (eight hemispheres) were processed histologically, …
We present a new brain segmentation framework, which we apply to T1-weighted magnetic resonance image segmentation. The innovation of the algorithm relative to the state of the art in unsupervised brain segmentation is twofold. First, the algorithm is entirely non-parametric and unsupervised. We can therefore enhance the classically used gray level …
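The abstract is truncated here, so the following is only a generic sketch of the idea of non-parametric, unsupervised gray-level clustering (mean shift on voxel intensities), not the framework the paper describes; the synthetic intensity data and the bandwidth are assumptions:

```python
import numpy as np
from sklearn.cluster import MeanShift

rng = np.random.default_rng(0)
# Synthetic 1-D stand-in for T1 voxel intensities: three tissue-like populations
intensities = np.concatenate([
    rng.normal(30, 5, 2000),    # CSF-like intensities
    rng.normal(80, 6, 3000),    # gray-matter-like intensities
    rng.normal(120, 6, 3000),   # white-matter-like intensities
]).reshape(-1, 1)

# Mean shift needs no preset class count and no parametric intensity model;
# it labels voxels by the modes of the intensity density.
labels = MeanShift(bandwidth=15, bin_seeding=True).fit_predict(intensities)
print(np.unique(labels))        # one label per detected intensity mode
```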
Auditory scene analysis requires the accurate encoding and comparison of the perceived spatial positions of sound sources. The electrophysiological correlates of auditory spatial discrimination and their relationship to performance accuracy were studied in humans by applying electrical neuroimaging analyses to auditory evoked potentials (AEPs) that were …
Auditory spatial deficits occur frequently after hemispheric damage; a previous case report suggested that the explicit awareness of sound positions, as in sound localisation, can be impaired while the implicit use of auditory cues for the segregation of sound objects in noisy environments remains preserved. By systematically assessing patients with a first …
When two sounds are presented sequentially within a short delay (approximately 10 ms), the listener perceives a single auditory event, the location of which is dominated by the directional information conveyed by the leading sound (the precedence effect, PE). The PE is not always instantaneous, but has been shown to build up across repetitions of lead-lag …
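As an illustration of the kind of stimulus involved (a sketch with assumed timing values, not the study's stimuli), a lead-lag click pair with opposing interaural delays can be constructed as follows:

```python
import numpy as np

fs = 44100                     # sampling rate in Hz (assumed)
itd_s = 400e-6                 # 400 microsecond ITD per click (assumed)
lag_delay_s = 0.008            # 8 ms lead-lag delay (assumed)

def lead_lag_pair(fs, itd_s, lag_delay_s, dur_s=0.05):
    left = np.zeros(int(fs * dur_s))
    right = np.zeros_like(left)
    itd = int(round(itd_s * fs))
    lag = int(round(lag_delay_s * fs))
    # Lead click: left ear leads, pointing the fused image to the left.
    left[0] += 1.0
    right[itd] += 1.0
    # Lag click: right ear leads, but its directional cue is largely suppressed.
    right[lag] += 1.0
    left[lag + itd] += 1.0
    return np.stack([left, right])

stereo = lead_lag_pair(fs, itd_s, lag_delay_s)
```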
Broca's area and its right hemisphere homologue comprise two cytoarchitectonic subdivisions, FDγ and FCBm of von Economo and Koskinas (1925, Die Cytoarchitektonik der Hirnrinde des erwachsenen Menschen. Vienna/Berlin: Springer). We report here on intrinsic connections within these areas, as revealed with biotinylated dextran amine and …