Athena Vouloumanos

A language learner trying to acquire a new word must often sift through many potential relations between particular words and their possible meanings. In principle, statistical information about the distribution of those mappings could serve as one important source of data, but little is known about whether learners can in fact track multiple word-referent …
The detection of speech in an auditory stream is a requisite first step in processing spoken language. In this study, we used event-related fMRI to investigate the neural substrates mediating detection of speech compared with that of nonspeech auditory stimuli. Unlike previous studies addressing this issue, we contrasted speech with nonspeech analogues that …
An essential part of the human capacity for language is the ability to link conceptual or semantic representations with syntactic representations. On the basis of data from spontaneous production, it has been suggested that young children acquire such links on a verb-by-verb basis, with little in the way of a general understanding of linguistic argument structure. …
Perceptual experiences in one modality are often dependent on activity from other sensory modalities. These cross-modal correspondences are also evident in language. Adults and toddlers spontaneously and consistently map particular words (e.g., 'kiki') to particular shapes (e.g., angular shapes). However, the origins of these systematic mappings are …
This study shows that 4- and 6-month-old infants can discriminate languages (English from French) just from viewing silently presented articulations. By the age of 8 months, only bilingual (French-English) infants succeed at this task. These findings reveal a surprisingly early preparedness for visual language discrimination and highlight infants' …
The goal of this study was to explore the ability to discriminate languages using the visual correlates of speech (i.e., speech-reading). Participants were presented with silent video clips of an actor pronouncing two sentences (in Catalan and/or Spanish) and were asked to judge whether the sentences were in the same language or in different languages. Our …
The nature and origin of the human capacity for acquiring language is not yet fully understood. Here we uncover early roots of this capacity by demonstrating that humans are born with a preference for listening to speech. Human neonates adjusted their high-amplitude sucking to preferentially listen to speech, compared with complex non-speech analogues that …
In everyday word learning, words are only sometimes heard in the presence of their referent, making the acquisition of novel words a particularly challenging task. The current study investigated whether children (18-month-olds who are novice word learners) can track the statistics of co-occurrence between words and objects to learn novel mappings in a …
Adult humans recognize that even unfamiliar speech can communicate information between third parties, demonstrating an ability to separate communicative function from linguistic content. We examined whether 12-month-old infants understand that speech can communicate before they understand the meanings of specific words. Specifically, we test the …
How does the brain's response to speech change over the first months of life? Although behavioral findings indicate that neonates' listening biases are sharpened over the first months of life, with a species-specific preference for speech emerging by 3 months, the neural substrates underlying this developmental change are unknown. We examined neural …