Speech Perception

David Poeppel and Philip J. Monahan. Speech Perception. Current Directions in Psychological Science, pp. 80–85.
Speech perception includes, minimally, the set of computations that transform continuously varying acoustic signals into linguistic representations that can be used for subsequent processing. The auditory and motor subroutines of this complex perceptual process are executed in a network of brain areas organized in ventral and dorsal parallel pathways, performing sound-to-meaning and sound-to-motor mappings, respectively. Research on speech using neurobiological techniques argues against narrow… 


Early recognition of speech
This work has shown that the perceptual organization of speech is keyed to modulation; that it is fast, unlearned, nonsymbolic, and indifferent to short-term auditory properties; and that this organization requires attention.
On The Way To Linguistic Representation: Neuromagnetic Evidence of Early Auditory Abstraction in the Perception of Speech and Pitch
It is shown that even at the earliest (non-invasively) recordable stages of auditory cortical processing, cortex is already computing abstract representations from the acoustic signal.
Encoding speech through brain rhythms
A neural microcircuit model of nested oscillations for early auditory processing, involving a fast gamma rhythm coupled to a slow theta rhythm, found that bottom-up information is dominated by gamma oscillations, while top-down flow is conveyed through theta oscillations.
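The nested-oscillation idea can be illustrated with a toy simulation (an illustrative sketch, not the authors' microcircuit model): a slow theta rhythm whose phase modulates the amplitude of a fast gamma rhythm. The frequencies and the coupling function below are assumptions chosen for clarity.

```python
import math

# Toy theta-gamma coupling (illustrative assumptions, not a fitted model):
# a slow theta oscillation gates the amplitude of a fast gamma oscillation.
THETA_HZ = 5.0    # slow rhythm, roughly syllabic rate
GAMMA_HZ = 40.0   # fast rhythm carrying fine-grained bottom-up detail
FS = 1000         # samples per second

def nested_signal(t: float) -> float:
    """Gamma amplitude is modulated by the instantaneous theta phase."""
    theta_phase = 2 * math.pi * THETA_HZ * t
    # Gamma envelope peaks at theta peaks (envelope ranges over 0..1).
    envelope = 0.5 * (1 + math.cos(theta_phase))
    gamma = math.sin(2 * math.pi * GAMMA_HZ * t)
    return math.cos(theta_phase) + envelope * gamma

signal = [nested_signal(n / FS) for n in range(FS)]  # one second of signal
```

Plotting `signal` shows gamma bursts riding on the crests of the theta wave, the nesting pattern the model describes.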
Cognitive Phonetics: The Transduction of Distinctive Features at the Phonology-Phonetics Interface
It is argued that CP augments the study of certain phonetic phenomena, most notably coarticulation, and it is suggested that some phenomena usually considered phonological receive better explanations within CP.
The missing link in the embodiment of syntax: Prosody
Investigation of the spatiotemporal dynamics of the brain during perceiving words
The authors' findings demonstrate a very early stage of speech processing and reveal the dynamics of interaction between the two streams during word perception, consistent with the dual-stream model.
Speech-brain synchronization: a possible cause for developmental dyslexia
Dyslexia is a neurological learning disability characterized by difficulty in an individual's ability to read despite adequate intelligence and normal opportunities. The majority of dyslexic
Current advances in neurolinguistics: the use of electroencephalography (EEG) to study language
This article discusses how cognitive neuroscience techniques are used to study language with the aim of showing its advantages to neurolinguistics. It focuses on Electroencephalography (EEG) because
Auditory agnosia.


Speech perception at the interface of neurobiology and linguistics
One important goal of the research programme is to develop linking hypotheses between putative neurobiological primitives and those primitives derived from linguistic inquiry, to arrive ultimately at a biologically sensible and theoretically satisfying model of representation and computation in speech.
Listening to speech activates motor areas involved in speech production
The findings of this fMRI study support the view that the motor system is recruited in mapping acoustic inputs to a phonetic code.
Identification of a pathway for intelligible speech in the left temporal lobe.
It is demonstrated that the left superior temporal sulcus responds to the presence of phonetic information, but its anterior part responds only if the stimulus is also intelligible, revealing a left anterior temporal pathway for speech comprehension.
The cortical organization of speech processing
A dual-stream model of speech processing is outlined that assumes that the ventral stream is largely bilaterally organized (although there are important computational differences between the left- and right-hemisphere systems) and that the dorsal stream is strongly left-hemisphere dominant.
The motor theory of speech perception revised
Visual speech speeds up the neural processing of auditory speech.
The observed auditory-visual data support the view that there exist abstract internal representations that constrain the analysis of subsequent speech inputs, and provide evidence for the existence of an "analysis-by-synthesis" mechanism in auditory-visual speech perception.
Phase Patterns of Neuronal Responses Reliably Discriminate Speech in Human Auditory Cortex
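The phase-pattern result can be illustrated with a toy decoder (a hedged sketch on synthetic signals, not the study's data or analysis pipeline): two "stimuli" are distinguished purely by the phase trajectory of their theta-band response, and a new trial is classified by nearest circular phase distance. The sampling rate, window length, frequency, and noise level are illustrative assumptions.

```python
import cmath
import math
import random

FS = 200   # samples per second (assumption)
WIN = 40   # 200-ms analysis windows; DFT bin 1 of each window is 5 Hz

def theta_phases(signal):
    """Phase of the ~5 Hz DFT component in consecutive windows."""
    phases = []
    for start in range(0, len(signal) - WIN + 1, WIN):
        win = signal[start:start + WIN]
        comp = sum(x * cmath.exp(-2j * math.pi * n / WIN)
                   for n, x in enumerate(win))
        phases.append(cmath.phase(comp))
    return phases

def phase_distance(p, q):
    """Mean circular distance between two phase sequences."""
    return sum(1 - math.cos(a - b) for a, b in zip(p, q)) / len(p)

def make_trial(offset, seed, noise=0.3):
    """Noisy 5 Hz 'response' whose theta phase encodes the stimulus."""
    rng = random.Random(seed)
    return [math.sin(2 * math.pi * 5 * n / FS + offset) + rng.gauss(0, noise)
            for n in range(FS)]

# The two stimuli evoke identical amplitudes; only theta phase differs.
template_a = theta_phases(make_trial(0.0, seed=1))
template_b = theta_phases(make_trial(math.pi, seed=2))
test_trial = theta_phases(make_trial(0.0, seed=3))  # fresh trial of stimulus A

label = ("A" if phase_distance(test_trial, template_a)
               < phase_distance(test_trial, template_b) else "B")
```

Because amplitude carries no stimulus information here, any successful classification must come from the phase pattern alone, which is the logic of the discrimination result above.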
Musical experience shapes human brainstem encoding of linguistic pitch patterns
This work examined brainstem encoding of linguistic pitch and found that musicians show more robust and faithful encoding than nonmusicians. This suggests a possible reciprocity between corticofugal tuning for speech and for music, and offers a neurophysiological explanation for musicians' advantage in language learning.
The influence of meaning on the perception of speech sounds.
Using magnetoencephalographic brain recordings with speakers of Russian and Korean, it is demonstrated that a speaker's perceptual space is shaped not only by bottom-up analysis of the distribution of sounds in his language but also by more abstract analysis of the functional significance of those sounds.