Hearing speech sounds: Top-down influences on the interface between audition and speech perception

@article{Davis2007HearingSS,
  title={Hearing speech sounds: Top-down influences on the interface between audition and speech perception},
  author={Matthew H. Davis and Ingrid S. Johnsrude},
  journal={Hearing Research},
  year={2007},
  volume={229},
  pages={132--147}
}

Citations

Cortical Measures of Phoneme-Level Speech Encoding Correlate with the Perceived Clarity of Natural Speech
TLDR
The availability of prior information enhanced the perceived clarity of degraded speech, and this enhancement was positively correlated with changes in phoneme-level encoding across subjects, suggesting that prior information affects the early encoding of natural speech in a dual manner.
Physical and Perceptual Factors Shape the Neural Mechanisms That Integrate Audiovisual Signals in Speech Comprehension
TLDR
Audiovisual speech comprehension emerges from an interactive process in which the integration of auditory and visual signals is progressively constrained by stimulus intelligibility along the STS and by spectrotemporal structure in a dorsal fronto-temporal circuitry.
Semantic Context Enhances the Early Auditory Encoding of Natural Speech
TLDR
A novel approach combines a recently introduced method for quantifying the semantic context of speech with a commonly used method for indexing low-level auditory encoding of speech, suggesting a mechanism that links top-down prior information with bottom-up sensory processing during natural, narrative speech listening.
Keynote 1: Hierarchical abstraction in speech perception: evidence from degraded speech
What cognitive and neural processes mediate between acoustic representations of speech and the lexical representations that grant access to the meaning of spoken words? Traditional views have argued ...
Fast frequency modulation is encoded according to the listener expectations in the human subcortical auditory pathway
TLDR
The results indicate that FM-sweeps are already encoded at the level of the human auditory midbrain, that this encoding is mainly driven by subjective expectations, and that the subcortical auditory pathway is therefore integrated into the cortical network of predictive speech processing.
Hierarchical Organization of Auditory and Motor Representations in Speech Perception: Evidence from Searchlight Similarity Analysis
TLDR
Searchlight representational similarity analysis (RSA) is used to localize and characterize neural representations of syllables at different levels of the hierarchically organized temporo-frontal pathways for speech perception, revealing evidence for a graded hierarchy of abstraction across the brain.
The link between healthy speech perception and pathological voice hearing - a pivotal role for Broca's region
TLDR
It is demonstrated that Broca's region is multifunctional, forming an important hub both in top-down driven speech perception, where degraded bottom-up speech signals are restored to meaningful sentences based on lexical-semantic expectations, and in the pathology of auditory verbal hallucinations (AVH), where speech is perceived in the absence of bottom-up sensory signals.
Hierarchical Processing for Speech in Human Auditory Cortex and Beyond
TLDR
The clear and vocoded sentences used by Okada et al. (2010) provided two physically dissimilar presentations of intelligible speech that the authors could use to identify acoustically insensitive neural responses; spectrally rotated stimuli allowed the authors to look for response changes due to intelligibility, independent of reductions in spectral detail.
...

References

Showing 1-10 of 186 references
Speech perception at the interface of neurobiology and linguistics
TLDR
One important goal of the research programme is to develop linking hypotheses between putative neurobiological primitives and those primitives derived from linguistic inquiry, to arrive ultimately at a biologically sensible and theoretically satisfying model of representation and computation in speech.
The neuroanatomical and functional organization of speech perception
Motor cortex maps articulatory features of speech sounds
TLDR
Sound-related somatotopic activation in precentral gyrus shows that, during speech perception, specific motor circuits are recruited that reflect phonetic distinctive features of the speech sounds encountered, thus providing direct neuroimaging support for specific links between the phonological mechanisms for speech perception and production.
Speech motor control: Acoustic goals, saturation effects, auditory feedback and internal models
Perceptual learning of noise vocoded words: effects of feedback and lexicality.
TLDR
Findings point to the crucial involvement of phonological short-term memory and top-down processes in the perceptual learning of noise-vocoded speech.
Contributions of sensory input, auditory search and verbal comprehension to cortical activity during speech processing.
TLDR
A neuroanatomical framework is delineated for the functional components at work during natural speech processing, i.e. when comprehension results from concurrent acoustic processing and effortful auditory search.
Hierarchical Processing in Spoken Language Comprehension
TLDR
Functional magnetic resonance imaging is used to explore the brain regions that are involved in spoken language comprehension, fractionating this system into sound-based and more abstract higher-level processes.
...