Parametrically Dissociating Speech and Nonspeech Perception in the Brain Using fMRI

@article{Benson2001ParametricallyDS,
  title={Parametrically Dissociating Speech and Nonspeech Perception in the Brain Using fMRI},
  author={Randall R. Benson and Douglas H. Whalen and Matthew L. Richardson and Brook Swainson and V. Clark and Song Lai and Alvin M. Liberman},
  journal={Brain and Language},
  year={2001},
  volume={78},
  pages={364-396}
}
Candidate brain regions constituting a neural network for preattentive phonetic perception were identified with fMRI and multivariate multiple regression of imaging data. Stimuli contrasted along speech/nonspeech, acoustic or phonetic complexity (three levels each), and natural/synthetic dimensions. Seven distributed brain regions' activity correlated with speech and speech complexity dimensions, including five left-sided foci [posterior superior temporal gyrus (STG), angular gyrus, ventral…
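
To make the analysis approach concrete, below is a minimal, hypothetical sketch of a multivariate multiple regression in which several simulated regional fMRI time series are regressed on parametric stimulus regressors (speech/nonspeech, acoustic complexity, phonetic complexity, natural/synthetic). This is not the authors' pipeline; all variable names, level codings, and data are illustrative assumptions.

```python
# Minimal sketch (assumed, not the authors' pipeline): multivariate multiple
# regression of several regional fMRI time series on parametric stimulus
# regressors. All data here are simulated placeholders.
import numpy as np

rng = np.random.default_rng(0)
n_scans = 120                                    # hypothetical number of volumes

# Hypothetical parametric regressors (codings are illustrative only):
speech   = rng.integers(0, 2, n_scans)           # nonspeech = 0, speech = 1
acoustic = rng.integers(1, 4, n_scans)           # acoustic complexity, levels 1-3
phonetic = speech * rng.integers(1, 4, n_scans)  # phonetic complexity, 0 for nonspeech
natural  = rng.integers(0, 2, n_scans)           # synthetic = 0, natural = 1

# Design matrix with an intercept column.
X = np.column_stack([np.ones(n_scans), speech, acoustic, phonetic, natural])

# Simulated activity for seven regions of interest (n_scans x n_regions).
Y = rng.standard_normal((n_scans, 7))

# A single least-squares fit yields a coefficient matrix B with shape
# (n_regressors x n_regions); column j describes how region j's signal
# scales with each stimulus dimension.
B, _, _, _ = np.linalg.lstsq(X, Y, rcond=None)

print("coefficient matrix shape:", B.shape)                      # (5, 7)
print("speech-dimension betas per region:", np.round(B[1], 3))
```

Each column of the coefficient matrix summarizes how one region's signal scales with the stimulus dimensions, analogous in spirit to correlating regional activity with the speech and complexity dimensions described in the abstract above.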

Citations

Defining a left-lateralized response specific to intelligible speech using fMRI.
TLDR
The results demonstrate that there are neural responses to intelligible speech along the length of the left lateral temporal neocortex, although the precise processing roles of the anterior and posterior regions cannot be determined from this study.
Distinct fMRI responses to laughter, speech, and sounds along the human peri-sylvian cortex.
Differentiation of speech and nonspeech processing within primary auditory cortex.
TLDR
Functional magnetic resonance imaging revealed different brain activation patterns in listening to speech and nonspeech sounds across different levels of complexity, consistent with the existence of a specialized speech system which bypasses more typical processes at the earliest cortical level.
Phonetic processing areas revealed by sinewave speech and acoustically similar non-speech
Neural bases of categorization of simple speech and nonspeech sounds
TLDR
fMRI results show that a core group of regions beyond the auditory cortices were preferentially activated for familiar speech categories and for novel nonspeech categories, and there is no inherent left hemisphere advantage in the categorical processing of speech stimuli, or for the categorization task itself.
Differentiation of speech and nonspeech processing within primary auditory cortex
Primary auditory cortex (PAC), located in Heschl's gyrus (HG), is the earliest cortical level at which sounds are processed. Standard theories of speech perception assume that signal components are…
Left middle temporal gyrus activation during a phonemic discrimination task
TLDR
Nine healthy volunteers discriminated sets of phonemes and tones interspersed with rest periods in a block-design paradigm while undergoing fMRI, revealing purely left-lateralized superior and middle temporal gyrus activations during phonemic discrimination.
Listening to speech activates motor areas involved in speech production
TLDR
The findings of this fMRI study support the view that the motor system is recruited in mapping acoustic inputs to a phonetic code.
…

References

SHOWING 1-10 OF 147 REFERENCES
Functional anatomy of inner speech and auditory verbal imagery
TLDR
The data suggest that the silent articulation of sentences involves activity in an area concerned with speech generation, while imagining speech is associated with additional activity in regions associated with speech perception.
Lateralization of phonetic and pitch discrimination in speech processing.
TLDR
Processing changes in pitch produced activation of the right prefrontal cortex, consistent with the importance of right-hemisphere mechanisms in pitch perception.
Lateralization of Speech and Auditory Temporal Processing
TLDR
Evidence is provided that auditory processing of rapid acoustic transitions is lateralized in the human brain from the lowest levels of cortical processing.
Function of the left planum temporale in auditory and linguistic processing
Differential fMRI Responses in the Left Posterior Superior Temporal Gyrus and Left Supramarginal Gyrus to Habituation and Change Detection in Syllables and Tones
TLDR
The analysis of the decreases and increases in the BOLD signal across the STD, DEV, and rest conditions suggests that the left posterior superior temporal gyrus is implicated in the preattentive detection of acoustic changes in both speech and nonspeech stimuli, whereas the left supramarginal gyrus is more specifically engaged in detecting changes in phonological units.
PET studies of phonetic processing of speech: review, replication, and reanalysis.
TLDR
The findings support a model whereby articulatory processes involving a portion of Broca's area are important when phonetic segments must be extracted and manipulated, whereas left posterior temporal cortex is involved in perceptual analysis of speech.
Functional magnetic resonance imaging of human auditory cortex
TLDR
The utility of magnetic resonance imaging in the study of human brain structure-function relationships is confirmed, and the role of the superior temporal gyrus in the perception of acoustic-phonetic features of speech, rather than the processing of semantic features, is emphasized.
Right hemisphere speech perception revealed by amobarbital injection and electrical interference
TLDR
The results suggest a functionally symmetric, parallel system in the adult brain with preferential use of left hemispheric pathways for speech perception.
The Cortical Representation of Speech
In this study, we compare regional cerebral blood flow (rCBF) while French monolingual subjects listen to continuous speech in an unknown language, to lists of French words, or to meaningful and…
PET Studies of Auditory and Phonological Processing: Effects of Stimulus Characteristics and Task Demands
TLDR
A converging study involving performance of orthographic and phonological word discrimination tasks supports anatomical and behavioral evidence suggesting the left frontal opercular region is important for certain types of auditory/temporal analysis, as well as high-level articulatory coding.
…