Audiovisual integration in human superior temporal sulcus: Inverse effectiveness and the neural processing of speech and object recognition

@article{Stevenson2009AudiovisualII,
  title={Audiovisual integration in human superior temporal sulcus: Inverse effectiveness and the neural processing of speech and object recognition},
  author={Ryan A. Stevenson and Thomas W. James},
  journal={NeuroImage},
  year={2009},
  volume={44},
  pages={1210--1223}
}
Auditory, Visual and Audiovisual Speech Processing Streams in Superior Temporal Sulcus
TLDR
The results suggest that auditory and visual speech representations are elaborated gradually within anterior and posterior processing streams, respectively, and may be integrated within the mSTS, which is sensitive to more abstract speech information within and across presentation modalities.
Physical and Perceptual Factors Shape the Neural Mechanisms That Integrate Audiovisual Signals in Speech Comprehension
TLDR
Audiovisual speech comprehension emerges from an interactive process in which the integration of auditory and visual signals is progressively constrained by stimulus intelligibility along the STS and by spectrotemporal structure in dorsal fronto-temporal circuitry.
Dynamic Changes in Superior Temporal Sulcus Connectivity during Perception of Noisy Audiovisual Speech
TLDR
These results matched those of a behavioral experiment in which the perception of incongruent audiovisual syllables was biased toward the more reliable modality, even with rapidly changing reliability, indicating that changes in STS functional connectivity may be an important neural mechanism underlying the perception of noisy speech.
Neural correlates of multisensory enhancement in audiovisual narrative speech perception: a fMRI investigation
This fMRI study investigated the effect of seeing a speaker's articulatory movements while listening to a naturalistic narrative stimulus, with the goal of identifying regions of the language network ...
Neural Mechanisms of Audiovisual Integration in Integrated Processing for Verbal Perception and Spatial Factors
TLDR
This chapter summarizes the neural mechanisms of audiovisual integration and the factors that influence it, including the relationship between the spatial location of auditory and visual stimuli and verbal perception, and the combination of virtual reality technology with audiovisual integration.
Examinations of Audiovisual Speech Processes, the McGurk Effect and the Heteromodal Superior Temporal Sulcus in the Human Brain Across Numerous Approaches
TLDR
This dissertation explores the fundamental question of how the brain processes audiovisual (AV) speech across different approaches, and finds that bilateral posterior temporal regions, the left inferior parietal lobule, and other dorsal-stream regions are recruited for conflicting AV speech.
On the variability of the McGurk effect: audiovisual integration depends on prestimulus brain states.
TLDR
The McGurk effect depends on fluctuating brain states, suggesting that the functional connectedness of the left STS at a prestimulus stage is crucial for an audiovisual percept.
Cortical integration of audio–visual speech and non-speech stimuli
...

References

Showing 1-10 of 82 references
Superadditive BOLD activation in superior temporal sulcus with threshold non-speech objects
TLDR
Using video clips and sounds of handheld tools presented at psychophysical threshold, audiovisual objects elicited BOLD activation that surpassed the sum of the BOLD activations to the auditory and visual stimuli presented independently.
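For orientation, "superadditive" here carries its standard meaning in the multisensory literature: the response to the combined stimulus exceeds the sum of the unisensory responses. A minimal statement in LaTeX, with illustrative symbols rather than the paper's own notation:

% Superadditivity criterion for a candidate integration site:
% the audiovisual BOLD response exceeds the sum of the
% unisensory responses.
\[
  \beta_{AV} \;>\; \beta_{A} + \beta_{V}
\]

where \(\beta_{AV}\), \(\beta_{A}\), and \(\beta_{V}\) denote the estimated BOLD responses to the audiovisual, auditory-only, and visual-only conditions, respectively.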
Detection of Audio-Visual Integration Sites in Humans by Application of Electrophysiological Criteria to the BOLD Effect
TLDR
The efficacy of using an analytic approach informed by electrophysiology to identify multisensory integration sites in humans is demonstrated, and the particular network of brain areas implicated in these crossmodal integrative processes is suggested to depend on the nature of the correspondence between the different sensory inputs.
The neuroanatomical and functional organization of speech perception
Perceptual Fusion and Stimulus Coincidence in the Cross-Modal Integration of Speech
TLDR
In an event-related functional magnetic resonance imaging study, neural systems that evaluate the cross-modal coincidence of physical stimuli were differentiated from those that mediate perceptual binding.
Spatial and temporal factors during processing of audiovisual speech: a PET study
“Acoustical vision” of below threshold stimuli: interaction among spatially converging audiovisual inputs
TLDR
An enhancement of perceptual sensitivity (d′) for luminance detection was found when the audiovisual stimuli followed the simple spatial and temporal rules governing multisensory integration at the neuronal level.
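For context, d′ is the standard sensitivity index from signal detection theory; the definition below is general background rather than a formula taken from the study:

% Sensitivity index from signal detection theory.
\[
  d' = z(H) - z(F)
\]

where \(z\) is the inverse of the standard normal cumulative distribution function, \(H\) is the hit rate, and \(F\) is the false-alarm rate. An audiovisual enhancement of \(d'\) therefore reflects improved detectability rather than a mere shift in response bias.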
Integration of Visual and Auditory Information by Superior Temporal Sulcus Neurons Responsive to the Sight of Actions
TLDR
This work investigated whether STS neurons coding the sight of actions also integrate the sound of those actions; the results suggest that neurons in the STS form multisensory representations of observed actions.
Integration of Touch and Sound in Auditory Cortex
The role of the posterior superior temporal sulcus in audiovisual processing.
TLDR
It is concluded that, when stimulus input, task, and attention are controlled, pSTS is part of a distributed set of regions involved in conceptual matching, irrespective of whether the stimuli are audiovisual, auditory-auditory or visual-visual.
...