Seeing speech: visual information from lip movements modifies activity in the human auditory cortex

@article{Sams1991SeeingSV,
  title={Seeing speech: visual information from lip movements modifies activity in the human auditory cortex},
  author={Mikko Sams and Reijo Aulanko and Matti S. H{\"a}m{\"a}l{\"a}inen and Riitta Hari and Olli V. Lounasmaa and Sing Teh Lu and Juha Simola},
  journal={Neuroscience Letters},
  year={1991},
  volume={127},
  pages={141--145}
}
Neuromagnetic responses were recorded over the left hemisphere to find out in which cortical area heard and seen speech are integrated. Auditory stimuli were Finnish /pa/ syllables presented together with a videotaped face articulating either the concordant syllable /pa/ (84% of stimuli, V = A) or the discordant syllable /ka/ (16%, V ≠ A). In some subjects the probabilities were reversed. The subjects heard the V ≠ A stimuli as /ta/ or /ka/. The magnetic responses to infrequent…

Citations

Bimodal speech: early suppressive visual effects in human auditory cortex
Bimodal syllables were identified more rapidly than auditory-alone stimuli, and the latency of the effect indicates that integration operates at pre-representational stages of stimulus analysis, probably via feedback projections from visual and/or polymodal areas.
The effect of viewing speech on auditory speech processing is different in the left and right hemispheres
It is suggested that the effect of processing visual speech seen in the right hemisphere likely reflects suppression of the auditory response based on AV cues for place of articulation.
Time course of multisensory interactions during audiovisual speech perception in humans: a magnetoencephalographic study
Differences between AV and A+V responses were found bilaterally in the auditory cortices and in the right superior temporal sulcus (STS) 250–600 ms after stimulus onset, showing that both sensory-specific and multisensory regions of the human temporal cortices are involved in AV speech processing.
Primary auditory cortex activation by visual speech: an fMRI study at 3 T
Visual speech perception activated Heschl's gyri in nine subjects, with activation in seven of them extending to the area of primary auditory cortex, suggesting left Heschl's gyrus specialization for visual speech processing.
Auditory-visual speech perception examined by fMRI and PET
Cross-modal binding in auditory-visual speech perception was investigated by using the McGurk effect, a phenomenon in which hearing is altered by incongruent visual mouth movements. We used…
Adaptation of neuromagnetic N1 responses to phonetic stimuli by visual speech in humans
The technique of 306-channel magnetoencephalography was used in eight healthy volunteers to test whether silent lip-reading modulates auditory-cortex processing of phonetic sounds; the amplitudes of left-hemisphere N1 responses to the test stimuli were significantly suppressed.
Auditory-visual processing represented in the human superior temporal gyrus
Findings indicate that evoked responses recorded from area PLST to auditory speech stimuli are influenced significantly by the addition of visual images of the moving lower face and lips, either articulating the audible syllable or carrying out a meaningless (gurning) motion.
Seeing speech affects acoustic information processing in the human brainstem
It is shown that lipreading during speech perception influences early acoustic processing, which indicates considerable plasticity in early auditory processing.
Interaction between auditory and visual stimulus relating to the vowel sounds in the auditory cortex in humans: a magnetoencephalographic study
The findings indicated that vowel sound perception in the auditory cortex, at least in the primary processing stage, was not affected by viewing mouth movement.
Activation of the human auditory cortex by speech sounds.
  • R. Hari
  • Acta oto-laryngologica. Supplementum
  • 1991
It seems that MEG studies can be useful in the study of brain mechanisms underlying speech perception in intact humans, and suggest that the auditory system performs a very similar analysis of both speech signals and other sounds.

References

Showing 1–10 of 21 references
Visual influences on speech perception processes
Confirms and extends an earlier observation that visual information from the speaker's lip movements profoundly modifies the auditory perception of natural speech by normally hearing subjects.
Hearing lips and seeing voices
The study reported here demonstrates a previously unrecognised influence of vision upon speech perception, on being shown a film of a young woman's talking head in which repeated utterances of the syllable [ba] had been dubbed on to lip movements for [ga].
Convergence of auditory and visual stimuli on single cells in the primary visual cortex of unanesthetized unrestrained cats.
A statistically significant convergence of photic and acoustic responses shows a high degree of interaction between the auditory and visual pathways, resulting in convergence at the single-cell level in the primary visual cortex of unanesthetized, unrestrained cats.
Auditory-visual interaction in single cells in the cortex of the superior temporal sulcus and the orbital frontal cortex of the macaque monkey
The results indicate that neurons in both regions show auditory-visual interactions and that at least some of these interactions are due to convergence at the cortical cell.
Magnetic auditory responses from the human brain. A preliminary report.
The evoked magnetic field is widely distributed across the scalp and seems to be produced by an equivalent magnetic dipole located in or near the primary auditory cortex, although in the present experiment neither right–left hemisphere nor ipsi–contralateral differences could be demonstrated.
Cortical activity elicited by changes in auditory stimuli: different sources for the magnetic N100m and mismatch responses.
All mismatch fields, i.e., responses elicited by different deviant tones, as well as N100m to the standards and deviants, could be explained by neural activity in the supratemporal auditory cortex.
The Role of Vision in the Perception of Speech
  • B. Dodd
  • Perception
  • 1977
The ability of hearing adolescents to use speech-read information in order to report lists of CVC words presented in white noise was investigated, and the results indicated that vision provided a complementary source of information for the perception of speech.
Visual, auditory, and somatosensory convergence on cells in superior colliculus results in multisensory integration.
Data show that multisensory convergence provides the descending efferent cells of the SC with a dynamic response character, which can vary in response to the particular complex of stimuli present in the environment at any given moment.
Auditory and visual contributions to the perception of consonants.
Perceptual confusions of 16 consonant-vowel (CV) syllables were studied with normal-hearing adults under conditions of auditory-visual presentation at various signal-to-noise (S/N) ratios, as well ...
Polysensory properties of neurons in the anterior bank of the caudal superior temporal sulcus of the macaque monkey.
The sensory properties of cells in the anterior bank of the caudal part of the superior temporal sulcus (caudal STS) in anesthetized, paralyzed monkeys, tested with visual, auditory, and somesthetic stimuli, agreed with previous studies.