Automatic audiovisual integration in speech perception

@article{Gentilucci2005AutomaticAI,
  title={Automatic audiovisual integration in speech perception},
  author={Maurizio Gentilucci and Luigi Cattaneo},
  journal={Experimental Brain Research},
  year={2005},
  volume={167},
  pages={66--75}
}
Two experiments aimed to determine whether features of both the visual and acoustical inputs are always merged into the perceived representation of speech, and whether this audiovisual integration is based either on cross-modal binding functions or on imitation. In a McGurk paradigm, observers were required to repeat aloud a string of phonemes uttered by an actor (acoustical presentation of phonemic string) whose mouth, in contrast, mimicked pronunciation of a different string (visual…
Citations

Imitation during phoneme production
The data suggest that features of both the lip kinematics and the voice spectra tend to be automatically imitated when repeating a string of phonemes presented by a visible and/or audible speaking interlocutor.
Animated virtual characters to explore audio-visual speech in controlled and naturalistic environments
The results demonstrate that computer-generated speech stimuli are a judicious choice, supplementing natural speech with greater control over stimulus timing and content.
Converging toward a common speech code: imitative and perceptuo-motor recalibration processes in speech production
In a non-interactive communication situation, both unintentional and voluntary online imitative changes in relevant acoustic features of vowel targets are demonstrated during speech production and imitation, suggesting that speech production continuously draws on perceptual learning from the external speech environment.
Multisensory Interactions in Speech Perception
The perception of someone talking provides correlated input to more than one sensory modality (mainly vision and audition) simultaneously. There are many everyday situations (such as face-to-face…
Shared and distinct neural correlates of vowel perception and production
Recent neurobiological models postulate that sensorimotor interactions play a key role in speech perception and speech motor control, especially under adverse listening conditions or in case of…
Silent articulation modulates auditory and audiovisual speech perception
It is shown that silently articulating a syllable in synchrony with the presentation of a concordant, auditorily and/or visually ambiguous speech stimulus improves its identification, and that, even in the case of perfect perceptual identification, concurrent mouthing of a syllable speeds up the perceptual processing of a concordant speech stimulus.
Reduced audiovisual integration in synesthesia – evidence from bimodal speech perception.
The results indicate that, rather than having a hypersensitive binding mechanism, synesthetes show weaker integration of vision and audition – the opposite of the hypothesized direction.
The McGurk Effect In Relation to Musicians' Abilities
The original study of the McGurk Effect, a perceptual phenomenon caused by contradictory audiovisual stimuli fusing together to create the illusion of a third sound, was carried out by psychologists…
An investigation of audiovisual speech perception using the McGurk effect
By Debshila Basu Mallick. Integrating information from the auditory and visual modalities is vital for speech perception. In…
On the tip of the tongue: Modulation of the primary motor cortex during audiovisual speech perception
Exposure to syllables incorporating visual and/or acoustic tongue-related phonemes induced greater excitability of the left tongue primary motor cortex as early as 100–200 ms after the consonantal onset of the acoustically presented syllable, providing evidence that both visual and auditory modalities specifically modulate activity in the tongue primary motor cortex at an early stage of audiovisual speech perception.

References

Showing 1–10 of 39 references
Mandarin speech perception by ear and eye follows a universal principle
The results are consistent with the idea that although there may be differences in information, the underlying nature of audiovisual speech processing is similar across languages.
Auditory-visual speech perception examined by fMRI and PET
Cross-modal binding in auditory-visual speech perception was investigated by using the McGurk effect, a phenomenon in which hearing is altered by incongruent visual mouth movements. We used…
Inter-language differences in the influence of visual cues in speech perception.
This paper reports on inter-language or cultural differences in audio-visual speech perception. When presented an audio signal dubbed onto a video recording of a talking face producing an incongruent…
Reading Speech from Still and Moving Faces: The Neural Substrates of Visible Speech
Speech is perceived both by ear and by eye. Unlike heard speech, some seen speech gestures can be captured in stilled image sequences. Previous studies have shown that in hearing people, natural…
Hearing by eye 2: advances in the psychology of speechreading and auditory-visual speech
Part 1, Audiovisual speech processing – implications for theories of speech perception: the use of auditory and visual information during phonetic processing – implications for theories of speech…
The motor theory of speech perception revised
A motor theory of speech perception, initially proposed to account for results of early experiments with synthetic speech, is now extensively revised to accommodate recent findings, and to relate the…
Lipreading and audio-visual speech perception.
Q. Summerfield, Philosophical Transactions of the Royal Society of London, Series B: Biological Sciences, 1992.
Progress in understanding the psychology of lipreading and audio-visual speech perception is reviewed, including how the sights and sounds of talking faces are represented at their conflux.
Hearing lips and seeing voices
The study reported here demonstrates a previously unrecognised influence of vision upon speech perception, observed when subjects were shown a film of a young woman's talking head in which repeated utterances of the syllable [ba] had been dubbed onto lip movements for [ga].
Execution and observation of bringing a fruit to the mouth affect syllable pronunciation
The results of the present study suggest that the execution and observation of the bringing‐to‐the‐mouth action activate a mouth articulation posture (probably due to the act of food manipulation with the mouth) which selectively influences speech production.
Activation of auditory cortex during silent lipreading.
Three experiments suggest that these auditory cortical areas are not engaged when an individual is viewing nonlinguistic facial movements but appear to be activated by silent meaningless speechlike movements (pseudospeech), which supports psycholinguistic evidence that seen speech influences the perception of heard speech at a prelexical stage.