Effects of lips and hands on auditory learning of second-language speech sounds.

@article{Hirata2010EffectsOL,
  title={Effects of lips and hands on auditory learning of second-language speech sounds},
  author={Yukari Hirata and Spencer D. Kelly},
  journal={Journal of Speech, Language, and Hearing Research},
  year={2010},
  volume={53},
  number={2},
  pages={298--310}
}
  • Y. Hirata, S. Kelly
  • Published 1 April 2010
  • Linguistics
  • Journal of Speech, Language, and Hearing Research (JSLHR)
PURPOSE: Previous research has found that auditory training helps native English speakers to perceive phonemic vowel length contrasts in Japanese, but their performance did not reach native levels after training. Given that multimodal information, such as lip movement and hand gesture, influences many aspects of native language processing, the authors examined whether multimodal input helps to improve native English speakers' ability to perceive Japanese vowel length contrasts. METHOD: Sixty…

Effects of hand gestures on auditory learning of second-language vowel length contrasts.
TLDR
All of the training types yielded similar auditory improvement in identifying vowel length contrast, but observing the syllabic-rhythm hand gesture yielded the most balanced improvement between word-initial and word-final vowels and between slow and fast speaking rates.
Can co-speech hand gestures facilitate learning of non-native tones?
Speech perception research has indicated that information from multiple input modalities (e.g., auditory, visual) facilitates second language (L2) speech learning. However, co-speech gestural
Effects of auditory, visual and gestural input on the perceptual learning of tones
Research has shown that audio-visual speech information facilitates second language (L2) speech learning, yet multiple input modalities including co-speech gestures show mixed results. While L2
Individual Differences in L2 Speech Perception: The Role of Phonological Memory and Lipreading Ability
The current study investigated the effects of individual differences in lipreading ability, working memory (WM), and phonological short-term memory (PSTM) on second language (L2) speech perception.
Effects of perceptual training on second language vowel perception and production
ABSTRACT This study investigates whether audiovisual training leads to greater improvement in perception and production than auditory training. The participants (n = 60) were American English native
Language Experience and Subjective Word Familiarity on the Multimodal Perception of Non-native Vowels
TLDR
A significant relationship was found between subjective word familiarity and AV and A-only (but not V-only) perception of non-native contrasts, and overall performance was better in the AV and A-only conditions for the two groups.
The Effect of Visual Articulatory Information on the Neural Correlates of Non-native Speech Sound Discrimination
TLDR
Results showed that audio-visual speech training reduced the latency of the mismatch negativity (MMN) event-related potential but did not affect MMN amplitude, which suggests a single session of audio-visual speech training does not lead to the formation of more discrete memory traces for non-native speech sounds.
Effect of Multimodal Training on the Perception of French Nasal Vowels
This study investigates the effects of multimodal training and the contribution of facial cues on the perception of French nasal vowels by learners of French. Sixty American English learners of
Relative Contribution of Auditory and Visual Information to Mandarin Chinese Tone Identification by Native and Tone-naïve Listeners
TLDR
Tone identification was found to vary across individual tones, with tone 3 (the low-dipping tone) the easiest to identify and tone 4 (the high-falling tone) the most difficult.
Exploring the role of hand gestures in learning novel phoneme contrasts and vocabulary in a second language
TLDR
The results suggest that hand gestures may not be well suited for learning novel phonetic distinctions at the syllable level within a word, and thus, gesture-speech integration may break down at the lowest levels of language processing and learning.

References

Showing 1–10 of 81 references
Bimodal Speech Perception by Native and Nonnative Speakers of English: Factors Influencing the McGurk Effect
This research provides a multidisciplinary perspective on the factors influencing the process of integrating auditory and visual information in speech perception, and the nature of the mental
Acquisition of second-language speech: Effects of visual cues, context, and talker variability
The influence of a talker's face (e.g., articulatory gestures) and voice, vocalic context, and word position were investigated in the training of Japanese and Korean English as a second language
Training English listeners to perceive phonemic length contrasts in Japanese.
TLDR
It is suggested that perceptual training improves non-native listeners' perception of Japanese length contrasts only to a limited extent and does not generalize to untrained contrast types.
Native and non‐native perception of phonemic length contrasts in Japanese: Effect of identification training and exposure
Japanese distinguishes between words by the presence or absence of several types of mora phonemes, often realized as a contrast in segment duration, e.g., /hato/ (pigeon) versus /hat:o/ (hat).
Training native English speakers to perceive Japanese length contrasts in word versus sentence contexts.
  • Y. Hirata
  • Linguistics, Psychology
    The Journal of the Acoustical Society of America
  • 2004
TLDR
An advantage of sentence training over word training was found: at the post-test, the difference between the scores of the two contexts was greater for the word-training group than for the sentence-training group, suggesting that learning in one context generalized to the other.
Training native English speakers to identify Japanese vowel length contrast with sentences at varied speaking rates.
TLDR
Native English speakers were trained to identify Japanese vowel length in three types of training differing in sentential speaking rate: slow-only, fast-only, and slow-fast, and effects for all three training types were found on the slow- and normal-rate test scores.
Linguistic experience and audio-visual perception of non-native fricatives.
TLDR
An integrated network in AV speech processing as a function of linguistic background is pointed to and evidence to extend auditory-based L2 speech learning theories to the visual domain is provided.
Cultural and linguistic factors in audiovisual speech processing: The McGurk effect in Chinese subjects
TLDR
The present study examined whether Chinese subjects would also show a reduced McGurk effect due to their cultural similarities with the Japanese, and showed that the magnitude of the McGurk effect depends on the length of time the Chinese subjects had lived in Japan.
Second-language spoken word identification: Effects of perceptual training, visual cues, and phonetic environment
  • D. Hardison
  • Psychology, Linguistics
    Applied Psycholinguistics
  • 2005
Experiments using the gating paradigm investigated the effects of auditory–visual (AV) and auditory-only perceptual training on second-language spoken-word identification by Japanese and Korean