Do You See What I'm Saying? Interactions between Auditory and Visual Cortices in Cochlear Implant Users

@article{Zatorre2001DoYS,
  title={Do You See What I'm Saying? Interactions between Auditory and Visual Cortices in Cochlear Implant Users},
  author={Robert J. Zatorre},
  journal={Neuron},
  year={2001},
  volume={31},
  pages={13--14}
}
Visual Cortex Activity in Early and Late Blind People
  • H. Burton
  • Biology, Psychology
    The Journal of Neuroscience
  • 2003
Brain imaging studies describe visual cortex activity in blind people during nonvisual tasks such as Braille reading, hearing words, or sensory discriminations of tactile or auditory stimuli.
A Multisensory Cortical Network for Understanding Speech in Noise
TLDR
Using functional magnetic resonance imaging, it is shown that understanding speech-in-noise is supported by a network of brain areas including the left superior parietal lobule, the motor/premotor cortex, and the left anterior superior temporal sulcus (STS), a likely apex of the acoustic processing hierarchy.
The Effect of Visual Cues on Difficulty Ratings for Segregation of Musical Streams in Listeners with Impaired Hearing
TLDR
Simple visual cues may improve the ability of cochlear implant users to segregate lines of music, thus potentially increasing their enjoyment of music.
Adaptive changes in early and late blind: a FMRI study of verb generation to heard nouns.
TLDR
The results confirm the presence of adaptations in visual cortex of blind people but argue against the notion that this activity during Braille reading represents somatosensory (haptic) processing.
Dissociating cortical regions activated by semantic and phonological tasks: a FMRI study in blind and sighted people.
TLDR
The results confirmed the presence of adaptations in the visual cortex of blind people, suggested that these responses represent linguistic operations, and supported prior evidence of visual cortex activity in blind people engaged in auditory language processing.
Cognition in the hearing impaired and deaf as a bridge between signal and dialogue: a framework and a model
  • J. Rönnberg
  • Psychology
    International Journal of Audiology
  • 2003
TLDR
A working-memory framework is proposed for the cognitive involvement in language understanding (sign and speech) and four important parameters for language understanding are described in some detail: quality and precision of phonology, long-term memory access speed, degree of explicit processing, and general processing and storage capacity.
Balance sensory organization in children with profound hearing loss and cochlear implants.
Reading embossed capital letters: An fMRI study in blind and sighted individuals
TLDR
Results show cross-modal reorganization of visual cortex and altered response dynamics in nonvisual areas that plausibly reflect mechanisms for adaptive plasticity in blindness.
...

References

Activation of auditory cortex during silent lipreading.
TLDR
Three experiments suggest that the auditory cortical areas engaged during silent lipreading are not activated when an individual views nonlinguistic facial movements but are activated by silent, meaningless speechlike movements (pseudospeech), which supports psycholinguistic evidence that seen speech influences the perception of heard speech at a prelexical stage.
Hearing lips and seeing voices
TLDR
The study reported here demonstrates a previously unrecognised influence of vision upon speech perception: when adults were shown a film of a young woman's talking head in which repeated utterances of the syllable [ba] had been dubbed onto lip movements for [ga], they reported hearing a different syllable, [da].
Hearing in the Mind's Ear: A PET Investigation of Musical Imagery and Perception
TLDR
Both perceiving and imagining songs are associated with bilateral neuronal activity in the secondary auditory cortices, suggesting that processes within these regions underlie the phenomenological impression of imagined sounds.
Speech-like cerebral activity in profoundly deaf people processing signed languages: implications for the neural basis of human language.
TLDR
It is hypothesized that the neural tissue involved in language processing may not be prespecified exclusively by sensory modality but may entail polymodal neural tissue that has evolved unique sensitivity to aspects of the patterning of natural language.
Compensatory plasticity and sensory substitution in the cerebral cortex
Cross-modal reorganization of human cortical functions
The role of area 17 in visual imagery: convergent evidence from PET and rTMS.
TLDR
PET results showed that when patterns of stripes are visualized, Area 17 is activated, and rTMS results showed that such activation underlies the information processing.
Neural mechanisms underlying melodic perception and memory for pitch
TLDR
It is concluded that specialized neural systems in the right superior temporal cortex participate in perceptual analysis of melodies; pitch comparisons are effected via a neural network that includes right prefrontal cortex, but active retention of pitch involves the interaction of right temporal and frontal cortices.
...