Seeing sounds: visual and auditory interactions in the brain
David A. Bulkin and Jennifer M. Groh
Current Opinion in Neurobiology
Processing Streams in Auditory Cortex
Besides processing of space and motion, the dorsal stream also participates in other important forms of audio-motor behavior, including sensorimotor control and integration for speech and music in humans.
Visual influences on auditory spatial learning
  • A. King
  • Psychology, Biology
    Philosophical Transactions of the Royal Society B: Biological Sciences
  • 2008
Early multisensory experience appears to be crucial for the emergence of an ability to match signals from different sensory modalities and therefore for the outcome of audiovisual-based rehabilitation of deaf patients in whom hearing has been restored by cochlear implantation.
Locating the sources for cross-modal interactions and decision making during judging the visual-affected auditory intensity change
Behavioral results demonstrated that incongruent audiovisual change could result in the illusory perception of the change in sound intensity, with the involvement of insula, agranular retrolimbic, premotor cortex and caudate nucleus.
Audio-Visual Perception of Everyday Natural Objects - Hemodynamic Studies in Humans
This chapter summarizes and compares results from 49 paradigms published over the past decade that have explicitly examined human brain regions associated with audio-visual interactions, and meta-analysis results are discussed in the context of cognitive theories regarding how object knowledge representations may mesh with the multiple parallel pathways that appear to mediate audio-visual perception.
The ERP Signal Analysis of Visual Influences on Auditory Intensity Perception
Previous studies have reported supramodal interactions across different sensory modalities. The aim of this paper is to investigate the influence of visual spatial information on auditory intensity perception.
Visuo-auditory interactions in the primary visual cortex of the behaving monkey: Electrophysiological evidence
The data show that single neurons from a primary sensory cortex such as V1 can integrate sensory information of a different modality, a result that argues against a strict hierarchical model of multisensory integration.
Visual Activation and Audiovisual Interactions in the Auditory Cortex during Speech Perception: Intracranial Recordings in Humans
These findings demonstrate that audiovisual speech integration does not respect the classical hierarchy from sensory-specific to associative cortical areas, but rather engages multiple cross-modal mechanisms at the first stages of nonprimary auditory cortex activation.
Chapter 2: Processing Streams in Auditory Cortex
In Y.E. Cohen et al. (eds.), Neural Correlates of Auditory Cognition, Springer Handbook of Auditory Research 45. Springer Science+Business Media, New York, 2013. DOI: 10.1007/978-1-4614-2350-8_2
Poorer auditory sensitivity is related to stronger visual enhancement of the human auditory mismatch negativity (MMNm)
Findings show that discrimination of even a sensory-specific feature such as pitch is facilitated by multisensory information at a pre-attentive level, and they highlight the importance of considering inter-individual differences in uni-sensory abilities when assessing multisensory processing.


Gated Visual Input to the Central Auditory System
It is shown that strong visual responses, which are appropriate to guide auditory plasticity, appear in the ICX when inhibition is blocked in the optic tectum, indicating that visual spatial information is gated into the auditory system by an inhibitory mechanism that operates at a higher level in the brain.
Illusions: What you see is what you hear
It is shown that auditory information can qualitatively alter the perception of an unambiguous visual stimulus to create a striking visual illusion, indicating that visual perception can be manipulated by other sensory modalities.
The spread of attention across modalities and space in a multisensory object.
An investigation of whether visual attention can modulate neural responses to other components of a multisensory object, defined by synchronous but spatially disparate auditory and visual stimuli, found that the brain's response to task-irrelevant sounds occurring synchronously with a visual stimulus from a different location was larger when the accompanying visual stimulus was attended than when it was unattended.
Bimodal speech: early suppressive visual effects in human auditory cortex
Bimodal syllables were identified more rapidly than auditory-alone stimuli, and the latency of the effect indicates that integration operates at pre-representational stages of stimulus analysis, probably via feedback projections from visual and/or polymodal areas.
Visual stimuli activate auditory cortex in the deaf
It is found that deaf subjects exhibit activation in a region of the right auditory cortex, corresponding to Brodmann's areas 42 and 22, as well as in area 41 (primary auditory cortex), demonstrating that early deafness results in the processing of visual stimuli in auditory cortex.
Auditory influences on visual temporal rate perception.
  • G. Recanzone
  • Psychology, Biology
    Journal of neurophysiology
  • 2003
The results show that the auditory system can strongly influence visual perception and are consistent with the idea that bimodal sensory conflicts are dominated by the sensory system with the greater acuity for the stimulus parameter being discriminated.
Sensorimotor integration in the primate superior colliculus. II. Coordinates of auditory signals.
The present experiment tested two alternative hypotheses concerning the frame of reference of auditory signals found in the deeper layers of the superior colliculus: whether, with the head stationary, the responses of auditory neurons are unaffected by variations in eye position and determined solely by the location of the sound source, or whether they vary with eye position.
Multisensory Integration of Dynamic Faces and Voices in Rhesus Monkey Auditory Cortex
It is demonstrated unequivocally that the primate auditory cortex integrates facial and vocal signals through enhancement and suppression of field potentials in both the core and lateral belt regions of the auditory cortex in awake behaving rhesus monkeys.
Eye-centered, head-centered, and complex coding of visual and auditory targets in the intraparietal sulcus.
The results are interpreted as suggesting that although the representations of space in areas LIP and MIP are not easily described within the conventional conceptual framework of reference frames, they nevertheless process visual and auditory spatial information in a similar fashion.