Neural correlates of cross-modal binding

  • Khalafalla O. Bushara, Takashi Hanakawa, Ilka Immisch, Keiichiro Toma, Kenji Kansaku, Mark Hallett
    Nature Neuroscience

Little is known about how the brain binds together signals from multiple sensory modalities to produce unified percepts of objects and events in the external world. Using event-related functional magnetic resonance imaging (fMRI) in humans, we measured transient brain responses to auditory/visual binding, as evidenced by a sound-induced change in visual motion perception. Identical auditory and visual stimuli were presented in all trials, but in some trials they were perceived to be bound…

Auditory thalamus integrates visual inputs into behavioral gains

In the rat thalamus, visual cues are found to influence auditory responses, which have two distinct components: an early phasic one followed by a late gradual buildup that peaks before reward.

Auditory processing in the posterior parietal cortex.

The authors review recent studies of how neurons in the lateral intraparietal area (area LIP) differentially process auditory and visual stimuli, and suggest that area LIP contains a representation that differs across modalities and is strongly shaped by behavioral context.

Integration of visual and tactile signals from the hand in the human brain: an FMRI study.

These results identify a set of candidate frontal, parietal and subcortical regions that integrate visual and tactile information for the multisensory perception of one's own hand.

Perceptual Fusion and Stimulus Coincidence in the Cross-Modal Integration of Speech

In an event-related functional magnetic resonance imaging study, neural systems that evaluate the cross-modal coincidence of physical stimuli are differentiated from those that mediate perceptual binding.

Audio-Visual Perception of Everyday Natural Objects - Hemodynamic Studies in Humans

This chapter summarizes and compares results from 49 paradigms published over the past decade that have explicitly examined human brain regions associated with audio-visual interactions; the meta-analysis results are discussed in the context of cognitive theories of how object knowledge representations may mesh with the multiple parallel pathways that appear to mediate audio-visual perception.

Cross-modal binding and activated attentional networks during audio-visual speech integration: a functional MRI study.

The results indicate a close relationship between cross-modal attentional control and cross-modal binding during speech reading; the posterior parietal cortex showed more activation during concordant than discordant stimuli, and hence was related to cross-modal binding.

Multimodal activity in the parietal cortex

  • Y. Cohen
  • Biology, Psychology
    Hearing Research
  • 2009

Multisensory Processing in Sensory-Specific Cortical Areas

  • E. Macaluso
  • Psychology, Biology
    The Neuroscientist: A Review Journal Bringing Neurobiology, Neurology and Psychiatry
  • 2006
Recent data are discussed indicating that the integration of multisensory signals relies not only on anatomical convergence from sensory-specific cortices to multisensory brain areas but also on reciprocal influences between cortical regions that are traditionally considered sensory-specific.

Neural Correlates of Auditory–Visual Stimulus Onset Asynchrony Detection

It is proposed that the insula, via its known short-latency connections with the tectal system, mediates temporally defined auditory–visual interaction at an early stage of cortical processing permitting phenomena such as the ventriloquist and the McGurk illusions.

Detection of Audio-Visual Integration Sites in Humans by Application of Electrophysiological Criteria to the BOLD Effect

The efficacy of using an analytic approach informed by electrophysiology to identify multisensory integration sites in humans is demonstrated and the particular network of brain areas implicated in these crossmodal integrative processes are suggested to be dependent on the nature of the correspondence between the different sensory inputs.

Cross-modal and cross-temporal association in neurons of frontal cortex

It is concluded that prefrontal cortex neurons are part of integrative networks that represent behaviourally meaningful cross-modal associations and are crucial for the temporal transfer of information in the structuring of behaviour, reasoning and language.

Modulation of human visual cortex by crossmodal spatial attention.

Analysis of effective connectivity between brain areas suggests that touch influences unimodal visual cortex via back-projections from multimodal parietal areas, which provides a neural explanation for crossmodal links in spatial attention.

Event-Related fMRI: Characterizing Differential Responses

This paper focuses on bilateral ventrolateral prefrontal responses in functional magnetic resonance imaging, evoked by different sorts of stimuli, that show deactivations for previously seen words and activations for novel words.

The Mind's Eye—Precuneus Activation in Memory-Related Imagery

Memory-related imagery was associated with significant activation of a medial parietal area, the precuneus, providing strong evidence that it is a key part of the neural substrate of visual imagery occurring in conscious memory recall.

Neural mechanisms for synthesizing sensory information and producing adaptive behaviors

  • B. Stein
  • Biology, Psychology
    Experimental Brain Research
  • 1998
The most extensive physiological observations have been made in the cat, and in this species the same principles that have been shown to govern multisensory integration at the level of the individual SC neuron have also been shown to govern overt attentive and orientation responses to multisensory stimuli.

The cost of expecting events in the wrong sensory modality

The results show that stimulus-driven and expectancy-driven effects must be distinguished in studies of attending to different sensory modalities.

Processing of auditory stimuli during auditory and visual attention as revealed by event-related potentials.

Mismatch negativities (MMNs) were elicited by 1300-Hz and 1050-Hz deviants irrespective of whether they occurred among attended or unattended tones, supporting the proposal that the MMN is generated by an automatic cerebral discrimination process.

Enhancement of selective listening by illusory mislocation of speech sounds due to lip-reading

This work shows that crossmodal matching can lead to an illusion whereby sounds are mislocated at their apparent visual source, which can enhance selective spatial attention to speech sounds.