Neural correlates of cross-modal binding

@article{Bushara2003NeuralCO,
  title={Neural correlates of cross-modal binding},
  author={Khalafalla O. Bushara and Takashi Hanakawa and Ilka Immisch and Keiichiro Toma and Kenji Kansaku and Mark Hallett},
  journal={Nature Neuroscience},
  year={2003},
  volume={6},
  pages={190--195}
}
Little is known about how the brain binds together signals from multiple sensory modalities to produce unified percepts of objects and events in the external world. Using event-related functional magnetic resonance imaging (fMRI) in humans, we measured transient brain responses to auditory/visual binding, as evidenced by a sound-induced change in visual motion perception. Identical auditory and visual stimuli were presented in all trials, but in some trials they were perceived to be bound…
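Because the stimuli were identical across trials, the analysis reduces to sorting trials post hoc by the reported percept and contrasting bound against unbound events in an event-related general linear model. The Python sketch below illustrates that logic only; it is not the authors' pipeline, and the HRF shape, onset times, and data are illustrative assumptions:

import numpy as np
from scipy.stats import gamma

TR, n_scans = 2.0, 200                 # assumed repetition time and run length
t = np.arange(n_scans) * TR

def hrf(t):
    # Canonical double-gamma haemodynamic response (SPM-like shape).
    return gamma.pdf(t, 6) - gamma.pdf(t, 16) / 6.0

def regressor(onsets, t):
    # Delta train at event onsets, convolved with the HRF.
    stick = np.zeros_like(t)
    stick[np.searchsorted(t, onsets)] = 1.0
    return np.convolve(stick, hrf(t))[: len(t)]

# Same audiovisual stimulus on every trial; trials are sorted by percept.
bound_onsets = [20, 88, 156, 240]      # sound changed perceived visual motion
unbound_onsets = [54, 122, 198, 310]   # no perceptual binding reported

X = np.column_stack([
    regressor(bound_onsets, t),
    regressor(unbound_onsets, t),
    np.ones(n_scans),                  # constant baseline term
])

y = np.random.randn(n_scans)           # stand-in for one voxel's BOLD series
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
binding_effect = beta[0] - beta[1]     # bound-minus-unbound contrast

Because each trial contributes its own HRF-shaped regressor rather than a sustained block, the fitted contrast captures exactly the transient, event-related responses the abstract describes.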

Citations

Auditory thalamus integrates visual inputs into behavioral gains
TLDR
It is found that, in the rat thalamus, visual cues influence auditory responses, which have two distinct components: an early phasic one followed by a late gradual buildup that peaks before reward.
Auditory processing in the posterior parietal cortex.
TLDR
The authors review recent studies that have focused on how neurons in the lateral intraparietal area (area LIP) differentially process auditory and visual stimuli and suggest that area LIP contains a modality-dependent representation that is highly dependent on behavioral context.
Integration of visual and tactile signals from the hand in the human brain: an FMRI study.
TLDR
These results identify a set of candidate frontal, parietal and subcortical regions that integrate visual and tactile information for the multisensory perception of one's own hand.
Anterior insula activations in perceptual paradigms: often observed but barely understood
TLDR
It is concluded that anterior insular cortex may be associated with perception in that it underpins heightened alertness of either stimulus- or task-driven origin, or both, and could integrate endogenous and exogenous functional demands under the joint criterion of whether they challenge an individual’s homeostasis.
Perceptual Fusion and Stimulus Coincidence in the Cross-Modal Integration of Speech
TLDR
In an event-related functional magnetic resonance imaging study, neural systems that evaluate cross-modal coincidence of the physical stimuli are differentiated from those that mediate perceptual binding.
Discrete neural substrates underlie complementary audiovisual speech integration processes
TLDR
Using fMRI, this work identified two anatomically distinct brain regions in the superior temporal cortex, one involved in processing temporal synchrony and one in processing perceptual fusion of audiovisual speech, which suggests that the superior temporal cortex should be considered a "neuronal hub" composed of multiple discrete subregions that underlie an array of complementary low- and high-level multisensory integration processes.
Cross-modal processing of auditory–visual stimuli in a no-task paradigm: A topographic event-related potential study
TLDR
The study provides evidence that several patterns of cross-modal interactions can be generated even if no task is required from subjects, and can be utilized for studying the maturation of cross-modal processes in young children and in children with pathological development.
Audio-Visual Perception of Everyday Natural Objects - Hemodynamic Studies in Humans
TLDR
This chapter summarizes and compares results from 49 paradigms published over the past decade that have explicitly examined human brain regions associated with audio-visual interactions, and meta-analysis results are discussed in the context of cognitive theories regarding how object knowledge representations may mesh with the multiple parallel pathways that appear to mediate audio-visual perception.
Cross-modal binding and activated attentional networks during audio-visual speech integration: a functional MRI study.
TLDR
The results indicate a close relationship between cross-modal attentional control and cross-modal binding during speech reading; the posterior parietal cortex showed more activation during concordant than discordant stimuli, and hence was related to cross-modal binding.
Multimodal activity in the parietal cortex
  • Y. Cohen
  • Medicine, Psychology
  • Hearing Research
  • 2009
TLDR
Studies that have focused on how neurons in the lateral intraparietal area (area LIP) differentially process auditory and visual stimuli suggest that area LIP contains a modality-dependent representation that is highly dependent on behavioral context.
...

References

Showing 1–10 of 19 references
Neural Correlates of Auditory–Visual Stimulus Onset Asynchrony Detection
TLDR
It is proposed that the insula, via its known short-latency connections with the tectal system, mediates temporally defined auditory–visual interaction at an early stage of cortical processing, permitting phenomena such as the ventriloquist and McGurk illusions.
Detection of Audio-Visual Integration Sites in Humans by Application of Electrophysiological Criteria to the BOLD Effect
TLDR
The efficacy of using an analytic approach informed by electrophysiology to identify multisensory integration sites in humans is demonstrated, and the particular network of brain areas implicated in these crossmodal integrative processes is suggested to depend on the nature of the correspondence between the different sensory inputs.
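The electrophysiological criterion carried over to BOLD in this reference is superadditivity: a region counts as integrative only when its audiovisual response exceeds the sum of the unisensory responses. A minimal sketch of that test on per-voxel beta estimates, with random placeholders standing in for fitted GLM values:

import numpy as np

rng = np.random.default_rng(0)
# Stand-ins for betas fitted to separate auditory, visual, and
# audiovisual regressors across 10,000 voxels.
beta_a, beta_v, beta_av = rng.normal(size=(3, 10_000))

# Superadditivity: AV > A + V. A real analysis would additionally
# require the difference to survive a voxelwise statistical threshold.
candidates = np.flatnonzero(beta_av > beta_a + beta_v)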
Cross-modal and cross-temporal association in neurons of frontal cortex
TLDR
It is concluded that prefrontal cortex neurons are part of integrative networks that represent behaviourally meaningful cross-modal associations and are crucial for the temporal transfer of information in the structuring of behaviour, reasoning and language.
Modulation of human visual cortex by crossmodal spatial attention.
TLDR
Analysis of effective connectivity between brain areas suggests that touch influences unimodal visual cortex via back-projections from multimodal parietal areas, which provides a neural explanation for crossmodal links in spatial attention.
Event-Related fMRI: Characterizing Differential Responses
TLDR
This paper characterizes differential functional magnetic resonance imaging responses evoked by different sorts of stimuli, focusing on bilateral ventrolateral prefrontal responses that show deactivations for previously seen words and activations for novel words.
The Mind's Eye—Precuneus Activation in Memory-Related Imagery
TLDR
Memory-related imagery was associated with significant activation of a medial parietal area, the precuneus, which provides strong evidence that it is a key part of the neural substrate of visual imagery occurring in conscious memory recall.
Neural mechanisms for synthesizing sensory information and producing adaptive behaviors
  • B. Stein
  • Psychology, Medicine
  • Experimental Brain Research
  • 1998
TLDR
The most extensive physiological observations have been made in cat, and in this species the same principles that have been shown to govern multisensory integration at the level of the individual SC neuron have also been shown to govern overt attentive and orientation responses to multisensory stimuli.
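The single-neuron principles referred to here are commonly quantified with the multisensory enhancement index, which compares the combined-modality response against the strongest unisensory one. A small illustrative helper using that standard formula (the spike counts are made up):

def enhancement_index(multi, uni_a, uni_b):
    # Percent enhancement: 100 * (CM - SMmax) / SMmax, where CM is the
    # combined-modality response and SMmax the best single-modality one.
    sm_max = max(uni_a, uni_b)
    return 100.0 * (multi - sm_max) / sm_max

# 18 spikes to the combined stimulus versus 10 and 6 spikes unisensory
# gives 80% enhancement; it is also superadditive, since 18 > 10 + 6.
print(enhancement_index(18, 10, 6))    # 80.0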
The cost of expecting events in the wrong sensory modality
TLDR
The results show that stimulus-driven and expectancy-driven effects must be distinguished in studies of attending to different sensory modalities.
Processing of auditory stimuli during auditory and visual attention as revealed by event-related potentials.
TLDR
Mismatch negativities (MMNs) were elicited by 1300-Hz and 1050-Hz deviants irrespective of whether they occurred among attended or unattended tones, supporting the proposal that the MMN is generated by an automatic cerebral discrimination process.
Enhancement of selective listening by illusory mislocation of speech sounds due to lip-reading
TLDR
This work shows that crossmodal matching can lead to an illusion, whereby sounds are mislocated at their apparent visual source, which can enhance selective spatial attention to speech sounds.
...