Multisensory auditory-visual interactions during early sensory processing in humans: a high-density electrical mapping study.

@article{Molholm2002MultisensoryAI,
  title={Multisensory auditory-visual interactions during early sensory processing in humans: a high-density electrical mapping study.},
  author={S. Molholm and W. Ritter and M. Murray and D. Javitt and C. Schroeder and J. Foxe},
  journal={Brain Research. Cognitive Brain Research},
  year={2002},
  volume={14},
  number={1},
  pages={115--128}
}
Integration of information from multiple senses is fundamental to perception and cognition, but when and where this is accomplished in the brain is not well understood. This study examined the timing and topography of cortical auditory-visual interactions using high-density event-related potentials (ERPs) during a simple reaction-time (RT) task. Visual and auditory stimuli were presented alone and simultaneously. ERPs elicited by the auditory and visual stimuli when presented alone were summed …
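The truncated abstract describes the standard additive-model test for multisensory interactions: the ERP evoked by the simultaneous auditory-visual stimulus (AV) is compared against the sum of the ERPs evoked by each stimulus alone (A + V), and reliable differences are taken as evidence of early interaction. Below is a minimal sketch of that contrast on synthetic epoch arrays; the channel count, epoch counts, and array layout are assumptions chosen for illustration rather than details from the paper.

```python
import numpy as np

# Illustrative dimensions (assumed for this sketch, not taken from the paper):
# 64 scalp channels, 500 samples per epoch, 100 epochs per condition.
n_channels, n_times, n_epochs = 64, 500, 100
rng = np.random.default_rng(0)

# Synthetic single-trial epochs for each condition: (channels, times, epochs).
epochs_a = rng.normal(0.0, 1.0, (n_channels, n_times, n_epochs))   # auditory alone
epochs_v = rng.normal(0.0, 1.0, (n_channels, n_times, n_epochs))   # visual alone
epochs_av = rng.normal(0.0, 1.0, (n_channels, n_times, n_epochs))  # simultaneous AV

# Average across epochs to obtain the condition ERPs: (channels, times).
erp_a = epochs_a.mean(axis=2)
erp_v = epochs_v.mean(axis=2)
erp_av = epochs_av.mean(axis=2)

# Additive-model contrast: AV - (A + V).
# If audition and vision were processed independently, this difference wave
# would be flat; deviations at a given electrode and latency index a
# putative auditory-visual interaction.
interaction = erp_av - (erp_a + erp_v)

print(interaction.shape)  # (64, 500)
```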
Cross-modal processing of auditory–visual stimuli in a no-task paradigm: A topographic event-related potential study
TLDR: The study provides evidence that several patterns of cross-modal interactions can be generated even when no task is required of subjects, and can be utilized for studying the maturation of cross-modal processes in young children and in children with pathological development.
Early Cross-Modal Interactions in Auditory and Visual Cortex Underlie a Sound-Induced Visual Illusion
TLDR: Evidence is provided that perception of the illusory second flash is based on a very rapid dynamic interplay between auditory and visual cortical areas that is triggered by the second sound.
Multisensory visual-auditory object recognition in humans: a high-density electrical mapping study.
Multisensory object-recognition processes were investigated by examining the combined influence of visual and auditory inputs upon object identification--in this case, pictures and vocalizations of …
Multisensory processing and oscillatory gamma responses: effects of spatial selective attention
TLDR: An EEG study investigating the interactions between multisensory (audio-visual) integration and spatial attention, using oscillatory gamma-band responses (GBRs), found that attention particularly affects the integrative processing of audiovisual stimuli at early latencies.
Ipsilateral visual and auditory spatial information interaction mechanisms in selective attention conditions: Behavioral and ERP study
We used event-related potentials (ERPs) to evaluate the neural mechanism by which auditory spatial information affects audiovisual integration in a visual attention task. Some previous studies showed …
Multisensory processes of auditory–visual stimuli onset asynchrony in human cortex
Although previous studies in the animal superior colliculus showed several multisensory integrative principles, the extent to which these multisensory mechanisms generalize to cortical processes …
Modality shift effects mimic multisensory interactions: an event-related potential study
TLDR: The influence of modality shift effects (MSEs) on auditory-visual interactions is tested by comparing the results of AV − (A + V) using (a) all stimuli and (b) only ipsimodal stimuli, and it is formally and empirically demonstrated that (T + TAV) − (TA + TV) is robust against possible biases due to the MSE.
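For context, the two contrasts named in this summary can be written out explicitly. The first is the standard additive-model test also used in the main study above; the second is the MSE-robust contrast transcribed as given, where T appears to denote a condition involving a third stimulus type, so that each side of the comparison contains the same number of conditions and any bias common to all conditions (such as a modality shift or anticipation effect) cancels. The reading of the T-prefixed terms is an assumption, since the truncated summary does not define them.

```latex
% Standard additive-model interaction term; a non-zero difference wave
% indicates an auditory-visual interaction:
\mathrm{ERP}_{AV} - \bigl(\mathrm{ERP}_{A} + \mathrm{ERP}_{V}\bigr)

% MSE-robust contrast as quoted in the summary (T-notation transcribed as
% given; each side contains two conditions, so a bias common to every
% condition contributes equally to both sides and cancels):
\bigl(\mathrm{ERP}_{T} + \mathrm{ERP}_{TAV}\bigr) - \bigl(\mathrm{ERP}_{TA} + \mathrm{ERP}_{TV}\bigr)
```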
Early modulation of visual cortex by sound: an MEG study
TLDR: Results indicate that the auditory alteration of visual perception as reflected by the illusion is associated with modulation of activity in visual cortex, and suggest that a feed-forward or lateral circuitry is at least partially involved in these interactions.
Intracranial Cortical Responses during Visual–Tactile Integration in Humans
TLDR: Results tend to support earlier concepts of multisensory integration as relatively late and centered in tertiary multimodal association cortices, and suggest two neurophysiologically distinct and temporally separated integration mechanisms in TPOJ.
Visual modulation of neurons in auditory cortex.
TLDR: It is demonstrated that visual stimuli can modulate the firing of neurons in auditory cortex in a manner that depends on stimulus efficacy and timing, and these neurons meet the criteria for sensory integration and provide the auditory modality with multisensory contextual information about co-occurring environmental events.