• Corpus ID: 142698846

Signal compatibility as a modulatory factor for audiovisual multisensory integration.

  • Cesare V. Parise
  • Published 1 September 2013
  • Psychology
The physical properties of the signals activating our senses are often correlated in nature; it would therefore be advantageous to exploit such correlations to better process sensory information. Stimulus correlations can be contingent and readily available to the senses (like the temporal correlation between mouth movements and vocal sounds in speech), or can be the result of the statistical co-occurrence of certain stimulus properties that can be learnt over time (like the relation between… 
2 Citations
Audio-visual synchrony and spatial attention enhance processing of dynamic visual stimulation independently and in parallel: a frequency-tagging study
Unique patterns of effects on pulse‐ and flicker‐driven SSRs indicated that spatial attention and audiovisual synchrony facilitated early visual processing in parallel and via different cortical processes, with attention effects resembling the classical top‐down gain effect, facilitating both flicker‐ and pulse‐driven SSRs.


Audiovisual multisensory integration
Over the last 50 years or so, a large body of empirical research has demonstrated the importance of a variety of low-level spatiotemporal factors in the multisensory integration of auditory and… 
Facilitation of multisensory integration by the "unity effect" reveals that speech is special.
The unity effect only influenced participants' performance for the speech stimuli; no effect was observed for monkey vocalizations or for the human imitations of monkey calls, suggesting that the facilitation of multisensory integration by the unity effect is specific to human speech signals.
Natural cross-modal mappings between visual and auditory features.
In a series of speeded classification tasks, spontaneous mappings between the auditory feature of pitch and the visual features of vertical location, size, and spatial frequency are found but not contrast.
Stroop Interference Based on the Synaesthetic Qualities of Auditory Pitch
The results confirm that synaesthetic qualities of pitch are rapidly and automatically encoded and that the products of this encoding automatically interact with the mechanisms responsible for identifying word meaning and/or with the post-identification decision processes.
Cross-Modal Processing in Early Visual and Auditory Cortices depends on Expected Statistical Relationship of Multisensory Information
This work shows that the sign of cross-modal interactions depends on whether the content of two modalities is associated or not, and illustrates an ecologically optimal flexibility of the neural mechanisms that govern multisensory processing.
Recalibration of audiovisual simultaneity
It is reported that after exposure to a fixed audiovisual time lag for several minutes, human participants showed shifts in their subjective simultaneity responses toward that particular lag, suggesting that the brain attempts to adjust subjective simultaneity across different modalities by detecting and reducing time lags between inputs that likely arise from the same physical events.
The ventriloquist in motion: illusory capture of dynamic information across sensory modalities.
The results of the present study show that the perceived direction of auditory apparent motion is strongly modulated by apparent motion in vision, and that both spatial and temporal factors play a significant role in this crossmodal effect.
Semantics and the multisensory brain: How meaning modulates processes of audio-visual integration
This paper reviews work addressing the question of which tasks show an influence of semantic factors, examines which cortical networks are most likely to mediate these effects, and investigates which cortical regions are particularly responsive to experimental variations of content.
Synesthetic congruency modulates the temporal ventriloquism effect
It is demonstrated that the synesthetic congruency between the auditory and visual stimuli (in particular, between the relative pitch of the sounds and the relative size of the visual stimuli) can modulate the magnitude of this multisensory integration effect.
Neural Correlates of Auditory–Visual Stimulus Onset Asynchrony Detection
It is proposed that the insula, via its known short-latency connections with the tectal system, mediates temporally defined auditory–visual interaction at an early stage of cortical processing, permitting phenomena such as the ventriloquist and McGurk illusions.