Insensitivity to Fearful Emotion for Early ERP Components in High Autistic Tendency Is Associated with Lower Magnocellular Efficiency
Multisensory integration is ubiquitous, facilitating perception beyond the limits of the individual senses. This mechanism is especially salient when individual sensory inputs are weak (i.e., the principle of inverse effectiveness), fusing subthreshold cues into tangible percepts. Nevertheless, it is unclear how this principle applies to threat perception, in which elusive, discrete traces of a threat must be synthesized into a discernible danger signal. In light of hemispheric asymmetry in threat processing, we combined parafoveal stimulus presentation with the contralateral P1 visual event-related potential to investigate how aversive olfactory inputs enhance visual perception of highly degraded, subthreshold fearful expressions. The dominant right hemisphere exhibited early visual discrimination between subtle fearful and neutral expressions, independently of accompanying odors. In the left hemisphere, differential visual processing occurred only at the convergence of negative odors and minute facial fear, highlighting the success and necessity of visuo-olfactory threat integration in this disadvantaged hemisphere. Reaction time data from a subsequent dot-detection task complemented these neural findings, revealing odor-dependent and hemisphere-specific modulation of spatial attention to facial expressions. Our evidence thus indicates cross-modal threat integration in basic human visual perception that captures minimal threat information, especially in the "blind" right hemifield. Critically, this interaction between multisensory synergy and hemispheric asymmetry in threat perception may underlie the multifaceted fear experiences of everyday life.