Neural computation as a tool to differentiate perceptual from emotional processes: The case of anger superiority effect

@article{Mermillod2009NeuralCA,
  title={Neural computation as a tool to differentiate perceptual from emotional processes: The case of anger superiority effect},
  author={Martial Mermillod and Nicolas Vermeulen and Daniel Lundqvist and Paula M. Niedenthal},
  journal={Cognition},
  year={2009},
  volume={110},
  pages={346--357}
}

Citations of this paper

Anger superiority effect: The importance of dynamic emotional facial expressions
A rapid response to a threatening face in a crowd is important for successful interaction in social environments. Visual search tasks have been employed to determine whether there is a processing advantage for angry faces.
Desperately seeking friends: How expectation of punishment modulates attention to angry and happy faces
In the literature, a well-known processing advantage for angry schematic faces has been widely observed in the "Face in the Crowd" (FIC) visual search task. A debate about automaticity and …
Bodily Information and Top-Down Affective Priming Jointly Affect the Processing of Fearful Faces
TLDR
The findings indicate that the processing of fearful faces is jointly influenced by bottom-up interoceptive states and by top-down affective primes that are congruent with the emotion.
Attentional biases using the body in the crowd task: Are angry body postures detected more rapidly?
TLDR
These findings are the first to establish threat biases using body postures in a visual search paradigm and demonstrate evidence of delayed disengagement from threat.
Recognition advantage of happy faces in extrafoveal vision: Featural and affective processing
Happy, surprised, disgusted, angry, sad, fearful, and neutral facial expressions were presented extrafoveally (2.5° away from fixation) for 150 ms, followed by a probe word for recognition.
On the flexibility of social source memory: a test of the emotional incongruity hypothesis.
TLDR
Focusing on expectancy-incongruent information may represent a more efficient, general, and hence more adaptive memory strategy for remembering exchange-relevant information than focusing only on cheaters.
Attentional bias during emotional processing: Behavioral and electrophysiological evidence from an Emotional Flanker Task
TLDR
The observed spatiotemporal dynamics are consistent with the adaptive role attributed to threat-related attentional bias: behavioural and ERP responses from four task conditions were accompanied by significant modulations of ERP activity across all time windows and regions of interest, and showed meaningful correlations.
The Role of Emotional Content and Perceptual Saliency During the Programming of Saccades Toward Faces
TLDR
It is suggested that there is no automatic prioritization of emotional faces, at least for saccades with short latencies, but that salient local face features can automatically attract attention.
Emotional Modulation of Attention: Fear Increases but Disgust Reduces the Attentional Blink
TLDR
Processing fearful faces was found to impair detection of T2 to a greater extent than did processing disgusted faces, implying emotion-specific modulation of attention.

References

SHOWING 1-10 OF 30 REFERENCES
Looking for foes and friends: perceptual and emotional factors when finding a face in the crowd.
TLDR
In a face-in-the-crowd setting, visual search for photographically reproduced happy, angry, and fearful target faces among neutral distractor faces was examined; the anger advantage was most pronounced for highly socially anxious individuals when their social fear was experimentally enhanced.
EMPATH: A Neural Network that Categorizes Facial Expressions
TLDR
This article shows that a simple yet biologically plausible neural network model, trained to classify facial expressions into six basic emotions, predicts data that have been used to support two competing theories of facial expression recognition.
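
As a rough illustration of the kind of simple feed-forward classifier described above, the Python sketch below maps a flattened face image to one of six basic emotions. It is not the actual EMPATH model: the input resolution, hidden-layer size, and weights (random and untrained here) are all hypothetical placeholders.

# Illustrative sketch only; NOT the published EMPATH architecture.
# Assumes grayscale faces flattened into pixel vectors and six emotion classes.
import numpy as np

rng = np.random.default_rng(0)

N_PIXELS = 64 * 64                       # hypothetical input resolution
N_HIDDEN = 50                            # small hidden layer for a "simple" model
EMOTIONS = ["happiness", "sadness", "fear", "anger", "surprise", "disgust"]

# Randomly initialised weights stand in for a trained model.
W1 = rng.normal(0.0, 0.01, (N_PIXELS, N_HIDDEN))
W2 = rng.normal(0.0, 0.01, (N_HIDDEN, len(EMOTIONS)))

def classify(face_pixels):
    """Feed-forward pass: pixels -> hidden layer (tanh) -> softmax over six emotions."""
    hidden = np.tanh(face_pixels @ W1)
    logits = hidden @ W2
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return EMOTIONS[int(np.argmax(probs))]

# Example call with a placeholder "image"; a real model would be trained on labelled faces.
print(classify(rng.random(N_PIXELS)))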
Finding the face in the crowd: an anger superiority effect.
TLDR
Three experiments documented an asymmetry in the processing of emotionally discrepant faces embedded in crowds, suggesting that threatening faces pop out of crowds, perhaps as a result of a preattentive, parallel search for signals of direct threat.
The importance of low spatial frequency information for recognising fearful facial expressions
TLDR
Analysis of the statistical properties of LSF components compared with HSF and intact faces shows that the LSF components in faces, which are typically extracted rapidly by the visual system, provide a better source of information than HSF components for the correct categorisation of fearful expressions.
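
For concreteness, one common way to separate low spatial frequency (LSF) from high spatial frequency (HSF) content in a face image is a Gaussian low-pass filter, taking the residual as the high-pass component. The sketch below illustrates only that general technique, not the authors' exact filtering pipeline; the synthetic image and the cutoff value (sigma) are placeholders.

# Illustrative LSF/HSF split via Gaussian low-pass filtering; parameters are hypothetical.
import numpy as np
from scipy.ndimage import gaussian_filter

def split_spatial_frequencies(image, sigma=8.0):
    """Return (LSF, HSF) versions of a 2-D grayscale image.

    A larger sigma keeps only coarser (lower-frequency) structure in the LSF image;
    the HSF image is the fine-detail residual.
    """
    lsf = gaussian_filter(image.astype(float), sigma=sigma)
    hsf = image.astype(float) - lsf
    return lsf, hsf

# Example with a random 128x128 array standing in for a face photograph.
face = np.random.default_rng(1).random((128, 128))
lsf, hsf = split_spatial_frequencies(face, sigma=8.0)
print(lsf.shape, hsf.shape)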
Organization of face and object recognition in modular neural network models
The face in the crowd revisited: a threat advantage with schematic stimuli.
TLDR
Threatening angry faces were more quickly and accurately detected than were other negative faces (sad or "scheming"), which suggests that the threat advantage can be attributed to threat rather than to the negative valence or the uniqueness of the target display.
It Takes a Confounded Face to Pop Out of a Crowd
TLDR
The angry face in Hansen and Hansen's experiments may have popped out from a crowd of happy faces because of a contrast artifact inadvertently introduced when they created their stimuli.
Is there universal recognition of emotion from facial expression? A review of the cross-cultural studies.
TLDR
Facial expressions and emotion labels are probably associated, but the association may vary with culture and is loose enough to be consistent with various alternative accounts, 8 of which are discussed.
More about the Difference between Men and Women: Evidence from Linear Neural Networks and the Principal-Component Approach
TLDR
It is shown that performance comparable to that of the measurement-based models can be achieved with pixel-based input when the data are preprocessed, and that, although the hair contributes to the sex-classification process, it is not the only important contributor.
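
The general approach described above, preprocessed pixel inputs projected onto principal components and fed to a linear classifier, can be sketched as follows. This is not the authors' exact model: the data are random placeholders standing in for real face images, and the number of components is arbitrary.

# Illustrative principal-component pipeline for pixel-based sex classification;
# the data and dimensions below are placeholders, not real face images.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(2)
X = rng.random((40, 64 * 64))            # 40 hypothetical flattened face images
y = np.array([0, 1] * 20)                # hypothetical male/female labels

# Principal components play the role of "eigenface"-style features; logistic
# regression on top stands in for a linear network output layer.
model = make_pipeline(PCA(n_components=10), LogisticRegression(max_iter=1000))
model.fit(X, y)
print(model.score(X, y))                 # accuracy on the placeholder training data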