Anthony P. Atkinson

Research on emotion recognition has been dominated by studies of photographs of facial expressions. A full understanding of emotion perception and its neural substrate will require investigations that employ dynamic displays and means of expression other than the face. Our aims were: (i) to develop a set of dynamic and static whole-body expressions of basic …
Recent research has confirmed that individuals with autism spectrum disorder (ASD) have difficulties in recognizing emotions from body movements. Difficulties in perceiving coherent motion are also common in ASD. Yet it is unknown whether these two impairments are related. Thirteen adults with ASD and 16 age- and IQ-matched typically developing (TD) adults …
The importance of kinematics in emotion perception from body movement has been widely demonstrated. Evidence also suggests that the perception of biological motion relies to some extent on information about spatial and spatiotemporal form, yet the contribution of such form-related cues to emotion perception remains unclear. This study reports, for the first …
Basic emotional states (such as anger, fear, and joy) can be similarly conveyed by the face, the body, and the voice. Are there human brain regions that represent these emotional mental states regardless of the sensory cues from which they are perceived? To address this question, in the present study participants evaluated the intensity of emotions …
Emotionally expressive faces have been shown to modulate activation in visual cortex, including face-selective regions in the ventral temporal lobe. Here, we tested whether emotionally expressive bodies similarly modulate activation in body-selective regions. We show that dynamic displays of bodies with various emotional expressions vs neutral bodies produce …
… (NCC) is burgeoning [1–3] (see Box 1), thanks in particular to the increased availability of brain-imaging techniques. As a recent article points out [3], the search for the NCC will undoubtedly increase our understanding of the neural bases of conscious experience. However, the emergence of methodologies that appeal simultaneously to computational modeling …
Face processing relies on a distributed, patchy network of cortical regions in the temporal and frontal lobes that respond disproportionately to face stimuli, other cortical regions that are not even primarily visual (such as somatosensory cortex), and subcortical structures such as the amygdala. Higher-level face perception abilities, such as judging …
Bilateral amygdala lesions impair the ability to identify certain emotions, especially fear, from facial expressions, and neuroimaging studies have demonstrated differential amygdala activation as a function of the emotional expression of faces, even under conditions of subliminal presentation, and again especially for fear. Yet the amygdala's role in …
Previous research with speeded-response interference tasks modeled on the Garner paradigm has demonstrated that task-irrelevant variations in either emotional expression or facial speech do not interfere with identity judgments, but irrelevant variations in identity do interfere with expression and facial speech judgments. Sex, like identity, is a …
Seven experiments investigated the finding that threatening schematic faces are detected more quickly than nonthreatening faces. Threatening faces with v-shaped eyebrows (angry and scheming expressions) were detected more quickly than nonthreatening faces with inverted v-shaped eyebrows (happy and sad expressions). In contrast to the hypothesis that these …