Anthony P. Atkinson

Research on emotion recognition has been dominated by studies of photographs of facial expressions. A full understanding of emotion perception and its neural substrate will require investigations that employ dynamic displays and means of expression other than the face. Our aims were: (i) to develop a set of dynamic and static whole-body expressions of basic …
Disproportionate inversion decrements for recognizing faces and other homogeneous stimuli are often interpreted as evidence that experts use relational features to recognize stimuli that share a configuration. However, it has never been directly shown that inversion disrupts the coding of relational features more than isolated features. Here we report three …
Basic emotional states (such as anger, fear, and joy) can be similarly conveyed by the face, the body, and the voice. Are there human brain regions that represent these emotional mental states regardless of the sensory cues from which they are perceived? To address this question, in the present study participants evaluated the intensity of emotions …
Recent research has confirmed that individuals with autism spectrum disorder (ASD) have difficulties in recognizing emotions from body movements. Difficulties in perceiving coherent motion are also common in ASD. Yet it is unknown whether these two impairments are related. Thirteen adults with ASD and 16 age- and IQ-matched typically developing (TD) adults …
Face processing relies on a distributed, patchy network of cortical regions in the temporal and frontal lobes that respond disproportionately to face stimuli, other cortical regions that are not even primarily visual (such as somatosensory cortex), and subcortical structures such as the amygdala. Higher-level face perception abilities, such as judging …
The importance of kinematics in emotion perception from body movement has been widely demonstrated. Evidence also suggests that the perception of biological motion relies to some extent on information about spatial and spatiotemporal form, yet the contribution of such form-related cues to emotion perception remains unclear. This study reports, for the first …
BACKGROUND: Previous behavioural and neuroimaging studies of emotion processing in autistic spectrum disorder (ASD) have focused on the use of facial stimuli. To date, however, no studies have examined emotion processing in autism across a broad range of social signals. METHOD: This study addressed this issue by investigating emotion processing in a group …
Seven experiments investigated the finding that threatening schematic faces are detected more quickly than nonthreatening faces. Threatening faces with v-shaped eyebrows (angry and scheming expressions) were detected more quickly than nonthreatening faces with inverted v-shaped eyebrows (happy and sad expressions). In contrast to the hypothesis that these …
Emotionally expressive faces have been shown to modulate activation in visual cortex, including face-selective regions in the ventral temporal lobe. Here, we tested whether emotionally expressive bodies similarly modulate activation in body-selective regions. We show that dynamic displays of bodies with various emotional expressions vs. neutral bodies produce …
Previous research with speeded-response interference tasks modeled on the Garner paradigm has demonstrated that task-irrelevant variations in either emotional expression or facial speech do not interfere with identity judgments, but irrelevant variations in identity do interfere with expression and facial speech judgments. Sex, like identity, is a …