Corpus ID: 235446577

Toward Affective XAI: Facial Affect Analysis for Understanding Explainable Human-AI Interactions

Authors: Luke M. Guerdan, Alex Raymond, H. Gunes
As machine learning approaches are increasingly used to augment human decision-making, eXplainable Artificial Intelligence (XAI) research has explored methods for communicating system behavior to humans. However, these approaches often fail to account for the emotional responses of humans as they interact with explanations. Facial affect analysis, which examines human facial expressions of emotions, is one promising lens for understanding how users engage with explanations. Therefore, in this…



Affect Detection: An Interdisciplinary Review of Models, Methods, and Their Applications
  • R. Calvo, S. D'Mello
  • Psychology, Computer Science
  • IEEE Transactions on Affective Computing
  • 2010
This survey explicitly explores the multidisciplinary foundation that underlies all affective computing (AC) applications by describing how AC researchers have incorporated psychological theories of emotion and how these theories affect research questions, methods, results, and their interpretations.
Continuous Prediction of Spontaneous Affect from Multiple Cues and Modalities in Valence-Arousal Space
This paper proposes an output-associative fusion framework that incorporates correlations and covariances between the emotion dimensions. It shows that, on average, BLSTM-NNs outperform SVR due to their ability to learn past and future context, and that the proposed system reproduces well the valence and arousal ground truth obtained from human coders.
The role of emotion in self-explanations by cognitive agents
It is argued that emotions simulated based on cognitive appraisal theory enable the explanation of these emotions, their use as a heuristic to identify beliefs and desires important for the explanation, and the use of emotion words in the explanations themselves.
Inter-rater reliability for emotion annotation in human–computer interaction: comparison and methodological improvements
This paper investigates the inter-rater agreement achieved on emotionally annotated interaction corpora, measured with Krippendorff's alpha, and presents methods to improve reliability. It shows that the reliabilities obtained with the different methods do not differ much, so a choice could rest on other aspects.
Towards Understanding Emotional Experience in a Componential Framework
A componential framework with a data-driven approach to characterise emotional experiences evoked during movie watching suggests that differences between various emotions can be captured by a few latent dimensions, each defined by features associated with component processes.
Reliable facial muscle activation enhances recognizability and credibility of emotional expression.
It is concluded that reliable AUs may indeed convey trustworthy information about emotional processes, but that most of these AUs are likely to be shared by several emotions rather than providing information about specific emotions.
The handbook of emotion elicitation and assessment
Part I, Emotion Elicitation: 1. Emotion Elicitation Using Films. 2. The International Affective Picture System (IAPS) in the Study of Emotion and Attention. 3. The Directed Facial Action Task: Emotional…
Observer-based measurement of facial expression with the Facial Action Coding System.
FACS is regarded by many as the standard measure for facial behavior and is used widely in diverse fields; beyond emotion science, these include facial neuromuscular disorders.
AFEW-VA database for valence and arousal estimation in-the-wild
A new dataset of highly accurate per-frame annotations of valence and arousal for 600 challenging video clips extracted from feature films (also used in part for the AFEW dataset) is proposed, and results show that geometric features perform well independently of the settings.
The Identification of Unfolding Facial Expressions
The results show that a reliable response is possible long before the full facial-expression configuration is reached, suggesting that identification is achieved by integrating individual diagnostic facial actions over time and does not require perceiving the full apex configuration.