Corpus ID: 233181838

Py-Feat: Python Facial Expression Analysis Toolbox

@article{Cheong2021PyFeatPF,
  title={Py-Feat: Python Facial Expression Analysis Toolbox},
  author={J. H. Cheong and Tiankang Xie and Sophie Byrne and Luke J. Chang},
  journal={ArXiv},
  year={2021},
  volume={abs/2104.03509}
}
Studying facial expressions is a notoriously difficult endeavor. Recent advances in the field of affective computing have yielded impressive progress in automatically detecting facial expressions from pictures and videos. However, much of this work has yet to be widely disseminated in social science domains such as psychology. Current state-of-the-art models require considerable domain expertise that is not traditionally incorporated into social science training programs. Furthermore, there is…
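
To give a concrete sense of what the toolbox makes possible, the sketch below runs the detection pipeline on a single image. It is a minimal illustration rather than the canonical API: it assumes a py-feat release contemporary with this paper in which the Detector class and its detect_image method are available, the image path is a placeholder, and model-name options and output accessors vary across releases, so the project documentation should be consulted for specifics.

# Minimal sketch: detect faces, landmarks, action units, and emotions with py-feat.
# Assumes `pip install py-feat`; "my_photo.jpg" is a placeholder path.
from feat import Detector

# With no arguments, Detector loads the library's default face, landmark,
# action-unit, and emotion models (the specific defaults depend on the release).
detector = Detector()

# detect_image returns a Fex object, a pandas-DataFrame-like table with one row
# per detected face and columns for the face box, landmark coordinates,
# action-unit activations, and emotion probabilities.
predictions = detector.detect_image("my_photo.jpg")
print(predictions.head())

Videos can be processed analogously, frame by frame, with detector.detect_video, and the resulting table can be analyzed with ordinary pandas operations, which is the main appeal for social science workflows.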
5 Citations

Tensor-based Emotion Editing in the StyleGAN Latent Space
TLDR
It is concluded that the tensor-based model is well suited for emotion and yaw editing, i.e., that the emotion or yaw rotation of a novel face image can be robustly changed without a significant effect on identity or other attributes in the images.
Viewpoint Robustness of Automated Facial Action Unit Detection Systems
Automatic facial action detection is important, but no previous studies have evaluated pre-trained models on the accuracy of facial action detection as the angle of the face changes from frontal to…
Recovering individual emotional states from sparse ratings using collaborative filtering
TLDR
This work introduces and validates a new approach in which responses are sparsely sampled and the missing data are recovered using a computational technique known as collaborative filtering (CF), which leverages structured covariation across individual experiences and is available in Neighbors, an open-source Python toolbox.
Taking a Computational Cultural Neuroscience Approach to Study Parent-Child Similarities in Diverse Cultural Contexts
TLDR
An interdisciplinary computational cultural neuroscience approach is introduced, which uses computational methods to understand the neural and psychological processes involved in parent-child interactions at the intra- and inter-personal levels, in order to better understand cultural transmission processes.
Assessing Automated Facial Action Unit Detection Systems for Analyzing Cross-Domain Facial Expression Databases
TLDR
This study compared the performance of three systems that detect the facial movements corresponding to action units (AUs) derived from the Facial Action Coding System, characterized each system's prediction patterns, and offers guidance for research on facial expressions.

References

SHOWING 1-10 OF 102 REFERENCES
Regret Induces Rapid Learning from Experience-based Decisions: A Model-based Facial Expression Analysis Approach
Regret, an emotion comprising both a counterfactual, cognitive component and a negative, affective component, is one of the most commonly experienced emotions involved in decision making. …
Facial Expression Recognition Using Residual Masking Network
TLDR
A novel Masking Idea is proposed to boost the performance of CNNs on the facial expression recognition task: a segmentation network is used to refine the feature maps, enabling the network to focus on relevant information when making decisions.
Facial recognition technology can expose political orientation from naturalistic facial images
TLDR
Political orientation was correctly classified in 72% of liberal–conservative face pairs, remarkably better than chance, human accuracy, or the accuracy afforded by a 100-item personality questionnaire, even when comparing faces across samples.
TinaFace: Strong but Simple Baseline for Face Detection
TLDR
There is no gap between face detection and generic object detection; a strong but simple baseline method for face detection, named TinaFace, is provided, which exceeds most recent face detectors with larger backbones.
Research on Face Detection Technology Based on MTCNN
TLDR
This paper compares the characteristics of two-stage and one-stage detection models and their application to face detection tasks; MTCNN (multi-task convolutional neural network) is analyzed in depth and its implementation principle is introduced in detail.
A data-driven characterisation of natural facial expressions when giving good and bad news
TLDR
A novel protocol is presented for eliciting natural expressions from dynamic faces, using the dimension of emotional valence as a test case, and the efficacy of data-driven approaches for studying how these cues are represented by the perceptual system is demonstrated.
An EEG-Based Multi-Modal Emotion Database with Both Posed and Authentic Facial Actions for Emotion Analysis
TLDR
A new database, built by collecting facial expressions, action units, and EEG signals simultaneously, is developed and released to the research community to advance the state of the art in automatic emotion recognition.
Human and machine validation of 14 databases of dynamic facial expressions
TLDR
The findings suggest that existing databases vary in their ability to signal specific emotions, and thereby face a trade-off between realism and ecological validity on the one hand, and expression uniformity and comparability on the other.
…