iMiGUE: An Identity-free Video Dataset for Micro-Gesture Understanding and Emotion Analysis

@inproceedings{Liu2021iMiGUEAI,
  title={iMiGUE: An Identity-free Video Dataset for Micro-Gesture Understanding and Emotion Analysis},
  author={Xin Liu and Henglin Shi and Haoyu Chen and Zitong Yu and Xiaobai Li and Guoying Zhao},
  booktitle={2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  year={2021},
  pages={10626-10637}
}
  • Xin Liu, Henglin Shi, Haoyu Chen, Zitong Yu, Xiaobai Li, Guoying Zhao
  • Published 1 June 2021
  • Computer Science
  • 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
We introduce a new dataset for emotional artificial intelligence research: the identity-free video dataset for Micro-Gesture Understanding and Emotion analysis (iMiGUE). Unlike existing public datasets, iMiGUE focuses on nonverbal body gestures without using any identity information, whereas most existing emotion-analysis research relies on sensitive biometric data such as face and speech. Most importantly, iMiGUE focuses on micro-gestures, i.e., unintentional behaviors driven by inner…
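To make the identity-free idea concrete, the sketch below shows one plausible way such data could be organized: each micro-gesture clip stored as a sequence of body-pose keypoints with gesture and emotion labels, rather than raw face or speech recordings. This is a minimal illustration only; the class and field names (MicroGestureClip, keypoints, gesture_label, emotion_label) and the joint count are hypothetical and are not the dataset's actual schema.

from dataclasses import dataclass
import numpy as np

@dataclass
class MicroGestureClip:
    """Hypothetical identity-free representation of one micro-gesture clip.

    Instead of identity-revealing biometrics (raw face video, speech audio),
    only body-pose keypoints over time are kept.
    """
    keypoints: np.ndarray   # shape (T, J, 2): T frames, J joints, (x, y) image coordinates
    gesture_label: int      # micro-gesture category index
    emotion_label: int      # clip-level emotion label (e.g., 0 = negative, 1 = positive)

# Toy example: a 64-frame clip with 17 body joints and random coordinates.
clip = MicroGestureClip(
    keypoints=np.random.rand(64, 17, 2).astype(np.float32),
    gesture_label=3,
    emotion_label=1,
)
print(clip.keypoints.shape, clip.gesture_label, clip.emotion_label)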


References

SHOWING 1-10 OF 102 REFERENCES
Survey on Emotional Body Gesture Recognition
TLDR
It is shown that, for emotion recognition, labelled data is scarce, there is no agreement on clearly defined output spaces, and the representations used are shallow and largely based on naive geometric features.
ARBEE: Towards Automated Recognition of Bodily Expression of Emotion in the Wild
TLDR
The current research proposes a scalable and reliable crowdsourcing approach for collecting in-the-wild perceived emotion data so that computers can learn to recognize human body language, and shows the effectiveness of Laban Movement Analysis features in characterizing arousal.
From individual to group-level emotion recognition: EmotiW 5.0
TLDR
The fifth Emotion Recognition in the Wild (EmotiW) challenge 2017 aims to provide a common benchmarking platform for researchers working on different aspects of affective computing, with a particular focus on evaluating methods in "in the wild" settings.
A study on emotion recognition from body gestures using Kinect sensor
TLDR
A comparison of classification using a binary decision tree, an ensemble of decision trees, k-nearest neighbour, a support vector machine with a radial basis function kernel, and a neural network classifier trained with back-propagation is made, in terms of average classification accuracy and computation time.
Multimodal emotion recognition using deep learning architectures
TLDR
A database of multimodal recordings of actors enacting various expressions of emotion is presented; it consists of audio and video sequences of actors displaying three different intensities of 23 different emotions, along with facial feature tracking, skeletal tracking, and the corresponding physiological data.
Multimodal affective state recognition in serious games applications
  • A. Psaltis, K. Kaza, +4 authors P. Daras
  • Computer Science
    2016 IEEE International Conference on Imaging Systems and Techniques (IST)
  • 2016
TLDR
This paper presents an emotion recognition methodology that uses information extracted from multimodal fusion analysis to identify the affective state of players during gameplay scenarios; the fused approach outperforms all other evaluated classifiers.
A Bimodal Face and Body Gesture Database for Automatic Analysis of Human Nonverbal Affective Behavior
  • H. Gunes, M. Piccardi
  • Computer Science
    18th International Conference on Pattern Recognition (ICPR'06)
  • 2006
TLDR
This paper presents a bimodal database recorded by two high-resolution cameras simultaneously for use in automatic analysis of human nonverbal affective behavior.
A Multimodal Database for Affect Recognition and Implicit Tagging
TLDR
Results show the potential uses of the recorded modalities and the significance of the emotion elicitation protocol; single-modality and modality-fusion results are reported for both the emotion recognition and implicit tagging experiments.
Recognizing emotions from videos by studying facial expressions, body postures and hand gestures
  • Mihai Gavrilescu
  • Psychology
    2015 23rd Telecommunications Forum Telfor (TELFOR)
  • 2015
A system for recognizing emotions from videos by studying facial expressions, hand gestures and body postures is presented. A stochastic context-free grammar (SCFG) containing 8 combinations of hand…
Technique for automatic emotion recognition by body gesture analysis
TLDR
This paper illustrates recent work on the analysis of expressive gestures related to upper-body motion in emotional portrayals performed by professional actors, with reference to the related conceptual issues, the developed techniques, and the obtained results.