Assessing Facial Expressions in Virtual Reality Environments

Catarina Runa Miranda, Verónica Costa Teixeira Orvalho
Humans rely on facial expressions to transmit information, such as mood and intentions, that is usually not conveyed through verbal communication channels. Recent advances in consumer-level Virtual Reality (VR) (Oculus VR 2014) have shifted the way we interact with each other and with digital media. Today, we can enter a virtual environment and communicate through a 3D character. Hence, to reproduce the users' facial expressions in VR scenarios, we need on-the-fly animation of the… 
1 Citation


Facial Expression Recognition Under Partial Occlusion from Virtual Reality Headsets based on Transfer Learning
  • Bita Houshmand, N. Khan
  • Computer Science
    2020 IEEE Sixth International Conference on Multimedia Big Data (BigMM)
  • 2020
This paper proposes a geometric model to simulate occlusion resulting from a Samsung Gear VR headset that can be applied to existing FER datasets and adopts a transfer learning approach, starting from two pretrained networks, namely VGG and ResNet.
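The summary above describes simulating headset occlusion so that existing FER datasets can be reused for training. As a minimal sketch of that idea (the rectangular mask and the 45% coverage fraction are illustrative assumptions, not the paper's exact geometric model), one can black out the upper facial region of each training image:

```python
import numpy as np

def simulate_hmd_occlusion(image, occlusion_fraction=0.45):
    """Black out the top portion of a face crop to roughly emulate the
    region a VR headset would cover. The rectangle shape and the 45%
    fraction are illustrative assumptions, not the paper's model."""
    occluded = image.copy()
    rows = int(image.shape[0] * occlusion_fraction)
    occluded[:rows, :] = 0  # zero out the upper (eye) region
    return occluded

# Apply to a dummy face crop before feeding it to a FER network.
face = np.random.randint(0, 256, size=(96, 96, 3), dtype=np.uint8)
masked = simulate_hmd_occlusion(face)
```

In a transfer-learning setup like the one described, such masked images would be used to fine-tune a pretrained backbone (e.g. VGG or ResNet) so it learns to classify expressions from the unoccluded lower face.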


References

Facial performance sensing head-mounted display
Proposes a novel HMD that enables real-time 3D facial performance-driven animation suitable for social interactions in virtual worlds, together with a short calibration step that readjusts the Gaussian mixture distribution of the mapping before each use.
Realtime facial animation with on-the-fly correctives
It is demonstrated that using an adaptive PCA model not only improves the fitting accuracy for tracking but also increases the expressiveness of the retargeted character.
Displaced dynamic expression regression for real-time facial tracking and animation
This work presents a fully automatic approach to real-time facial tracking and animation with a single video camera that learns a generic regressor from public image datasets to infer accurate 2D facial landmarks as well as the 3D facial shape from 2D video frames.
3D shape regression for real-time facial animation
A real-time performance-driven facial animation system based on 3D shape regression that learns an accurate, user-specific face alignment model from an easily acquired set of training data, generated from images of the user performing a sequence of predefined facial poses and expressions.
Happy mouth and sad eyes: scanning emotional facial expressions.
Eye-tracking was used to monitor scanning behavior of healthy participants while looking at different facial expressions, and results confirm the relevance of the eyes and mouth in emotional decoding, but they demonstrate that not all facial expressions with different emotional content are decoded equally.
Facial Communicative Signals
Investigates facial communicative signals (head gestures, eye gaze, and facial expressions) as nonverbal feedback in human-robot interaction, along with the human ability to recognize such spontaneous facial feedback.
MPEG-4 Facial Animation: The Standard, Implementation and Applications
The editors place the MPEG-4 FA standard in the historical context of research on facial animation and model-based coding, and provide a brief history of the development of the standard itself.
Does My Face FIT?: A Face Image Task Reveals Structure and Distortions of Facial Feature Representation
The results show that spatial knowledge of one’s own face is remarkably poor, suggesting that face representation may not contribute strongly to self-awareness.
Abstract muscle action procedures for human face animation
Proposes a new way of controlling human face animation and synchronizing speech, along with a methodology for animating the faces of synthetic actors based on three levels: the AMA-procedure level, the expression level, and the script level.
Real-time emotion recognition novel method for geometrical facial features extraction
This work proposes and validates a novel methodology for facial feature extraction to automatically recognize facial emotions, achieving accurate detection and obtaining a processing pipeline that classifies the six basic Ekman emotions in real time, without requiring any manual intervention or prior information about facial traits.
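The entry above describes geometric feature extraction from facial landmarks. A minimal sketch of how such features are typically built (the landmark names and coordinates below are hypothetical, not taken from the paper): distances between landmark pairs are normalized by the inter-ocular distance so the features become invariant to face scale before being passed to a classifier.

```python
import math

# Hypothetical 2D landmarks (x, y) in pixels; the names and values are
# illustrative only, not the paper's landmark set.
landmarks = {
    "left_eye":     (30.0, 40.0),
    "right_eye":    (70.0, 40.0),
    "mouth_left":   (35.0, 80.0),
    "mouth_right":  (65.0, 80.0),
    "mouth_top":    (50.0, 75.0),
    "mouth_bottom": (50.0, 88.0),
}

def dist(a, b):
    """Euclidean distance between two 2D points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def geometric_features(lm):
    """Landmark-pair distances normalized by the inter-ocular distance,
    making the features invariant to the scale of the face in the image."""
    iod = dist(lm["left_eye"], lm["right_eye"])
    return {
        "mouth_width":   dist(lm["mouth_left"], lm["mouth_right"]) / iod,
        "mouth_opening": dist(lm["mouth_top"], lm["mouth_bottom"]) / iod,
    }

features = geometric_features(landmarks)
```

A real-time pipeline of the kind described would recompute such a feature vector per video frame and feed it to an emotion classifier trained on the six basic Ekman emotions.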