Cognitive Behaviour Analysis Based on Facial Information Using Depth Sensors

  Juan Manuel Fernández Montenegro, Barbara Villarini, Athanasios Gkelias and Vasileios Argyriou
Cognitive behaviour analysis is considered of high importance with many innovative applications in a range of sectors including healthcare, education, robotics and entertainment. In healthcare, cognitive and emotional behaviour analysis helps to improve the quality of life of patients and their families. Amongst all the different approaches for cognitive behaviour analysis, significant work has been focused on emotion analysis through facial expressions using depth and EEG data. Our work… 
Characterizing the State of Apathy with Facial Expression and Motion Analysis
A novel machine learning framework to classify apathetic and non-apathetic patients based on analysis of facial dynamics, encompassing both emotion and facial movement; the fusion of emotions and local facial motion is shown to produce the best feature set for apathy classification.
A Spatio-Temporal Approach for Apathy Classification
This work proposes a novel spatio-temporal framework for apathy classification, streamlined to analyze facial dynamics and emotion in videos, and shows that fusing characteristics such as emotion and facial dynamics in the proposed deep bi-directional GRU achieves an accuracy of 95.34% in apathy classification.
Alzheimer's Disease Diagnosis Based on Cognitive Methods in Virtual Environments and Emotions Analysis
Novel AD screening tests based on virtual environments, using new immersive technologies combined with advanced Human Computer Interaction (HCI) systems, are introduced, and the use of emotion recognition to analyse AD symptoms is proposed.


Facial emotion recognition using depth data
  • M. Szwoch, P. Pieniazek
  • Computer Science
    2015 8th International Conference on Human System Interaction (HSI)
  • 2015
An original approach to facial expression and emotion recognition is presented, based solely on the depth channel of the Microsoft Kinect sensor; it can be used to support other algorithms based on the optical channel, as well as those using skeleton or face-tracking information.
Analysis of EEG Signals and Facial Expressions for Continuous Emotion Detection
The effect of the contamination of EEG signals by facial muscle activity is analyzed, and it is found that most of the emotionally valuable content in EEG features results from this contamination; however, statistical analysis showed that EEG signals still carry complementary information in the presence of facial expressions.
Emotion recognition based on a novel triangular facial feature extraction method
Using the feature points extracted by MASM, two methods, one based on statistical analysis and the other derived from a genetic algorithm, are proposed to extract an optimal set of triangular facial features for emotion recognition.
OpenFace: An open source facial behavior analysis toolkit
OpenFace is the first open source tool capable of facial landmark detection, head pose estimation, facial action unit recognition, and eye-gaze estimation and allows for easy integration with other applications and devices through a lightweight messaging system.
Real time facial expression recognition in video using support vector machines
A real-time approach to emotion recognition through facial expressions in live video is presented, employing an automatic facial feature tracker for face localization and feature extraction, and the method is evaluated in terms of recognition accuracy.
EEG-Based Emotion Recognition Using Statistical Measures and Auto-Regressive Modeling
A novel approach to the classification of human emotions based on statistically weighted autoregressive modeling of electroencephalogram (EEG) signals is discussed and shown to be more efficient than existing algorithms.
DEAP: A Database for Emotion Analysis ;Using Physiological Signals
A multimodal data set for the analysis of human affective states is presented, and a novel method for stimuli selection is proposed using retrieval by affective tags from the last.fm website, video highlight detection, and an online assessment tool.
Robust continuous prediction of human emotions using multiscale dynamic cues
This paper details a response to the Audio/Visual Emotion Challenge (AVEC'12), whose goal is to continuously predict four affective signals describing human emotions (namely valence, arousal, expectancy and power), and proposes a particularly fast regressor-level fusion framework for merging systems based on different modalities.
Emotion Recognition from Brain Signals Using Hybrid Adaptive Filtering and Higher Order Crossings Analysis
Experimental results show that the HAF-HOC scheme clearly surpasses previous approaches to emotion recognition from brain signals, discriminating up to six distinct emotions with classification rates of up to 85.17 percent.
Continuous Prediction of Spontaneous Affect from Multiple Cues and Modalities in Valence-Arousal Space
This paper proposes an output-associative fusion framework that incorporates correlations and covariances between the emotion dimensions, shows that on average BLSTM-NNs outperform SVR due to their ability to learn past and future context, and demonstrates that the proposed system reproduces well the valence and arousal ground truth obtained from human coders.