Corpus ID: 14212085

Multimodal Emotion Recognition Integrating Affective Speech with Facial Expression

@inproceedings{Zhao2014MultimodalER,
  title={Multimodal Emotion Recognition Integrating Affective Speech with Facial Expression},
  author={Xiaoming Zhao and Shiqing Zhang and Xiaohua Wang and Gang Zhang},
  year={2014}
}
In recent years, emotion recognition has attracted extensive interest in signal processing, artificial intelligence and pattern recognition due to its potential applications in human-computer interaction (HCI). Most previously published works in the field perform emotion recognition using either affective speech or facial expression alone. However, affective speech and facial expression are the two main channels of human emotion expression, as they are the… 

Citations

Human emotion detection through speech and facial expressions
TLDR
A proposed hybrid system using facial expressions and speech is applied to estimate the basic emotions of a person engaged in a conversational session, and showed better performance on the basic emotional classes than the rest.
A Literature Review on Emotion Recognition Using Various Methods
TLDR
This paper explores the relevant significant works, their techniques, the effectiveness of those methods, and the scope for improving the results.
A Review on Facial Expression Recognition: Feature Extraction and Classification
TLDR
This paper is a survey of FER addressing the two most important aspects of designing an FER system: facial feature extraction for static images and dynamic image sequences, and facial expression classification.
Emotion recognition based on multimodal fusion using mixture of brain emotional learning
Decisional-Emotional Support System for a Synthetic Agent : Influence of Emotions in Decision-Making Toward the Participation of Automata in Society
Emotion influences our actions, and this means that emotion has subjective decision value. Emotions, properly interpreted and understood, of those affected by decisions provide feedback to actions ...
Decision-making content of an agent affected by emotional feedback provided by capture of human’s emotions through a Bimodal System
Affective computing allows for widening the view of the complex world in human-machine interaction through the comprehension of emotions, which allows an enriched coexistence of natural interaction

References

Showing 1-10 of 53 references
Human emotion recognition system using optimally designed SVM with different facial feature extraction techniques
TLDR
All six universally recognized basic emotions, namely anger, disgust, fear, happiness, sadness and surprise, along with the neutral state, are recognized in this research.
Speech Emotion Recognition Using an Enhanced Kernel Isomap for Human-Robot Interaction
TLDR
A new nonlinear dimensionality reduction method, called ‘enhanced kernel isometric mapping’ (EKIsomap), is proposed and applied to speech emotion recognition in human-robot interaction; it nonlinearly extracts low-dimensional discriminating embedded data representations from the original high-dimensional speech features, with a striking improvement in performance on speech emotion recognition tasks.
Speech emotion recognition approaches in human computer interaction
TLDR
A wide range of features employed for speech emotion recognition and the acoustic characteristics of those features are presented, and important parameters such as precision, recall, F-measure and recognition rate of the features are analyzed.
Emotion recognition in speech signal: experimental study, development, and application
TLDR
An emotion recognition agent was created that is able to analyze telephone-quality speech signals and distinguish between two emotional states, "agitation" and "calm", with an accuracy of 77%.
Facial expression recognition based on Local Binary Patterns: A comprehensive study
Speech emotion recognition using hidden Markov models
This paper introduces a first approach to emotion recognition using RAMSES, the UPC’s speech recognition system. The approach is based on standard speech recognition technology using hidden
Facial expression recognition based on local binary patterns and local fisher discriminant analysis
TLDR
This paper shows that the presented facial expression recognition method based on LBP and LFDA obtains the best recognition accuracy of 90.7% with 11 reduced features, outperforming other methods such as principal component analysis (PCA), linear discriminant analysis (LDA) and locality preserving projection (LPP).
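The Local Binary Pattern descriptor underlying this and the earlier LBP references can be sketched in a few lines. The sketch below is illustrative only: it implements the basic 3×3, 256-bin form, whereas the cited systems typically use the uniform-pattern variant over a circular neighborhood with block-wise histograms; all function names are my own.

```python
# Minimal sketch of basic Local Binary Patterns (LBP), the texture
# descriptor used by the LBP-based facial expression methods cited above.
# Basic 3x3 form for illustration; real systems usually use the
# uniform-pattern variant with block-wise histograms.

def lbp_code(image, r, c):
    """8-bit LBP code at pixel (r, c): each of the 8 neighbors
    contributes a 1-bit if it is >= the center pixel's value."""
    center = image[r][c]
    # Neighbors visited clockwise starting from the top-left.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for bit, (dr, dc) in enumerate(offsets):
        if image[r + dr][c + dc] >= center:
            code |= 1 << bit
    return code

def lbp_histogram(image):
    """256-bin histogram of LBP codes over all interior pixels;
    this histogram serves as the texture feature vector that a
    reducer (e.g. LFDA) and classifier would then consume."""
    hist = [0] * 256
    for r in range(1, len(image) - 1):
        for c in range(1, len(image[0]) - 1):
            hist[lbp_code(image, r, c)] += 1
    return hist

# Tiny 3x3 example: only the single interior pixel gets a code.
img = [[5, 5, 5],
       [5, 4, 5],
       [5, 5, 5]]
print(lbp_code(img, 1, 1))  # all 8 neighbors >= center, so 255
```

In the cited pipeline, such histograms (computed per image block and concatenated) form the high-dimensional feature that LFDA reduces before classification.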
Facial Expression Recognition Using Sparse Representation
TLDR
Experimental results on two popular facial expression databases demonstrate the promising performance of the presented SRC method on facial expression recognition tasks, outperforming the other methods evaluated.
Local binary patterns for multi-view facial expression recognition
Facial Expression Recognition
TLDR
This chapter describes the problem space for facial expression analysis, which includes multiple dimensions: level of description, individual differences in subjects, transitions among expressions, intensity of facial expression, deliberate versus spontaneous expression, head orientation and scene complexity, image acquisition and resolution, reliability of ground truth, databases, and the relation to other facial behaviors or nonfacial behaviors.