Isabelle Hupont

An effective method for the automatic classification of facial expressions into emotional categories is presented. The system classifies the user's facial expression in terms of Ekman's six universal emotions (plus the neutral one), assigning a membership confidence value to each emotional category. The method is capable of analysing any subject, …
The recognition of emotional information is a key step toward giving computers the ability to interact more naturally and intelligently with people. This paper presents a completely automated real-time system for facial expression recognition based on facial feature tracking and a simple emotional classification method. Facial feature tracking uses a …
We present a simple and computationally feasible method for the automatic emotional classification of facial expressions. We propose using 10 characteristic points (a subset of the MPEG-4 feature points) to extract relevant emotional information (essentially five distances, the presence of wrinkles, and the mouth shape). The method defines and detects the …
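The distance-based idea in this abstract can be sketched in a few lines: take a handful of 2-D facial landmarks, measure inter-point distances, and normalize by face scale. The landmark names, point pairs, and normalization below are illustrative assumptions for this sketch, not the paper's actual feature definitions.

```python
import math

def distance(p, q):
    """Euclidean distance between two 2-D facial feature points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def emotion_features(points):
    """Compute a few illustrative inter-point distances from a dict of
    named facial landmarks, normalized by the inter-ocular distance so
    the features are invariant to face scale.

    `points` maps hypothetical landmark names to (x, y) tuples; the
    names and pairs chosen here are assumptions for illustration only.
    """
    norm = distance(points["left_eye"], points["right_eye"])
    pairs = [
        ("left_brow", "left_eye"),      # brow raise
        ("right_brow", "right_eye"),
        ("mouth_left", "mouth_right"),  # mouth width
        ("mouth_top", "mouth_bottom"),  # mouth opening
        ("nose_tip", "mouth_top"),      # nose-to-lip distance
    ]
    return [distance(points[a], points[b]) / norm for a, b in pairs]
```

Normalizing by the inter-ocular distance is one common way to make such features comparable across faces of different sizes and camera distances.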
Affective content annotations are typically acquired from subjective manual assessments by experts in supervised laboratory tests. While manageable, such campaigns are expensive and time-consuming, and their results may not be generalizable to larger audiences. Crowdsourcing constitutes a promising approach for quickly collecting data with wide demographic …
Self-reported metrics collected in crowdsourcing experiments do not always match actual user behaviour. In laboratory studies, therefore, visual attention, the human capability to selectively process the visual information one is confronted with, is traditionally measured by means of eye trackers. Visual attention has not been …
The recognition of emotional information is a key step toward giving computers the ability to interact more naturally and intelligently with people. We present a simple and computationally feasible method to perform automatic emotional classification of facial expressions. We propose the use of a set of characteristic facial points (that are part of the …
The capability of perceiving and expressing emotions through different modalities is a key issue for the enhancement of human-computer interaction. In this paper we present a novel architecture for the development of intelligent multimodal affective interfaces. It is based on the integration of Sentic Computing, a new opinion mining and sentiment analysis …
The interpretation of user facial expressions is a very useful method for emotional sensing, and it constitutes an indispensable part of affective Human-Computer Interface design. Facial expressions are often classified into one of several basic emotion categories. This categorical approach seems too limited to handle faces with blended emotions, as well as to …
The success of affective interfaces lies in the fusion of emotional information coming from different modalities. This paper proposes a scalable methodology for fusing multiple affect-sensing modules, allowing the subsequent addition of new modules without having to retrain the existing ones. It relies on a 2-dimensional affective model and is able to …
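The scalability argument in this abstract rests on a shared 2-D affective space: if every module emits a point in that plane, fusing modules reduces to combining points, and adding a module adds a point rather than forcing retraining. A minimal sketch under that assumption (the confidence-weighted mean used here is an illustrative fusion rule, not the paper's actual method):

```python
def fuse_modalities(readings):
    """Fuse per-modality affect estimates, each given as a
    (valence, arousal, confidence) triple, by taking the
    confidence-weighted mean in the 2-D affective plane.

    Adding a new modality means appending one more triple;
    no existing module needs retraining.
    """
    total = sum(c for _, _, c in readings)
    if total == 0:
        return (0.0, 0.0)  # neutral point when no module is confident
    valence = sum(v * c for v, _, c in readings) / total
    arousal = sum(a * c for _, a, c in readings) / total
    return (valence, arousal)
```

For example, a confident facial-expression module reporting high valence would dominate a less confident speech module, while the fused estimate still moves toward whatever the weaker module contributes.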