Affective computing aims to detect users' mental states, in particular emotions and dispositions, during human-computer interaction. Detection can be achieved by measuring multimodal signals such as speech, facial expressions, and psychobiological responses. Over the past years, one major approach has been to identify the best features for each signal using …
Cognitive-technical intelligence is envisioned to be constantly available and capable of adapting to the user's emotions. However, the question remains: which specific emotions should intelligent systems reliably recognise? In this study, we therefore attempted to identify similarities and differences in emotions between human-human interaction (HHI) and human-computer interaction (HCI) …
BACKGROUND: Research suggests that interaction between humans and digital environments constitutes a form of companionship in addition to technical convenience. To this end, researchers have attempted to design computer systems able to demonstrably empathize with the human affective experience. Facial electromyography (EMG) is one such technique, enabling …
Facial expressions, comprising single facial action units (AUs) or combinations of them, are one of the most important communication channels for conveying affective states in social life. Electromyography (EMG) activity captured over specific facial muscles can be used to interpret a person's affective experience. The present study investigated the effects of …
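As a rough illustration of how facial EMG activity is typically quantified before being related to AUs or affective states, the sketch below computes a windowed root-mean-square (RMS) amplitude envelope for a single EMG channel. The sampling rate, window length, and the use of synthetic data are assumptions for illustration and are not taken from the study above.

import numpy as np

def emg_rms_envelope(signal, fs=1000, win_s=0.5, step_s=0.25):
    # Windowed RMS amplitude of a raw EMG trace.
    # fs: sampling rate in Hz (assumed), win_s: window length in seconds,
    # step_s: hop between successive windows in seconds.
    win = int(win_s * fs)
    step = int(step_s * fs)
    rms = []
    for start in range(0, len(signal) - win + 1, step):
        frame = signal[start:start + win]
        rms.append(np.sqrt(np.mean(frame ** 2)))
    return np.asarray(rms)

# Synthetic data standing in for a facial EMG recording: 10 s at 1 kHz
raw = np.random.randn(10_000) * 0.05
envelope = emg_rms_envelope(raw)
print(envelope.shape)  # one RMS value per 0.25 s hop

Higher RMS in a given window is commonly read as stronger activation of the underlying muscle, which is the link to AU activity suggested in the abstract.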
Recent findings in affective computing have demonstrated that emotion processing and recognition are important for improving the quality of human-computer interaction (HCI). In the present study, new data are presented for a robust discrimination of three emotional states (negative, neutral, and positive) employing two-channel facial electromyography (EMG) over the zygomaticus major …
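To make the two-channel, three-class setup concrete, here is a minimal classification sketch. The feature choice (per-channel RMS and mean absolute value), the classifier (a linear SVM from scikit-learn), the epoch shapes, and the label coding are all assumptions for illustration; the abstract above does not specify the actual pipeline.

import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def features(epoch):
    # Simple per-channel features for one EMG epoch of shape (2, n_samples):
    # RMS and mean absolute value for each of the two channels.
    rms = np.sqrt(np.mean(epoch ** 2, axis=1))
    mav = np.mean(np.abs(epoch), axis=1)
    return np.concatenate([rms, mav])

# Synthetic stand-in data: 150 epochs, 2 channels, 2 s at 1 kHz;
# labels 0 = negative, 1 = neutral, 2 = positive (assumed coding)
rng = np.random.default_rng(0)
epochs = rng.standard_normal((150, 2, 2000))
labels = rng.integers(0, 3, size=150)

X = np.array([features(e) for e in epochs])
clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
print(cross_val_score(clf, X, labels, cv=5).mean())

With real recordings, the epochs array would hold band-pass filtered EMG segments time-locked to the stimuli, and chance level for the three balanced classes would be roughly 0.33.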