The SEMAINE Database: Annotated Multimodal Records of Emotionally Colored Conversations between a Person and a Limited Agent

@article{McKeown2012TheSD,
  title={The SEMAINE Database: Annotated Multimodal Records of Emotionally Colored Conversations between a Person and a Limited Agent},
  author={Gary McKeown and Michel F. Valstar and Roddy Cowie and Maja Pantic and Marc Schr{\"o}der},
  journal={IEEE Transactions on Affective Computing},
  year={2012},
  volume={3},
  pages={5-17}
}
SEMAINE has created a large audiovisual database as part of an iterative approach to building Sensitive Artificial Listener (SAL) agents that can engage a person in a sustained, emotionally colored conversation. We then recorded user interactions with the developed system, Automatic SAL, comparing the most communicatively competent version to versions with reduced nonverbal skills.

Citations

Opinions in Interactions : New Annotations of the SEMAINE Database

A new baseline for the detection of opinions in interactions is proposed, slightly improving a state-of-the-art model with RoBERTa embeddings, and the results obtained on the database are promising.
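As an illustration of that kind of pipeline, here is a minimal sketch of extracting an utterance-level RoBERTa embedding with the HuggingFace transformers library; the roberta-base checkpoint and the mean-pooling step are illustrative assumptions, not details taken from the cited paper.

# Minimal sketch: utterance-level RoBERTa embedding.
# Assumptions (not from the cited paper): roberta-base checkpoint,
# mean pooling over token states as the utterance representation.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModel.from_pretrained("roberta-base")
model.eval()

def utterance_embedding(text: str) -> torch.Tensor:
    """Return a fixed-size embedding for one utterance (mean over tokens)."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # shape (1, seq_len, 768)
    return hidden.mean(dim=1).squeeze(0)            # shape (768,)

vec = utterance_embedding("Well, that went better than I expected!")
print(vec.shape)  # torch.Size([768])

A downstream opinion detector would then classify these per-utterance vectors; the pooling choice here is only one common option.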

Building Autonomous Sensitive Artificial Listeners

A fully autonomous integrated real-time system which combines incremental analysis of user behavior, dialogue management, and synthesis of speaker and listener behavior of a SAL character displayed as a virtual agent is described.

MSP-IMPROV: An Acted Corpus of Dyadic Interactions to Study Emotion Perception

The MSP-IMPROV corpus, a multimodal emotional database, is presented; its goal is to retain control over lexical content and emotion while also promoting naturalness in the recordings, leveraging the large size of the audiovisual database.

SEWA DB: A Rich Database for Audio-Visual Emotion and Sentiment Research in the Wild

The SEWA database, comprising more than 2,000 minutes of audio-visual data from 398 people across six cultures, 50 percent female and uniformly spanning the ages of 18 to 65, is introduced; it is expected to push forward research in human behaviour analysis, including cultural studies.

Building a naturalistic emotional speech corpus by retrieving expressive behaviors from existing speech corpora

This study uses the IEMOCAP and SEMAINE databases to build emotion detection systems, which are then applied to identify emotional behaviors in the FISHER database, a large conversational speech corpus recorded over the phone.

Building autonomous sensitive artificial listeners (Extended abstract)

A fully autonomous integrated real-time system which combines incremental analysis of user behaviour, dialogue management, and synthesis of speaker and listener behaviour of a SAL character displayed as a virtual agent is described.

A User Perception--Based Approach to Create Smiling Embodied Conversational Agents

A computational model, based on a corpus of users’ perceptions of smiling and nonsmiling virtual agents, enables a virtual agent to determine the appropriate smiling behavior to adopt given the interpersonal stance it wants to express.

How is emotion change reflected in manual and automatic annotations of different modalities

The SEMAINE database consists of recordings of people talking to different virtual characters; it is investigated whether automatic emotion recognition tools assign the same emotions to these people as the manual annotations do.

Introducing the RECOLA multimodal corpus of remote collaborative and affective interactions

RECOLA, a new multimodal corpus of spontaneous collaborative and affective interactions in French, is presented and is being made available to the research community; self-report measures were taken from users during task completion.

RoomReader: A Multimodal Corpus of Online Multiparty Conversational Interactions

We present RoomReader, a corpus of multimodal, multiparty conversational interactions in which participants followed a collaborative student-tutor scenario designed to elicit spontaneous speech.
...

References

Showing 1-10 of 55 references

IEMOCAP: interactive emotional dyadic motion capture database

A new corpus, the “interactive emotional dyadic motion capture database” (IEMOCAP), collected by the Speech Analysis and Interpretation Laboratory at the University of Southern California (USC), is described; it provides detailed information about the actors’ facial expressions and hand movements during scripted and spontaneous spoken communication scenarios.

Building Autonomous Sensitive Artificial Listeners

A fully autonomous integrated real-time system which combines incremental analysis of user behavior, dialogue management, and synthesis of speaker and listener behavior of a SAL character displayed as a virtual agent is described.

The Sensitive Artificial Listner: an induction technique for generating emotionally coloured conversation

The aim of the paper is to document and share an induction technique (the Sensitive Artificial Listener) that generates data that can be both tractable and reasonably naturalistic.

A new emotion database: considerations, sources and scope

Research on the expression of emotion is underpinned by databases. Reviewing available resources persuaded us of the need to develop one that prioritised ecological validity.

Perceiving emotion: towards a realistic understanding of the task

R. Cowie, Philosophical Transactions of the Royal Society B: Biological Sciences, 2009

A decade ago, perceiving emotion was generally equated with taking a sample that unquestionably signified an archetypal emotional state, and attaching the appropriate label, but computational research has shifted that paradigm in multiple ways.

EmoTV1: Annotation of Real-life Emotions for the Specification of Multimodal Affective Interfaces

2005

The development of future multimodal affective interfaces such as believable Embodied Conversational Agents requires modeling the relations between natural emotions and multimodal behaviors.

Multimodal user’s affective state analysis in naturalistic interaction

This paper describes a multi-cue, dynamic approach to detecting emotion in naturalistic video sequences, where input is taken from nearly real-world situations, in contrast to the controlled recording conditions of audiovisual material.

The Vera am Mittag German audio-visual emotional speech database

This contribution presents a recently collected database of spontaneous emotional speech in German, which is being made available to the research community and provides emotion labels for a large part of the data.

FEELTRACE: an instrument for recording perceived emotion in real time

FEELTRACE has resolving power comparable to an emotion vocabulary of 20 non-overlapping words, with the advantage of allowing intermediate ratings and, above all, the ability to track impressions continuously.
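The data such an instrument yields can be pictured as a timestamped stream of coordinates in the activation-evaluation plane. The following is a minimal, hypothetical sketch of writing such a trace; the 10 Hz sampling rate, value range, and CSV layout are assumptions for illustration, not the FEELTRACE specification.

# Minimal sketch of a FEELTRACE-style continuous rating trace.
# Assumptions (not from the paper): 10 Hz sampling, values in [-1, 1],
# CSV layout with time/activation/evaluation columns.
import csv
import math

SAMPLE_RATE_HZ = 10  # assumed sampling rate

def synthetic_cursor(t: float) -> tuple[float, float]:
    """Stand-in for the rater's cursor position at time t (seconds).
    Here: a slow drift from passive-negative toward active-positive."""
    activation = math.tanh(0.1 * t - 1.0)   # -1 (passive) .. +1 (active)
    evaluation = math.tanh(0.05 * t - 0.5)  # -1 (negative) .. +1 (positive)
    return activation, evaluation

with open("trace.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["time_s", "activation", "evaluation"])
    for i in range(30 * SAMPLE_RATE_HZ):    # 30 seconds of ratings
        t = i / SAMPLE_RATE_HZ
        a, e = synthetic_cursor(t)
        writer.writerow([f"{t:.1f}", f"{a:.3f}", f"{e:.3f}"])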

Using Actor Portrayals to Systematically Study Multimodal Emotion Expression: The GEMEP Corpus

It is argued in this paper that, in view of the needs of current research programs in this field, well-designed corpora of acted emotion portrayals can play a useful role.
...