Nicolas Ech Chafai

Since the beginning of the SAIBA effort to unify key interfaces in the multi-modal behavior generation process, the Behavior Markup Language (BML) has gained ground as an important component in many projects worldwide and continues to undergo further refinement. This paper reports on the progress made in the last year in further developing BML. It …
At ATR, we are collecting and analysing `meetings' data using a table-top sensor device consisting of a small 360-degree camera surrounded by an array of high-quality directional microphones. This equipment provides a stream of information about the audio and visual events of the meeting, which is then processed to form a representation of the verbal and …
In this paper, we propose a study of co-verbal gesture properties that could enhance the animation of an Embodied Conversational Agent and its communicative performance. This work is based on the analysis of gesture expressivity over time, which we have studied in a corpus of 2D animations. First results point to two types of modulations in gesture …
The animation of an ECA, for a large range of animation systems, implies that the behaviour of this ECA is encoded in a representation language specifying a form of realization for speech, prosody, facial expressions, gaze, head and torso movements, gestures, and so on. For gestures in particular, some representation languages already exist that are suited for a …
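Representation languages of this kind are typically XML-based. As an illustrative sketch only (the element and attribute names follow the published BML drafts, but this exact fragment and its identifiers are hypothetical), a behaviour specification might synchronize a gesture and a gaze shift with a point in the speech:

```xml
<bml id="bml1">
  <!-- spoken utterance containing a named synchronization point -->
  <speech id="s1">
    <text>Take a look <sync id="tm1"/> over there.</text>
  </speech>
  <!-- deictic gesture whose stroke aligns with that sync point -->
  <gesture id="g1" lexeme="POINT" stroke="s1:tm1"/>
  <!-- gaze shift toward a (hypothetical) scene object, starting at the same point -->
  <gaze id="z1" target="object1" start="s1:tm1"/>
</bml>
```

The key design idea is that each behaviour exposes standard sync points (start, stroke, end, …) that other behaviours can reference, so cross-modal timing is declared rather than hand-animated.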
In this paper we describe our work toward the creation of affective multimodal virtual characters endowed with communicative and other socially significant capabilities. While much work in modern game AI has focused on issues such as path-finding and squad-level AI, more highly-detailed behaviour for small groups of interacting game characters has been …
In this paper we propose a study of co-verbal gesture expressivity during a conversational interaction. The work is based on the analysis of gesture expressivity over time that we have conducted on two clips of 2D animations. The first results point to two types of modulations in gesture expressivity that we relate to the rhetorical functions of the …
In this research we investigate the relationship between emotion and cooperation in map task dialogues. It is an area where many questions remain unsolved. One of the main open issues is the labeling of “blended” emotions, their annotation and recognition. Usually there is low agreement among raters in “giving a name” to emotions. Moreover, …
To maintain the user's interest in a human-machine interaction using a virtual agent, we endowed this agent with communicative and interactive abilities and allowed it to communicate socially and emotionally with the user. To this end, we present different models, ranging from giving the agent the ability to perceive its environment to the ability to display …