Towards a Common Framework for Multimodal Generation: The Behavior Markup Language
TLDR: This paper describes an international effort to unify a multimodal behavior generation framework for Embodied Conversational Agents (ECAs).
Citations: 404 · Influence: 43
The Behavior Markup Language: Recent Developments and Challenges
TLDR: This paper reports on the progress made in the last year in further developing BML.
Citations: 240 · Influence: 28
Bridging the Gap between Social Animal and Unsocial Machine: A Survey of Social Signal Processing
TLDR: This paper is the first survey of the domain that jointly considers its three major aspects, namely, modeling, analysis, and synthesis of social behavior.
Citations: 333 · Influence: 26
Animated conversation: rule-based generation of facial expression, gesture & spoken intonation for multiple conversational agents
TLDR: We describe an implemented system which automatically generates and animates conversations between multiple human-like agents with appropriate and synchronized speech, intonation, facial expressions, and hand gestures.
Citations: 721 · Influence: 25
Generating Facial Expressions for Speech
TLDR: This article reports results from a program that produces high-quality animation of facial expressions and head movements as automatically as possible in conjunction with meaning-based speech synthesis.
Citations: 299 · Influence: 14
Greta: A Simple Facial Animation Engine
TLDR: We present a 3D facial model compliant with MPEG-4 specifications; our aim was the realization of an animated model able to simulate, in a rapid and believable manner, the dynamic aspects of the human face.
Citations: 108 · Influence: 13
The DIT++ taxonomy for functional dialogue markup
TLDR: We present the DIT++ taxonomy of communicative functions, with some of its background and theoretical motivations.
Citations: 82 · Influence: 13
Studies on gesture expressivity for a virtual agent
TLDR: In this paper we describe some of the work we have conducted on behavior expressivity using a set of six parameters that modulate behavior animation.
Citations: 118 · Influence: 10
From Greta's mind to her face: modelling the dynamics of affective states in a conversational embodied agent
TLDR: This paper describes the results of a research project aimed at implementing a 3D Embodied Agent that can be animated in real time and is believable and expressive: that is, able to coherently communicate complex information through the combination and tight synchronisation of verbal and nonverbal signals.
Citations: 282 · Influence: 9
Multimodal expressive embodied conversational agents
TLDR: We present our work toward the creation of a multimodal expressive Embodied Conversational Agent that exhibits nonverbal behaviors synchronized with speech.
Citations: 133 · Influence: 9