The aim of this paper is to develop animated agents that can control multimodal instruction dialogues by monitoring users' behaviors. First, this paper reports on our Wizard-of-Oz experiments; then, using the collected corpus, it proposes a probabilistic model of fine-grained timing dependencies among multimodal communication behaviors: speech, gestures, …