Magnus Nordstrand

We present our current state of development regarding animated agents applicable to affective dialogue systems. A new set of tools is under development to support the creation of animated characters compatible with the MPEG-4 facial animation standard. Furthermore, we have collected a multimodal expressive speech database including video, audio and 3D …
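MPEG-4 facial animation drives a face model through numbered facial animation parameters (FAPs), so tooling in this space typically manipulates per-frame tables of parameter values. As a rough illustration only, here is a minimal Python sketch of such a frame; the `FAPFrame` class and its method are invented for this example and are not the authors' tools.

```python
# Minimal sketch of an MPEG-4 facial animation parameter (FAP) frame.
# The standard defines 68 FAPs; the low-level ones (ids 3-68) take values
# in facial animation parameter units (FAPU) derived from the face geometry.
# The class itself is purely illustrative.
from dataclasses import dataclass, field

@dataclass
class FAPFrame:
    frame_number: int
    # Map from FAP id (3-68) to its value in FAPU; unset FAPs are omitted,
    # mirroring the mask/value split used in textual FAP files.
    values: dict[int, float] = field(default_factory=dict)

    def set_fap(self, fap_id: int, value: float) -> None:
        if not 3 <= fap_id <= 68:
            raise ValueError(f"low-level FAP ids run from 3 to 68, got {fap_id}")
        self.values[fap_id] = value

# Example: open the jaw (FAP 3) and raise both inner eyebrows (FAPs 31, 32).
frame = FAPFrame(frame_number=0)
frame.set_fap(3, 120.0)
frame.set_fap(31, 40.0)
frame.set_fap(32, 40.0)
print(frame.values)
```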
This paper reports the results of a preliminary cross-evaluation experiment run in the framework of the European research project PF-Star, with the twofold aim of evaluating the possibility of exchanging FAP data between the involved sites and of assessing the adequacy of the emotional facial gestures performed by talking heads. The results provide initial …
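When FAP tracks are exchanged between sites, a basic sanity check is to compare two versions of the same track parameter by parameter. The sketch below computes a per-parameter RMSE; the metric and the dict-of-lists layout are assumptions for illustration, not the project's actual evaluation protocol.

```python
# Hypothetical per-FAP comparison of two versions of the same animation track.
import math

def fap_rmse(track_a: dict[int, list[float]],
             track_b: dict[int, list[float]]) -> dict[int, float]:
    """Root-mean-square difference per FAP id, over frames both tracks share."""
    out = {}
    for fap_id in track_a.keys() & track_b.keys():
        a, b = track_a[fap_id], track_b[fap_id]
        n = min(len(a), len(b))
        if n:
            out[fap_id] = math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / n)
    return out

# Made-up tracks standing in for renderings produced at two sites.
site_a = {3: [0.0, 60.0, 120.0], 31: [0.0, 20.0, 40.0]}
site_b = {3: [0.0, 58.0, 118.0], 31: [0.0, 21.0, 40.0]}
print(fap_rmse(site_a, site_b))
```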
We present a high-level formalism for specifying verbal and non-verbal output from a multimodal dialogue system. The output specification is XML-based and provides information about communicative functions of the output without detailing the realisation of these functions. The specification can be used to control an animated character that uses speech and …
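To make the idea of function-level markup concrete, here is a toy XML specification built with Python's standard library. The element and attribute names (`utterance`, `emphasis`, `emotion`) are invented for this sketch and may not match the paper's formalism.

```python
# Toy output specification: communicative functions are named, but their
# realisation (pitch accent, eyebrow raise, head nod, ...) is left open.
import xml.etree.ElementTree as ET

spec = ET.Element("utterance", attrib={"emotion": "happy"})
spec.text = "The meeting is at "
emph = ET.SubElement(spec, "emphasis")
emph.text = "three o'clock"
emph.tail = " tomorrow."

print(ET.tostring(spec, encoding="unicode"))
# <utterance emotion="happy">The meeting is at
#   <emphasis>three o'clock</emphasis> tomorrow.</utterance>
```

The point of such a format is that the same specification can drive different renderers: each one decides for itself how to realise the marked functions.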
This paper describes a method for acquiring data for facial movement analysis and implementation in an animated talking head. We will also show preliminary data on how a number of articulatory and facial parameters for some Swedish vowels vary under the influence of expressiveness in speech and gestures. Primarily, we have been concerned with expressive …
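Measurements of this kind lend themselves to simple per-condition summaries. The sketch below averages a hypothetical lip-opening measure per vowel and expressive mode; the records and field names are placeholders, not the paper's data.

```python
# Group made-up (vowel, mode, measurement) records and report per-cell means.
from collections import defaultdict
from statistics import mean

# (vowel, expressive mode, lip opening in mm) - fabricated placeholder values.
records = [
    ("a", "neutral", 14.2), ("a", "happy", 16.8), ("a", "neutral", 13.9),
    ("i", "neutral", 6.1),  ("i", "happy", 7.4),  ("i", "happy", 7.0),
]

groups: dict[tuple[str, str], list[float]] = defaultdict(list)
for vowel, mode, value in records:
    groups[(vowel, mode)].append(value)

for (vowel, mode), values in sorted(groups.items()):
    print(f"vowel /{vowel}/, {mode}: mean lip opening {mean(values):.1f} mm")
```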