This paper presents a framework for the analysis of affective behavior starting from a reduced amount of visual information related to human upper-body movements. The main goal is to identify a minimal representation of emotional displays based on nonverbal gesture features. The GEMEP (Geneva multimodal emotion portrayals) corpus was used to validate this …
This paper presents some results of a research work concerning algorithms and computational models for real-time analysis of expressive gesture in full-body human movement. As a main concrete result of our research work, we present a collection of algorithms and related software modules for the EyesWeb open architecture (freely available from …)
This paper presents ongoing research on the modelling of expressive gesture in multimodal interaction and on the development of multimodal interactive systems explicitly taking into account the role of non-verbal expressive gesture in the communication process. In this perspective, a particular focus is on dance and music as first-class conveyors of …
The paper aims at (i) understanding expressiveness in gestures using computational modeling and (ii) exploiting this understanding in artistic applications, where the enhancement of expressiveness in interactive music/dance/video systems is a major goal. A multi-layered conceptual framework is presented, and examples are given of its use in interactive art …
Endowing search engines with multimodal content indexing, sharing, and retrieval is a research challenge for the ICT community. This paper introduces a use case exploiting embodied cooperation as a paradigm for formulating social queries. It focuses on assessing users' experience with this use case and on the design and exploitation of …
This paper presents a conceptual framework for the analysis of expressive qualities of movement. Our perspective is to model an observer of a dance performance. The conceptual framework is made of four layers, ranging from the physical signals that sensors capture to the qualities that movement communicates (e.g., in terms of emotions). The framework aims to …
Imagine a home high-fidelity (hi-fi) music system that not only has the standard controls for volume, treble, bass, balance, and so forth, but also features "expressive knobs" possibly controlled by your movement (such as dancing) in your living room. The system lets you actively listen to, say, a Chopin piece, by changing the agogics, that is, the music …