This paper presents a framework for the analysis of affective behavior starting from a reduced set of visual features related to human upper-body movements. The main goal is to identify a minimal representation of emotional displays based on nonverbal gesture features. The GEMEP (Geneva Multimodal Emotion Portrayals) corpus was used to validate this…
This paper presents results of research on algorithms and computational models for real-time analysis of expressive gesture in full-body human movement. As the main concrete outcome of this research, we present a collection of algorithms and related software modules for the EyesWeb open architecture (freely available from…)
This paper presents ongoing research on the modelling of expressive gesture in multimodal interaction and on the development of multimodal interactive systems that explicitly take into account the role of non-verbal expressive gesture in the communication process. From this perspective, a particular focus is on dance and music as first-class conveyors of…
In this paper, we present a new system, the Orchestra Explorer, which enables a novel paradigm for the active experience of sound and music content. The Orchestra Explorer allows users to physically navigate inside a virtual orchestra, to actively explore the music piece the orchestra is playing, and to modify and mold the sound and music content in real time through…