SINASense is a novel motion-based interface that serves as an educational application for children with severe or profound cognitive, sensory and physical impairments. The application makes use of computer vision to track the body movements of the user, which in turn trigger meaningful outcomes from the system. In this paper we describe the design…
This paper presents an affective model that determines the emotional state of a character according to its personality traits and experienced emotions. We consider an emotional state to be the layer between personality and emotion. The proposed affective model offers a mapping between emotions and emotional states. To evidence emotional states of a virtual…
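The layered idea above — personality below, momentary emotions on top, and an emotional state mediating between them — can be illustrated with a minimal sketch. The trait names, gain and persistence formulas, and decay behaviour below are illustrative assumptions, not the paper's published model:

```python
# Minimal sketch: an emotional state as a layer between personality and
# emotion. Momentary emotion impulses are filtered through personality
# traits into a slower-moving state. The gain/persistence formulas are
# illustrative assumptions.

class EmotionalState:
    """Scalar valence in [-1, 1], pushed by emotion impulses, shaped by traits."""

    def __init__(self, extraversion=0.5, stability=0.5):
        # Extraverted characters react more strongly to each emotion impulse.
        self.gain = 0.4 + 0.6 * extraversion
        # Emotionally stable characters return to a neutral state faster.
        self.persistence = 0.9 - 0.4 * stability
        self.valence = 0.0

    def feel(self, impulse):
        """Apply one emotion impulse (e.g. joy = +1.0, anger = -1.0)."""
        self.valence = self.persistence * self.valence + self.gain * impulse
        self.valence = max(-1.0, min(1.0, self.valence))  # clamp to [-1, 1]
        return self.valence


# A stable and an unstable character react to the same joyful event,
# then experience five neutral time steps.
calm = EmotionalState(stability=0.9)
moody = EmotionalState(stability=0.1)
calm.feel(1.0)
moody.feel(1.0)
for _ in range(5):
    calm.feel(0.0)
    moody.feel(0.0)
```

Under these assumed parameters, both characters' states move toward joy after the impulse, but the stable character's state decays back toward neutral much faster than the unstable one's.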
We investigate the effect that stylized facial expressions have on the perception and categorization of emotions by participants with high-functioning Autism Spectrum Disorder (ASD) in contrast to two control samples: one with Attention-Deficit/Hyperactivity Disorder (ADHD), and one with neurotypically developed peers (NTD). Realtime Non-Photorealistic…
This paper presents the current advances in "The Muses of Poetry", an ongoing project that combines interaction, emotions and poetry. The goal of the project is to create an interactive installation in which a virtual character not only recites poetry but also manifests the affective content of the poem through facial expressions and voice rhythm. The…
In this paper we utilize depth information to extend a line drawing algorithm, improving depth perception and object differentiation in large, spatially complex scenes. We consider features at different scales and apply a flow-based morphological filter to the scenes. Based on this, two line drawing styles are defined. The proposed algorithm works in…
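The core idea of using depth to place lines can be sketched minimally: depth discontinuities (object silhouettes) become strokes, which separates overlapping objects even where their shading is similar. The simple gradient test and threshold below are illustrative choices; the paper's multi-scale features and flow-based morphological filtering are not reproduced here:

```python
import numpy as np

# Minimal sketch: extract line-drawing strokes from a depth map by
# marking strong depth discontinuities. The forward-difference gradient
# and the fixed threshold are illustrative assumptions.

def depth_edges(depth, threshold=0.5):
    """Return a boolean mask marking strong depth discontinuities."""
    gy, gx = np.gradient(depth.astype(float))   # per-axis depth gradients
    magnitude = np.hypot(gx, gy)                # gradient magnitude
    return magnitude > threshold

# Synthetic scene: a near object (depth 1.0) in front of a far wall (depth 5.0).
depth = np.full((8, 8), 5.0)
depth[2:6, 2:6] = 1.0
edges = depth_edges(depth)  # True along the object's silhouette only
```

Because the mask responds to depth, not intensity, the silhouette line appears even if object and background share the same color — the property that image-space edge detectors lack.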
The aim of this research is to explore the influence of static visual cues on the perception of a character's personality traits: extraversion, agreeableness and emotional stability. To measure how users perceived personality, we conducted a web-based study with 133 subjects who rated 54 images of a virtual character with varying head orientations and gaze.
This paper presents a method for the representation of mood in FACS-based facial expressions. To achieve this, FACS Action Units (AUs) are mapped into the Pleasure-Arousal-Dominance (PAD) space, which serves as our mood model. From this mapping, a set of rules is derived that computes the activation areas and intensities of each AU in…
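The mood-to-expression direction of such a mapping can be sketched minimally: each AU gets an anchor point in PAD space, and its intensity falls off with distance from the current mood. The anchor coordinates, the linear falloff, and the activation radius below are illustrative assumptions, not the paper's published rules:

```python
# Minimal sketch: drive FACS Action Unit (AU) intensities from a
# Pleasure-Arousal-Dominance (PAD) mood point. AU anchor coordinates
# and the distance-based falloff are illustrative assumptions.

# Hypothetical PAD anchors per AU: (pleasure, arousal, dominance).
AU_ANCHORS = {
    "AU12_lip_corner_puller": (0.8, 0.4, 0.3),    # smiling: pleasant moods
    "AU4_brow_lowerer":       (-0.6, 0.3, 0.2),   # frowning: unpleasant moods
    "AU1_inner_brow_raiser":  (-0.4, 0.5, -0.4),  # fear/sadness region
}

def au_intensities(mood, anchors=AU_ANCHORS, radius=1.0):
    """Map a PAD mood point to AU intensities in [0, 1].

    Each AU is fully active at its anchor and fades linearly to zero
    at Euclidean distance `radius` (an assumed activation area).
    """
    p, a, d = mood
    out = {}
    for au, (ap, aa, ad) in anchors.items():
        dist = ((p - ap) ** 2 + (a - aa) ** 2 + (d - ad) ** 2) ** 0.5
        out[au] = max(0.0, 1.0 - dist / radius)
    return out

# A pleasant, mildly aroused mood mostly activates the smile AU.
intensities = au_intensities((0.7, 0.3, 0.2))
```

The activation area of an AU is then simply the ball of radius `radius` around its anchor, and intensity is the (linear) closeness of the mood to that anchor.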