In this article, we describe a new approach to enhance presence technologies. First we discuss the strong relationship between cognitive processes and emotions and how human physiology is uniquely affected when experiencing each emotion. Then we introduce our prototype Multimodal Affective User Interface. In the remainder of the paper we describe the …
Accounting for a patient's emotional state is integral to medical care. Telehealth research attests to the challenge clinicians must overcome in assessing a patient's emotional state when modalities are limited (Pettinari and Jessopp, 2001). The extra effort involved in addressing this challenge requires attention, skill, and time. Large caseloads may not …
The development of an autonomous social robot, Cherry, is occurring in tandem with studies gauging potential users' preferences, likes, dislikes, and perceptions of her features. Thus far, results have indicated that individuals (a) believe that service robots with emotion and personality capabilities would make them more acceptable in everyday roles in …
In this paper, we describe algorithms developed to analyze physiological signals associated with emotions, in order to recognize the affective states of users via non-invasive technologies. We propose a framework for modeling users' emotions from the sensory inputs and interpretations of our multi-modal system. We also describe examples of circumstances …
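A pipeline like the one this abstract describes, recognizing affective state from physiological signals, can be sketched as two stages: feature extraction over a signal window, then matching against labeled prototypes. The sketch below is purely illustrative; the signal names, features, emotion labels, and nearest-centroid matching are assumptions for demonstration, not the paper's actual algorithm.

```python
import math

def extract_features(samples):
    """Reduce a raw signal window (e.g., galvanic skin response) to summary features."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / n
    return {"mean": mean, "std": math.sqrt(var)}

def classify(features, centroids):
    """Match a feature vector to the nearest labeled emotion prototype (hypothetical)."""
    def dist(a, b):
        return math.sqrt(sum((a[k] - b[k]) ** 2 for k in a))
    return min(centroids, key=lambda label: dist(features, centroids[label]))

# Illustrative prototypes that would come from labeled training data.
centroids = {
    "frustration": {"mean": 0.8, "std": 0.3},
    "relaxation":  {"mean": 0.2, "std": 0.05},
}

gsr_window = [0.75, 0.82, 0.78, 0.9, 0.85]  # one made-up window of GSR samples
features = extract_features(gsr_window)
label = classify(features, centroids)
```

In a real multi-modal system, features from several sensors would be concatenated before classification, and the prototype matching would typically be replaced by a trained statistical classifier.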