• Corpus ID: 199150420

A systematic approach for the integration of emotional context in interactive systems

  • Tiago Henriques
In interactive systems, knowing the user’s emotional state is not only important to understand and improve overall user experience, but also of the utmost relevance in scenarios where such information might foster our ability to help users manage and express their emotions (e.g., anxiety), with a strong impact on their daily life and on how they interact with others. Nevertheless, although there is a clear potential for emotionally-aware applications, several challenges preclude their wider… 

Smile When You Read This, Whether You Like It or Not: Conceptual Challenges to Affect Detection

  • A. Kappas
  • Psychology
    IEEE Transactions on Affective Computing
  • 2010
The survey by Calvo and D'Mello presents a useful overview of the progress of and issues in affect detection. They focus on emotion theories that are relevant to Affective Computing (AC) and suggest…

"Tell Your Day": Developing Multimodal Interaction Applications for Children with ASD

This application aims to serve as a place for communication and information exchange among the child, her family, and teachers; it supports multimodal interaction and builds on previous work regarding the definition of a Persona for a child diagnosed with ASD.

Merging Technology and Emotions: Introduction to Affective Computing

  • Tara J Brigham
  • Computer Science
    Medical reference services quarterly
  • 2017
What affective computing is, some of its benefits, and concerns with its adoption are explained; an overview of its implications in the library setting is provided, along with selected examples of how and where it is currently being used.

Multimodal Interfaces: A Survey of Principles, Models and Frameworks

The chapter starts with the features and advantages associated with multimodal interaction, with a focus on particular findings and guidelines, as well as the cognitive foundations underlying multimodal interaction, and then turns to the driving theoretical principles, time-sensitive software architectures, and multimodal fusion and fission issues.

Implicit human computer interaction through context

  • A. Schmidt
  • Computer Science
    Personal Technologies
  • 2005
It is discussed how the availability of processing power and advanced sensing technology can enable a shift in HCI from explicit interaction, such as direct manipulation GUIs, towards a more implicit interaction based on situational context.

When do we interact multimodally?: cognitive load and multimodal communication patterns

The adaptations seen in this study reflect users' efforts to self-manage limitations on working memory as task complexity increases, accomplished by distributing communicative information across multiple modalities, an adaptation consistent with the cognitive load theory of multimodal interaction.

Designing Implicit Interfaces for Physiological Computing

A conceptual framework for considering implicit input from the brain is presented, along with design principles and patterns that apply broadly to other sensor data and in domains such as aviation, education, medicine, driving, and anything involving multitasking or varying cognitive workload.

On the Creation of a Persona to Support the Development of Technologies for Children with Autism Spectrum Disorder

It is argued that Personas (and in particular, families of Personas) can be a powerful tool to tackle these challenges and be considered in the design of a first application prototype for children with ASD.

Multi-Device Applications Using the Multimodal Architecture

It is argued that adopting a multimodal interaction architecture based on the W3C recommendations, beyond its advantages for the design and development of MMI, provides an elegant approach to tackling multi-device interaction scenarios.

CaptureMyEmotion: A mobile app to improve emotion learning for autistic children using sensors

A mobile app called CaptureMyEmotion is described that enables autistic children to take photos, videos or sounds, and at the same time senses their arousal level using a wireless sensor, and gives the carer a means to discuss the identification and expression of emotions.