Robert Neßelrath

We present two user interfaces: a multimodal dialogue system and a task-based calendar, which assist people with mild cognitive disabilities affecting their concentration and memory. A new middleware based upon a new open industrial standard, the ISO/IEC 24752 Universal Remote Console (URC), allows access to any network services or appliances as well as …
This paper introduces a system for gesture-based interaction with smart environments. The framework we present connects gesture recognition results with control commands for appliances in a smart home that are accessed through a middleware based on the ISO 24752 standard URC (Universal Remote Console). Gesture recognition is realized by applying three …
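To make the coupling between recognizer and middleware concrete, the sketch below shows how a recognized gesture label could be dispatched as an appliance command through a URC-style client. The class UrcClient, its send_command method, the gesture labels, and the target/variable names are hypothetical placeholders, not the paper's actual API.

```python
# Minimal sketch: map recognized gesture labels to appliance commands sent
# through a URC-style middleware client. All names here are illustrative
# assumptions, not the real interfaces of the described system.

class UrcClient:
    """Stand-in for a client of the URC (ISO 24752) middleware."""

    def send_command(self, target: str, variable: str, value: str) -> None:
        # A real client would set a variable on the target's user interface
        # socket; here we only log the request.
        print(f"URC -> {target}: {variable} = {value}")


# Hypothetical mapping from gesture labels to appliance commands.
GESTURE_COMMANDS = {
    "swipe_left":  ("livingroom.lamp", "power", "off"),
    "swipe_right": ("livingroom.lamp", "power", "on"),
    "circle":      ("kitchen.blinds",  "level", "50"),
}


def on_gesture_recognized(label: str, client: UrcClient) -> None:
    """Dispatch a recognized gesture to the corresponding appliance command."""
    if label in GESTURE_COMMANDS:
        target, variable, value = GESTURE_COMMANDS[label]
        client.send_command(target, variable, value)


if __name__ == "__main__":
    on_gesture_recognized("swipe_right", UrcClient())
```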
In this paper we present a new approach towards assistance and awareness systems that exploits the duality of automated homes: the status of devices and appliances in the real world on the one hand, and their digital representation on the other. If the digital representation is embedded in a virtual world, i.e., a 3-D model that reflects the actual …
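As an illustration of this dual-reality idea, the following sketch mirrors real-world device events into digital counterparts that a 3-D home model could visualize. The classes VirtualDevice and DualRealityMirror and their attributes are illustrative assumptions rather than the system's real interfaces.

```python
# Minimal sketch: keep digital twins of appliances in sync with events from
# the real home, so a virtual 3-D model can reflect the actual situation.

from dataclasses import dataclass, field


@dataclass
class VirtualDevice:
    """Digital twin of an appliance inside the 3-D home model (hypothetical)."""
    name: str
    state: dict = field(default_factory=dict)

    def update(self, key: str, value) -> None:
        self.state[key] = value
        print(f"[virtual world] {self.name}.{key} set to {value}")


class DualRealityMirror:
    """Forwards real-world device events to their virtual counterparts."""

    def __init__(self):
        self.devices = {}  # device name -> VirtualDevice

    def register(self, device: VirtualDevice) -> None:
        self.devices[device.name] = device

    def on_real_world_event(self, device_name: str, key: str, value) -> None:
        # Called by the home instrumentation whenever a real device changes.
        if device_name in self.devices:
            self.devices[device_name].update(key, value)


if __name__ == "__main__":
    mirror = DualRealityMirror()
    mirror.register(VirtualDevice("kitchen.stove"))
    mirror.on_real_world_event("kitchen.stove", "power", "on")
```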
We aim to enhance the quality of life of elderly people and people with mild cognitive disabilities through technology. This includes increasing their autonomy, enhancing their security, and preventing isolation by keeping them socially connected. One way to do so is to enable them to live in their familiar surroundings for as long as possible. We present the successful …
This paper describes the design and the design process of orthographic feedback in a computer-assisted vocabulary learning (CAVL) application targeted at blind language learners. It discusses current research findings on vocabulary and spelling acquisition, as well as the special needs of blind computer users. CAVL applications often assume the user's …
In this paper, we describe a mobile Business-to-Business (B2B) interaction system. The mobile device supports users in accessing a service platform. A multimodal dialogue system allows a business expert to intuitively search and browse for services in a real-world production pipeline. We implemented a distributed client-server dialogue application for …
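A minimal sketch of the client side of such a setup is given below: a search query is sent to a service platform and the matching services are listed. The endpoint URL, port, and JSON fields are assumptions made for illustration, not the platform's actual interface.

```python
# Minimal sketch: query a (hypothetical) service platform for matching
# services from a client application.

import json
from urllib.parse import urlencode
from urllib.request import urlopen


def search_services(query: str,
                    base_url: str = "http://localhost:8080/services") -> list:
    """Send a search query to the service platform and return the hits."""
    url = f"{base_url}?{urlencode({'q': query})}"
    with urlopen(url) as response:
        return json.loads(response.read().decode("utf-8"))


if __name__ == "__main__":
    # Assumes a platform instance is reachable at the URL above.
    for service in search_services("quality inspection"):
        print(service.get("name"), "-", service.get("description", ""))
```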
This paper introduces a multimodal dialogue system that facilitates access to the digital home for people with cognitive disabilities. The user interface is implemented on a smartphone and allows interaction via speech and pointing gestures. A consistent control concept and a meaningful graphical design allow intuitive handling of the …
We present a framework for integrating dynamic gestures as a new input modality into arbitrary applications. The framework allows training new gestures and recognizing them as user input with the help of machine learning algorithms. The precision of the gesture recognition is evaluated with special attention to the elderly. We show how this functionality is …
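The sketch below illustrates such a train-then-recognize workflow on accelerometer data with an off-the-shelf classifier. The feature extraction and the choice of a k-nearest-neighbour model are assumptions for illustration; the framework's actual algorithms and features may differ.

```python
# Minimal sketch: train a gesture classifier on accelerometer recordings and
# recognize a new gesture. Features and model choice are illustrative only.

import numpy as np
from sklearn.neighbors import KNeighborsClassifier


def extract_features(samples: np.ndarray) -> np.ndarray:
    """Reduce a (time x 3-axis) accelerometer recording to a fixed-size vector."""
    return np.concatenate([samples.mean(axis=0), samples.std(axis=0)])


# Hypothetical training data: a few recordings per gesture label.
recordings = {
    "circle": [np.random.randn(50, 3) for _ in range(5)],
    "swipe":  [np.random.randn(50, 3) + 1.0 for _ in range(5)],
}

X = np.array([extract_features(r)
              for label in recordings for r in recordings[label]])
y = [label for label in recordings for _ in recordings[label]]

model = KNeighborsClassifier(n_neighbors=3)
model.fit(X, y)

# Recognize a new, unseen recording.
new_recording = np.random.randn(50, 3)
print("recognized gesture:", model.predict([extract_features(new_recording)])[0])
```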
We present our planned efforts within the SensHome project in building a corpus of daily activities in smart environments. The recordings consist of measurable events provided by the instrumentation, along with video and audio recordings. SensHome foresees a three-step development in which the instrumentation is verified in a dual-reality setting, followed …
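As a rough sketch of how such recordings could be organized, the snippet below appends timestamped instrumentation events to a per-session log that can later be aligned with the audio and video streams. The event schema and file layout are assumptions, not the SensHome project's actual format.

```python
# Minimal sketch: log timestamped sensor events per recording session so they
# can later be aligned with audio/video streams. Schema is hypothetical.

import json
import time
from pathlib import Path


def log_event(session_dir: Path, sensor: str, value,
              source: str = "instrumentation") -> None:
    """Append one sensor event as a JSON line to the session's event log."""
    event = {
        "timestamp": time.time(),  # seconds since epoch, for A/V alignment
        "sensor": sensor,
        "value": value,
        "source": source,
    }
    session_dir.mkdir(parents=True, exist_ok=True)
    with open(session_dir / "events.jsonl", "a", encoding="utf-8") as f:
        f.write(json.dumps(event) + "\n")


if __name__ == "__main__":
    session = Path("corpus/session_001")
    log_event(session, "kitchen.door_contact", "open")
    log_event(session, "livingroom.motion", True)
```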