René Tünnermann

Interfaces supporting bi-manual interaction offer great benefits. In recent years, a variety of multi-touch systems have opened up new possibilities for multi-finger input. However, multi-finger interactions do not always show better performance. We propose an interface consisting of a large number of minimal tangible objects called tangible grains […]
Touch can create a feeling of intimacy and connectedness. In this work we propose feelabuzz, a system to transmit the movements of one mobile phone to the vibration actuator of another. This is done in a direct, non-abstract way, without the use of pattern recognition techniques, so as not to destroy the feel for the other person. This means that the tactile […]
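The direct, non-abstract mapping described in the abstract could look roughly like the following sketch. All names and the scaling constant are illustrative assumptions, not the actual feelabuzz implementation: the change in acceleration between two sensor samples is mapped continuously to a vibration amplitude, with no classification or pattern recognition in between.

```python
def movement_to_vibration(prev, cur, gain=2.0):
    """Map the change between two 3-axis accelerometer samples
    directly to a vibration amplitude in [0.0, 1.0].

    No pattern recognition -- just a continuous transfer function,
    so every nuance of the movement reaches the other device.
    `gain` is a hypothetical scaling constant.
    """
    # Magnitude of the change in the acceleration vector.
    delta = sum((c - p) ** 2 for p, c in zip(prev, cur)) ** 0.5
    # Clamp the scaled magnitude into the actuator's amplitude range.
    return max(0.0, min(1.0, gain * delta))
```

A small movement (0.3 m/s² change on one axis) would then yield a proportionally small amplitude, while any large jolt saturates at full vibration strength.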
Although most of us strive for sustainable, less resource-intensive behavior, this is unfortunately a difficult task, because we are often unaware of relevant information, or our focus of attention lies elsewhere. Based on this observation, we present a new approach for an unobtrusive and affective ambient auditory information display to become […]
In order to support drivers in adopting a more fuel-efficient driving style, a range of fuel economy displays currently exists, providing drivers with feedback on instantaneous and long-term fuel consumption. While these displays rely almost completely on visual components for conveying relevant information, we argue that there are significant benefits in […]
Our living and work spaces are becoming ever more enriched with all kinds of electronic devices. Many of these are too small to offer any means of controlling or monitoring them. Ambient intelligence integrates many such devices into what are called smart environments, forming a network of interwoven sensors, data displays and everyday devices. We […]
In this work we present a method to intuitively issue control over devices in smart environments, to display data that smart objects and sensors provide, and to create and manipulate flows of information in smart environments. This makes it easy to customize smart environments by linking arbitrary data sources to various display modalities on the fly. […]
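Linking arbitrary data sources to display modalities at runtime can be sketched as a small routing table, in the spirit of the abstract above. The class and method names here are our own illustration, not the paper's API: sources publish values, and any sink (a lamp, a sonification, a screen) can be linked or unlinked on the fly.

```python
class FlowRouter:
    """Minimal sketch of runtime routing between data sources and
    display modalities (names are illustrative, not the paper's API)."""

    def __init__(self):
        # Maps a source name to the sinks currently linked to it.
        self.links = {}

    def link(self, source, sink):
        """Connect a display sink to a named data source."""
        self.links.setdefault(source, []).append(sink)

    def unlink(self, source, sink):
        """Remove a previously created link."""
        self.links.get(source, []).remove(sink)

    def publish(self, source, value):
        """Forward a new sensor value to every linked sink."""
        for sink in self.links.get(source, []):
            sink(value)
```

For example, a temperature sensor could be linked to an ambient lamp, then re-linked to an auditory display later, without touching either device's code.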
This paper presents and evaluates interactive sonifications to support periphery sensing and joint attention in situations with a limited field of view. Head-mounted AR displays in particular limit the field of view and thus cause users to miss relevant activities of their interaction partner, such as object interactions or deictic references that normally […]
This paper presents novel interaction modes for Model-Based Sonification (MBS) via interactive surfaces. We first discuss possible interactions for MBS on a multi-touch surface. This is followed by a description of the Data Sonogram Sonification Model and the Growing Neural Gas Sonification Model and their implementation for the multi-touch interface. […]
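In a Data Sonogram, a virtual shock wave expands from the point of excitation (here, a touch point) through the data space, and each data point contributes a sound event when the wave front reaches it. A minimal sketch of the onset scheduling, assuming Euclidean distance and a fixed wave speed (parameter names are ours, not the paper's):

```python
import math

def data_sonogram_onsets(points, excitation, speed=1.0):
    """Return (onset_time, point) pairs for a Data Sonogram:
    one sound event per data point, with onset time equal to the
    point's Euclidean distance from the excitation centre divided
    by the speed of the virtual spherical shock wave."""

    def dist(p):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, excitation)))

    # Nearer points sound first; sorting yields the playback schedule.
    return sorted((dist(p) / speed, p) for p in points)
```

On a multi-touch surface, each finger tap would supply a new `excitation` point, so the same data set can be probed acoustically from different positions.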