René Tünnermann

Touch can create a feeling of intimacy and connectedness. In this work we propose feelabuzz, a system that transmits the movements of one mobile phone to the vibration actuator of another. This is done in a direct, non-abstract way, without pattern recognition techniques, so as not to destroy the feel for the other person. This means that the tactile …
Our living and work spaces are becoming ever more enriched with all kinds of electronic devices. Many of these are too small to allow direct control or monitoring. Ambient intelligence integrates many such devices into what are called smart environments, forming a network of interwoven sensors, data displays and everyday devices. We …
In this work we present a method to intuitively control devices in smart environments, to display data provided by smart objects and sensors, and to create and manipulate flows of information in smart environments. This makes it easy to customize smart environments by linking arbitrary data sources to various display modalities on the fly.
In this paper, we present our work towards an auditory display capable of supporting fuel-efficient vehicle operation. We introduce five design approaches for employing the auditory modality in a fuel economy display. Furthermore, we have implemented a novel auditory display based on one of these approaches, focusing on giving feedback on the …
Interfaces supporting bimanual interaction offer great benefits. In recent years, a variety of multi-touch systems have opened up new possibilities for multi-finger input. However, multi-finger interactions do not always perform better. We propose an interface consisting of a large number of minimal tangible objects called tangible grains …
Touch can convey emotions on a very direct level. We propose feelabuzz, a system implementing a remote touch connection using standard mobile phone hardware. Accelerometer data is mapped to vibration strength on two smartphones connected via the Internet. This is done using direct mapping techniques, without any abstraction of the acceleration signal. By …
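The direct mapping described in this abstract can be sketched in a few lines: take one accelerometer sample, subtract the constant gravity component, and scale the remaining movement magnitude into the actuator's normalized drive range. This is only an illustrative sketch; the function name, the scaling constant, and the choice of magnitude-based mapping are assumptions, not the published feelabuzz implementation.

```python
import math

GRAVITY = 9.81  # m/s^2, Earth's gravitational acceleration

def accel_to_vibration(ax, ay, az, max_accel=2.0 * GRAVITY):
    """Directly map one accelerometer sample (m/s^2) to a vibration
    strength in [0, 1], with no pattern recognition in between --
    the kind of continuous mapping the abstract describes.
    Name and max_accel scaling constant are illustrative assumptions."""
    magnitude = math.sqrt(ax * ax + ay * ay + az * az)
    # Subtract the constant gravity component so a phone lying still
    # produces no vibration on the remote device.
    movement = abs(magnitude - GRAVITY)
    # Clamp to the actuator's normalized [0, 1] drive range.
    return min(1.0, movement / max_accel)
```

A resting phone (reading roughly 1 g) maps to zero strength, while vigorous shaking saturates the actuator; streaming each sample's result over a socket to the partner phone would complete the remote-touch loop.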
This paper presents novel interaction modes for Model-Based Sonification (MBS) via interactive surfaces. We first discuss possible interactions for MBS on a multi-touch surface. This is followed by a description of the Data Sonogram Sonification and the Growing Neural Gas Sonification Model and their implementation for the multi-touch interface.