Tony Poitschke
This contribution presents our approach for an instrumented automatic gesture recognition system for use in Augmented Reality, which is able to differentiate static and dynamic gestures. Based on an infrared tracking system, infrared targets mounted on the user's thumbs and index fingers are used to retrieve information about the position and orientation of …
To minimize the mental workload for the driver and to keep the increasing amount of information easily accessible, sophisticated display and interaction techniques are essential. This contribution focuses on a user-centered analysis for an authoritative grading of head-up displays (HUDs) in cars. Two studies delivered the evaluation data. In a field test, …
Most video-based eye trackers require a calibration procedure before measurement onset. In this work a stereo approach is presented that yields the position and orientation of the pupil in 3D space. This is achieved by analyzing the pupil images of two calibrated cameras and by a subsequent closed-form stereo reconstruction of the original pupil surface. …
Besides the reduction of energy consumption, which implies alternative actuation and lightweight construction, the main research domain in automobile development in the near future will be dominated by driver assistance and natural driver-car communication. The ability of a car to understand natural speech and provide a human-like driver assistance system can be expected to …
The analysis of cognitive processes during human-machine and human-human interaction requires various tracking technologies. The human gaze is a very important cue for gathering information about the user's intentions, current mental state, etc. To obtain these data, a framework consisting of a highly accurate head-mounted gaze tracker combined with a low …
This contribution presents an approach for representing contact-analog information in an automotive Head-Up Display (HUD). To this end, we first introduce our approach for the calibration of the optical system consisting of the virtual image plane of the HUD and the driver's eyes. Afterward, we present the eye-tracking system used for adaptation of …
Touchscreens are becoming the preferred input device in a growing number of applications and are increasingly being introduced into the automotive domain. Current implementations pose problems such as the need for precise pointing and high visual attention; therefore, the capabilities of projected capacitive touchscreens are investigated. …
This paper presents a multimodal interaction system for automotive environments that uses the driver's eyes as the main input device. To this end, an unobtrusive and contactless sensor analyzes the driver's eye gaze, which enables the development of gaze-driven interaction concepts for operating driver assistance and infotainment systems. The following sections …
Within the area of advanced man-machine interaction, speech communication has played a major role for several decades. The idea of replacing conventional input devices such as buttons and keyboards with voice control, and thus considerably increasing comfort and input speed, is so attractive that even the rather slow progress of …
To reduce the driver's workload caused by the increasing amount of information and functions, intelligent agents are a promising means of filtering the immense data sets. The intentions of the driver can be analyzed and tasks can be accomplished autonomously, i.e., without user intervention. In this contribution, different adaptive agents …