A prototype of a gaze-controlled, head-mounted camera (EyeSeeCam) was developed that provides the functionality for fundamental studies of human gaze behavior even under dynamic conditions such as locomotion. EyeSeeCam combines active visual exploration by saccades with image stabilization during head, object, and surround motion, just as occurs in human…
We have developed a low-latency combined eye and head tracker suitable for teleoperating a remote robotic head in real time. Eye and head movements of a human operator (wizard) are tracked and replicated by the robot with a latency of 16.5 ms. The tracking is achieved by three fully synchronized cameras attached to a head mount. One forward-looking, wide-angle…
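The teleoperation pipeline itself is not spelled out in this summary; purely as an illustration, a minimal relay loop of the kind such a wizard-of-oz setup needs could look like the sketch below, in which each tracker sample (head pan/tilt plus gaze angles) is forwarded immediately to the robot head over UDP. The addresses, ports, and packet layout are assumptions, not details from the paper.

```python
# Illustrative sketch (not the authors' implementation) of a low-latency relay:
# every tracker sample is sent to the robot head as soon as it arrives, so the
# only added delay is one packet hop on the local network.
import socket
import struct
import time

ROBOT_ADDR = ("192.168.0.42", 9000)        # hypothetical robot-head endpoint
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def forward_sample(head_pan, head_tilt, gaze_yaw, gaze_pitch):
    """Pack one tracker sample (angles in degrees) and send it without buffering."""
    payload = struct.pack("<d4f", time.time(), head_pan, head_tilt, gaze_yaw, gaze_pitch)
    sock.sendto(payload, ROBOT_ADDR)

# In a real system this would be called from the tracker's frame callback at a
# few hundred Hz, with the robot interpolating between incoming samples.
forward_sample(5.0, -2.0, 12.5, 3.0)
```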
The analysis of cognitive processes during human-machine and human-human interaction requires various tracking technologies. Human gaze is a very important cue for gathering information about the user's intentions, current mental state, and so on. To obtain these data, a framework consisting of a highly accurate head-mounted gaze tracker combined with a low…
Most video-based eye trackers require a calibration procedure before measurement onset. In this work, a stereo approach is presented that yields the position and orientation of the pupil in 3D space. This is achieved by analyzing the pupil images of two calibrated cameras and by a subsequent closed-form stereo reconstruction of the original pupil surface…
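As a rough illustration of the stereo idea (not the paper's exact method, which reconstructs the full pupil surface rather than a single point), the sketch below triangulates a pupil centre in 3D from two calibrated cameras using linear (DLT) triangulation; the projection matrices and the matched pixel coordinates are assumed inputs.

```python
# Minimal sketch: triangulate the pupil centre from two calibrated views.
import numpy as np

def triangulate_point(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.

    P1, P2 : 3x4 camera projection matrices (intrinsics @ [R|t]).
    x1, x2 : (u, v) pixel coordinates of the pupil centre in each image.
    Returns the 3D point in the common world frame.
    """
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]                                 # dehomogenise

# Example with two hypothetical eye cameras 6 cm apart looking along +Z:
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.06], [0], [0]])])
X_true = np.array([0.01, 0.02, 0.35])                   # pupil centre 35 cm away
project = lambda P, X: (P @ np.append(X, 1))[:2] / (P @ np.append(X, 1))[2]
print(triangulate_point(P1, P2, project(P1, X_true), project(P2, X_true)))
```

Because the reconstruction is closed-form, no per-subject calibration routine is needed before measurement, which is the motivation stated in the abstract.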
Head impulses are a routine clinical test of semicircular canal function. At the bedside, they are used to detect malfunctioning of the horizontal semicircular canals. So far, 3-D search-coil recording has been required to reliably test anterior and posterior canal function and to determine the gain of the vestibulo-ocular reflex (VOR). Search-coil recording…
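For context, the VOR gain mentioned here is essentially the ratio of compensatory eye velocity to head velocity during the impulse. The sketch below shows one hedged way such a gain could be estimated from recorded velocity traces; the regression-through-the-origin definition and the variable names are illustrative assumptions, not the clinical procedure described in the paper.

```python
# Hedged sketch of a VOR gain estimate from head and eye angular-velocity traces.
import numpy as np

def vor_gain(head_vel, eye_vel, threshold=50.0):
    """Estimate VOR gain (deg/s traces) over the head-impulse window.

    The gain is the slope of compensatory (sign-inverted) eye velocity versus
    head velocity, evaluated only where the head moves faster than `threshold`.
    A perfectly compensatory VOR yields 1.0.
    """
    head_vel = np.asarray(head_vel, dtype=float)
    eye_vel = np.asarray(eye_vel, dtype=float)
    mask = np.abs(head_vel) > threshold
    if not mask.any():
        raise ValueError("no samples above the head-velocity threshold")
    h, e = head_vel[mask], eye_vel[mask]
    # Least-squares slope through the origin: gain = sum(h * (-e)) / sum(h * h)
    return float(np.dot(h, -e) / np.dot(h, h))

# Synthetic impulse: 200 deg/s peak head velocity, eye compensating with gain 0.95
t = np.linspace(0, 0.3, 300)
head = 200 * np.exp(-((t - 0.15) / 0.05) ** 2)
eye = -0.95 * head
print(round(vor_gain(head, eye), 3))   # -> 0.95
```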
This contribution presents an approach for representing contact-analog information in an automotive Head-Up Display (HUD). To this end, we first introduce our approach for calibrating the optical system consisting of the HUD's virtual image plane and the driver's eyes. Afterward, we present the eye-tracking system used for adaptation of…
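The geometry behind "contact-analog" rendering can be summarized as follows: a symbol appears attached to the scene if it is drawn where the driver's line of sight to the referenced world point pierces the HUD's virtual image plane. The sketch below illustrates that intersection, assuming the plane parameters and the eye position come from the calibration and eye tracking described above; all names and numbers are illustrative.

```python
# Minimal geometric sketch: intersect the eye->world ray with the HUD's
# virtual image plane to find where a contact-analog symbol should be drawn.
import numpy as np

def contact_analog_point(eye, world_point, plane_point, plane_normal):
    """Return the 3D drawing position on the virtual image plane.

    eye, world_point : positions in a common vehicle coordinate frame.
    plane_point, plane_normal : any point on the virtual image plane and its normal.
    """
    eye = np.asarray(eye, float)
    d = np.asarray(world_point, float) - eye            # line-of-sight direction
    n = np.asarray(plane_normal, float)
    denom = np.dot(n, d)
    if abs(denom) < 1e-9:
        raise ValueError("line of sight is parallel to the image plane")
    s = np.dot(n, np.asarray(plane_point, float) - eye) / denom
    return eye + s * d

# Example: eye at the origin, virtual image plane 2.5 m ahead, lane marking 20 m ahead
print(contact_analog_point(eye=[0, 0, 0],
                           world_point=[1.5, -1.2, 20.0],
                           plane_point=[0, 0, 2.5],
                           plane_normal=[0, 0, 1]))
```

Because the intersection depends on the eye position, the drawing position must be updated continuously from the eye tracker as the driver's head moves, which is why the calibration and tracking components are coupled in this approach.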
BACKGROUND People with color vision deficiencies report numerous limitations in daily life, restricting, for example, their access to some professions. However, they use basic color terms systematically and in a manner similar to people with normal color vision. We hypothesize that a possible explanation for this discrepancy between color perception and…
Visual search can be accelerated when properties of the target are known. Such knowledge allows the searcher to direct attention to items sharing these properties. Recent work indicates that information about properties of non-targets (i.e., negative cues) can also guide search. In the present study, we examine whether negative cues lead to different search…