For us humans, walking is our most natural way of moving through the world. One of the major challenges in present research on navigation in virtual reality is to enable users to physically walk through virtual environments. Although treadmills, in principle, allow users to walk for extended periods of time through large virtual environments, existing …
In two experiments we investigated the effects of voluntary movements on temporal haptic perception. Measures of sensitivity (JND) and temporal alignment (PSS) were obtained from temporal order judgments made on intermodal auditory-haptic (Experiment 1) or intramodal haptic (Experiment 2) stimulus pairs under three movement conditions. In the baseline, …
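The JND and PSS mentioned above are conventionally read off a psychometric function fitted to temporal order judgments: the PSS is the stimulus onset asynchrony (SOA) at which both orders are reported equally often, and the JND reflects the slope of the curve. The sketch below illustrates this with synthetic data and a cumulative-Gaussian fit; the SOA values and parameters are hypothetical, not the study's.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import curve_fit

# Hypothetical SOAs in ms (negative = haptic first, positive = auditory first).
soas = np.array([-120, -80, -40, 0, 40, 80, 120], dtype=float)

# Synthetic "auditory first" response proportions generated from a
# cumulative Gaussian with PSS = 10 ms and SD = 45 ms (illustrative values).
true_pss, true_sd = 10.0, 45.0
p_auditory_first = norm.cdf(soas, loc=true_pss, scale=true_sd)

def psychometric(x, pss, sd):
    """Cumulative-Gaussian psychometric function."""
    return norm.cdf(x, loc=pss, scale=sd)

# Fit the curve; PSS is its 50% point, and the SD (the 50%-to-84% distance)
# serves as one common operationalization of the JND.
(pss, sd), _ = curve_fit(psychometric, soas, p_auditory_first, p0=[0.0, 50.0])
print(f"PSS = {pss:.1f} ms, JND = {sd:.1f} ms")
```

Because the synthetic proportions are noiseless, the fit recovers the generating parameters almost exactly; with real response data, confidence intervals on both estimates would be needed.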
Despite many recent developments in virtual reality, an effective locomotion interface which allows for normal walking through large virtual environments was until recently still lacking. Here, we describe the new CyberWalk omnidirectional treadmill system, which makes it possible for users to walk endlessly in any direction, while never leaving the …
Exposure to synchronous but spatially discordant auditory and visual inputs produces, beyond immediate cross-modal biases, adaptive recalibrations of the respective localization processes that manifest themselves in aftereffects. Such recalibrations probably play an important role in maintaining the coherence of spatial representations across the various …
Walking along a curved path requires coordinated motor actions of the entire body. Here, we investigate the relationship between head and trunk movements during walking. Previous studies have found that the head systematically turns into an upcoming curve before the trunk does. This has been found to occur at a constant distance rather than at a constant time before a …
Exposing different sense modalities (like sight, hearing or touch) to repeated simultaneous but spatially discordant stimulations generally causes recalibration of localization processes in one or both of the involved modalities, which is manifested through aftereffects. These provide opportunities for determining the extent of the changes induced by the …
Spatial updating during self-motion typically involves the appropriate integration of both visual and non-visual cues, including vestibular and proprioceptive information. Here, we investigated how human observers combine these two non-visual cues during full-stride curvilinear walking. To obtain a continuous, real-time estimate of perceived position, …
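A standard model for the cue combination described above is maximum-likelihood integration, in which each cue's estimate is weighted by its reliability (inverse variance). The abstract does not say which model the study used, so the sketch below is only a generic illustration with made-up vestibular and proprioceptive numbers.

```python
# Maximum-likelihood (inverse-variance weighted) cue combination.
# All numeric values below are illustrative, not data from the study.
def combine_cues(est_a, var_a, est_b, var_b):
    """Combine two cue estimates, weighting each by its inverse variance."""
    w_a = (1 / var_a) / (1 / var_a + 1 / var_b)
    w_b = 1 - w_a
    combined = w_a * est_a + w_b * est_b
    combined_var = 1 / (1 / var_a + 1 / var_b)  # always below either input
    return combined, combined_var

# Hypothetical perceived-position estimates (deg of path angle):
# a noisy vestibular cue and a more reliable proprioceptive cue.
pos, var = combine_cues(30.0, 16.0, 36.0, 4.0)
print(f"combined estimate: {pos:.1f} deg, variance: {var:.1f}")  # 34.8 deg, 3.2
```

Note that the combined estimate is pulled toward the more reliable cue and that its variance is lower than either cue alone, which is the signature prediction of this model.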
We examined how visual recalibration of apparent sound location obtained at a particular location generalizes to untrained locations. Participants pointed toward the origin of tone bursts scattered along the azimuth, before and after repeated exposure to bursts in one particular location, synchronized with point flashes of light a constant distance to their …
We determined velocity discrimination thresholds and Weber fractions for sounds revolving around the listener at very high velocities. Sounds used were a broadband white noise and two harmonic sounds with fundamental frequencies of 330 Hz and 1760 Hz. Experiment 1 used velocities ranging between 288°/s and 720°/s in an acoustically treated room and …
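The Weber fraction reported in such experiments is simply the discrimination threshold expressed relative to the reference value, Δv/v. A minimal sketch, using a hypothetical threshold at one of the reference velocities from the abstract's range (the threshold value itself is invented for illustration):

```python
def weber_fraction(threshold, reference):
    """Weber fraction: just-noticeable change divided by the reference value."""
    return threshold / reference

# Hypothetical: a listener needs an ~86 deg/s velocity change to reliably
# discriminate a difference from a 480 deg/s revolving sound.
wf = weber_fraction(86.0, 480.0)
print(f"Weber fraction: {wf:.3f}")  # 0.179
```

A roughly constant Weber fraction across reference velocities would indicate that discrimination scales proportionally with velocity (Weber's law).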
Some brain-damaged patients (prosopagnosics) have difficulty recognizing a face, yet can still recognize its expression. The dissociation between these two face-related skills has served as a keystone of models of face processing. We now report that the presence of a facial expression can influence face identification. For normal viewers, the …