When a person moves through the world, the associated visual displacement of the environment in the opposite direction is not usually seen as external movement but rather as a changing view of a stable world. We measured the amount of visual motion that can be tolerated as compatible with the perception of moving within a stable world during active, …
Virtual reality displays introduce spatial distortions that are very hard to correct because of the difficulty of precisely modelling the camera from the nodal point of each eye. How significant are these distortions for spatial perception in virtual reality? In this study we used a helmet-mounted display and a mechanical head tracker to investigate the …
When people move, there are many visual and non-visual cues that can inform them about their movement. Simulating self-motion in a virtual-reality environment thus needs to take these non-visual cues into account in addition to the normal high-quality visual display. Here we examine the contribution of visual and non-visual cues to our perception of …
The perceived time of occurrence of a visual stimulus may be shifted towards the onset of an auditory stimulus occurring a short time later. The effect has been attributed to auditory-visual temporal integration although an unknown portion of the shift may be explained by the different processing times of visual and auditory stimuli. Here, perceived onset …
We investigated the effect of auditory-visual sensory integration on visual tasks that were predominantly dependent on parvocellular processing. These tasks were (i) detecting metacontrast-masked targets and (ii) discriminating orientation differences between high spatial frequency Gabor patch stimuli. Sounds that contained no information relevant to either …
We measured how much the visual world could be moved during various head rotations and translations and still be perceived as visually stable. Using this as a monitor of how well subjects know about their own movement, we compared performance in different directions relative to gravity. For head rotations, we compared the range of visual motion judged …
That crossmodal interaction can enhance sensory processing is now widely accepted. Such benefit is often exemplified by neural response amplification reported in physiological studies conducted with animals, which parallels behavioural demonstrations of sound-driven improvement in visual tasks in humans. Yet, a good deal of controversy still …
Although it has been previously reported that audiovisual integration can modulate performance on some visual tasks, multisensory interactions have not been explicitly assessed in the context of different visual processing pathways. In the present study, we test auditory influences on visual processing employing a psychophysical paradigm that reveals …
We measured the amount of visual movement judged consistent with translational head movement under normal and microgravity conditions. Subjects wore a virtual reality helmet in which the ratio of the movement of the world to the movement of the head (visual gain) was variable. Using the method of adjustment under normal gravity 10 subjects adjusted the …
The perceived distance between objects has been found to decrease over time in memory, demonstrating a partial failure of space constancy. Such mislocalization has been attributed to a generalized compression effect in memory. We confirmed this drift with a pair of remembered dot positions but did not find a compression of perceived distance when the space …