When a person moves through the world, the associated visual displacement of the environment in the opposite direction is not usually seen as external movement but rather as a changing view of a stable world. We measured the amount of visual motion that can be tolerated as compatible with the perception of moving within a stable world during active, …
Virtual reality displays introduce spatial distortions that are very hard to correct because of the difficulty of precisely modelling the camera from the nodal point of each eye. How significant are these distortions for spatial perception in virtual reality? In this study we used a helmet-mounted display and a mechanical head tracker to investigate the …
The perceived time of occurrence of a visual stimulus may be shifted towards the onset of an auditory stimulus occurring a short time later. The effect has been attributed to auditory-visual temporal integration, although an unknown portion of the shift may be explained by the different processing times of visual and auditory stimuli. Here, perceived onset …
We investigated the effect of auditory-visual sensory integration on visual tasks that were predominantly dependent on parvocellular processing. These tasks were (i) detecting metacontrast-masked targets and (ii) discriminating orientation differences between high-spatial-frequency Gabor patch stimuli. Sounds that contained no information relevant to either …
When people move, there are many visual and non-visual cues that can inform them about their movement. Simulating self-motion in a virtual reality environment thus needs to take these non-visual cues into account in addition to the normal high-quality visual display. Here we examine the contribution of visual and non-visual cues to our perception of …
Crossmodal interaction conferring enhancement in sensory processing is now widely accepted. Such benefit is often exemplified by the neural response amplification reported in physiological studies conducted with animals, which parallels behavioural demonstrations of sound-driven improvement in visual tasks in humans. Yet a good deal of controversy still …
When simulating self-motion, virtual reality designers ignore non-visual cues at their peril. But providing non-visual cues presents significant challenges. One approach is to accompany visual displays with corresponding real physical motion to stimulate the non-visual, motion-detecting sensory systems in a natural way. However, allowing real movement …
Although it has been previously reported that audiovisual integration can modulate performance on some visual tasks, multisensory interactions have not been explicitly assessed in the context of different visual processing pathways. In the present study, we test auditory influences on visual processing employing a psychophysical paradigm that reveals …
The retinal image of an object does not contain information about its actual size. Size must instead be inferred from extraretinal cues, to which distance information makes an essential contribution. Asynchronies in arrival time across the visual and auditory components of an audiovisual event can reliably cue its distance, although this cue has …
The perceived distance between objects has been found to decrease over time in memory, demonstrating a partial failure of space constancy. Such mislocalization has been attributed to a generalized compression effect in memory. We confirmed this drift with a pair of remembered dot positions but did not find a compression of perceived distance when the space …