What visual information do we use to guide movement through our environment? Self-movement produces a pattern of motion on the retina, called optic flow. During translation, the direction of movement (locomotor direction) is specified by the point in the flow field from which the motion vectors radiate: the focus of expansion (FoE) [1-3]. If an eye …
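The FoE idea in the abstract above can be illustrated with a short sketch. Assuming a pure-translation radial flow model (each flow vector points directly away from the FoE), the FoE is the least-squares intersection of the lines defined by the flow vectors. The function name and synthetic data below are illustrative, not from the papers themselves.

```python
import numpy as np

def estimate_foe(points, flows):
    """Estimate the focus of expansion from a radial flow field.

    Each flow vector defines a line through its image point; under pure
    translation all such lines pass through the FoE, so we solve for the
    point minimising the perpendicular distance to every line.
    """
    # normals perpendicular to each flow vector
    n = np.stack([-flows[:, 1], flows[:, 0]], axis=1)
    # constraint n_i . (foe - p_i) = 0  ->  n_i . foe = n_i . p_i
    b = np.einsum('ij,ij->i', n, points)
    foe, *_ = np.linalg.lstsq(n, b, rcond=None)
    return foe

# synthetic expansion field radiating from (0.2, -0.1)
rng = np.random.default_rng(0)
pts = rng.uniform(-1, 1, size=(50, 2))
true_foe = np.array([0.2, -0.1])
flows = 0.5 * (pts - true_foe)
print(estimate_foe(pts, flows))  # recovers approximately [0.2, -0.1]
```

With noiseless radial flow the recovery is exact up to numerical precision; with noisy vectors the same least-squares formulation gives the best-fitting FoE.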
Drosophila is one of the most important model organisms in biological science. Our observation that a complex trait, eye size, evolves appreciably over a relatively short time suggests that care should be taken when flies from long-term cultures are used as wild-type controls. Acknowledgments: We thank John Root and Alvaro Gill Ferreira for the fly stocks, …
The vast majority of research on optic flow (retinal motion arising because of observer movement) has focused on its use in heading recovery and guidance of locomotion. Here we demonstrate that optic flow processing has an important role in the detection and estimation of scene-relative object movement during self-movement. To do this, the brain identifies …
We have recently suggested that the brain uses its sensitivity to optic flow in order to parse retinal motion into components arising due to self and object movement (e.g. Rushton, S. K., & Warren, P. A. (2005). Moving observers, 3D relative motion and the detection of object movement. Current Biology, 15, R542-R543). Here, we explore whether stereo …
A moving observer needs to be able to estimate the trajectory of other objects moving in the scene. Without the ability to do so, it would be difficult to avoid obstacles or catch a ball. We hypothesized that neural mechanisms sensitive to the patterns of motion generated on the retina during self-movement (optic flow) play a key role in this process, …
The use of virtual reality (VR) display systems has escalated over the last five years and may have consequences for those working within vision research. This paper provides a brief review of the literature pertaining to the representation of depth in stereoscopic VR displays. Specific attention is paid to the response of the accommodation system with its …
We have recently suggested that neural flow parsing mechanisms act to subtract global optic flow consistent with observer movement to aid in detecting and assessing scene-relative object movement. Here, we examine whether flow parsing can occur independently from heading estimation. To address this question we used stimuli comprising two superimposed optic …
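The flow-parsing hypothesis described above amounts to subtracting the flow component predicted by self-movement from the full retinal motion field, leaving a residual attributable to object movement. A minimal sketch, assuming the self-movement component is modelled as radial expansion from a known FoE (the function name, rate parameter, and data are illustrative):

```python
import numpy as np

def parse_flow(retinal_flow, points, foe, rate):
    """Subtract the flow predicted by self-movement from retinal motion.

    Under the simplifying assumption of pure forward translation, the
    self-movement component is radial expansion from the FoE; the
    residual is the scene-relative object motion.
    """
    self_flow = rate * (points - foe)
    return retinal_flow - self_flow

pts = np.array([[0.5, 0.5], [-0.3, 0.1]])
foe = np.zeros(2)
obj = np.array([[0.0, 0.0], [0.1, -0.2]])  # only the second point moves in the scene
retinal = 0.4 * (pts - foe) + obj          # observed motion = self + object components
residual = parse_flow(retinal, pts, foe, 0.4)
print(residual)  # residual is (approximately) the object motion alone
```

The sketch makes the logic of the subtraction explicit: the stationary scene point yields a near-zero residual, so only genuinely moving objects "survive" the parsing step.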
An object that moves is spotted almost effortlessly; it "pops out". When the observer is stationary, a moving object is uniquely identified by retinal motion. This is not so when the observer is also moving; as the eye travels through space all scene objects change position relative to the eye producing a complicated field of retinal motion. Without the …
A pair of projectiles travelling on parallel trajectories produces differing patterns of retinal motion when the projectiles originate at different distances. For an observer to recognise that the two trajectories are parallel she must "factor out" the effect of distance on retinal motion. The observer faces a similar problem when physically parallel trajectories …
Human observers can perceive their direction of heading with a precision of about a degree. Several computational models of the processes underpinning the perception of heading have been proposed. In the present study we set out to assess which of four candidate models best captured human performance; the four models we selected reflected key differences in …