Michael R. M. Jenkin

The measurement of image disparity is a fundamental precursor to binocular depth estimation. Recently, Jenkin and Jepson (1988) and Sanger (1988) described promising methods based on the output phase behaviour of band-pass Gabor filters. Here we discuss further justification for such techniques based on the stability of band-pass phase behaviour as a function …
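As a rough illustration of the phase-based idea described here (a minimal sketch, not the method of Jenkin and Jepson or Sanger; the function names and filter parameters below are ours), disparity at a point can be read off as the phase difference between the left- and right-image responses of a complex Gabor filter, divided by the filter's centre frequency:

```python
import numpy as np

def gabor_kernel(omega, sigma, radius):
    """Complex 1-D Gabor: Gaussian envelope times a complex exponential."""
    x = np.arange(-radius, radius + 1, dtype=float)
    return np.exp(-x**2 / (2.0 * sigma**2)) * np.exp(1j * omega * x)

def phase_disparity(left_row, right_row, omega=0.5, sigma=6.0):
    """Estimate disparity per pixel as the phase difference of band-pass
    Gabor responses divided by the filter's centre frequency."""
    k = gabor_kernel(omega, sigma, radius=int(3 * sigma))
    rl = np.convolve(left_row, k, mode="same")
    rr = np.convolve(right_row, k, mode="same")
    # Wrapped phase difference in (-pi, pi]; disparity = delta_phi / omega.
    dphi = np.angle(rl * np.conj(rr))
    return dphi / omega

# Toy example: a noisy 1-D signal and a copy shifted by 3 pixels.
rng = np.random.default_rng(0)
left = rng.standard_normal(256)
right = np.roll(left, 3)
print(np.median(phase_disparity(left, right)))  # roughly 3
```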
We address the problem of robotic exploration of a graphlike world, where no distance or orientation metric is assumed of the world. The robot is assumed to be able to autonomously traverse graph edges, recognize when it has reached a vertex, and enumerate edges incident upon the current vertex relative to the edge via which it entered the current vertex. …
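The world interface this abstract assumes can be made concrete with a small sketch (illustrative only; the class and method names are ours, and the paper's exploration algorithm itself is not reproduced here): vertices carry a circular ordering of incident edges, and the robot can only leave by an edge specified relative to the edge it arrived on.

```python
class GraphWorld:
    """Minimal model of the graph-like world described above: no metric
    information, only vertices, traversable edges, and a circular ordering
    of edges relative to the edge used to enter a vertex.
    (Illustrative sketch only, not the paper's algorithm.)"""

    def __init__(self, adjacency):
        # adjacency[v] is the circularly ordered list of (neighbour, their_port)
        # pairs at vertex v, i.e. the local edge enumeration at v.
        self.adjacency = adjacency

    def degree(self, v):
        return len(self.adjacency[v])

    def traverse(self, v, entry_port, relative_index):
        """Leave vertex v via the edge that is `relative_index` steps
        (in the fixed local order) from the edge we entered by.
        Returns the new vertex and the port we arrive on."""
        port = (entry_port + relative_index) % self.degree(v)
        return self.adjacency[v][port]

# Tiny triangle world: each vertex has two ports, 0 and 1.
world = GraphWorld({
    "A": [("B", 0), ("C", 1)],
    "B": [("A", 0), ("C", 0)],
    "C": [("B", 1), ("A", 1)],
})
print(world.traverse("A", entry_port=0, relative_index=1))  # ('C', 1)
```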
A key difficulty in the design of multi-agent robotic systems is the size and complexity of the space of possible designs. In order to make principled design decisions, an understanding of the many possible system configurations is essential. To this end, we present a taxonomy that classifies multi-agent systems according to communication, computational, and …
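The abstract is cut off before the taxonomy's axes are listed in full. Purely as an illustration of what such a classification record might look like (the field names and example values below are our assumptions, not the axes defined in the paper), a design point in the space could be written as:

```python
from dataclasses import dataclass

# Illustrative only: the axis names and values echo the communication and
# computation dimensions named in the abstract; they are not the paper's taxonomy.
@dataclass(frozen=True)
class CollectiveDesign:
    collective_size: str         # e.g. "single", "pair", "limited", "unbounded"
    communication_range: str     # e.g. "none", "near", "unlimited"
    communication_topology: str  # e.g. "broadcast", "addressed", "tree"
    processing_ability: str      # e.g. "reactive", "finite automaton", "Turing-equivalent"

swarm = CollectiveDesign(
    collective_size="limited",
    communication_range="near",
    communication_topology="broadcast",
    processing_ability="finite automaton",
)
print(swarm)
```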
The direction of ‘up’ has traditionally been measured by setting a line (luminous if necessary) to the apparent vertical, a direction known as the ‘subjective visual vertical’ (SVV); however, for optimum performance in visual skills including reading and facial recognition, an object must be seen the ‘right way up’, a separate direction which we have …
We demonstrate that humans can use optic flow to estimate distance travelled when appropriate scaling information is provided. Eleven subjects were presented with visual targets in a virtual corridor. They were then provided with optic flow compatible with movement along the corridor and asked to indicate when they had reached the previously presented …
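The logic of the task can be sketched as a simple integrator (illustrative only; the function name, the gain parameter, and the numbers are ours, not the experimental protocol): accumulate visually signalled speed over time and respond when the accumulated distance reaches the remembered target.

```python
import numpy as np

def indicate_arrival(speeds, dt, target_distance, gain=1.0):
    """Integrate (possibly mis-scaled) visually signalled speed over time and
    return the index of the first frame at which perceived distance travelled
    reaches the remembered target. `gain` models how optic flow is scaled
    into perceived speed; gain=1 is veridical. Illustrative sketch only."""
    perceived = np.cumsum(gain * np.asarray(speeds) * dt)
    hits = np.nonzero(perceived >= target_distance)[0]
    return int(hits[0]) if hits.size else None

# 1 m/s simulated movement sampled at 60 Hz, target 4 m ahead.
dt = 1.0 / 60.0
speeds = np.full(600, 1.0)
print(indicate_arrival(speeds, dt, target_distance=4.0) * dt)  # ~4.0 s
```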
When a person moves through the world, the associated visual displacement of the environment in the opposite direction is not usually seen as external movement but rather as a changing view of a stable world. We measured the amount of visual motion that can be tolerated as compatible with the perception of moving within a stable world during active, …
Surprisingly little is known of the perceptual consequences of visual or vestibular stimulation in updating our perceived position in space as we move around. We assessed the roles of visual and vestibular cues in determining the perceived distance of passive, linear self-motion. Subjects were given cues to constant-acceleration motion: either optic flow …
To enhance presence, facilitate sensory motor performance, and avoid disorientation or nausea, virtual-reality applications require the percept of a stable environment. End-to-end tracking latency (display lag) degrades this illusion of stability and has been identified as a major fault of existing virtual-environment systems. Oscillopsia refers to the …
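A small worked example of why display lag degrades the illusion of stability (a sketch under the usual sinusoidal-head-motion assumption; the parameter values are arbitrary, not figures from the paper): rendering from a head pose that is τ seconds old turns head oscillation of amplitude A and frequency f into apparent scene motion with peak amplitude 2A·sin(πfτ).

```python
import numpy as np

def peak_image_slip(amplitude_deg, freq_hz, latency_s):
    """Peak apparent scene rotation caused by rendering from a head pose that
    is `latency_s` old, for sinusoidal head oscillation
    theta(t) = A sin(2*pi*f*t). The error theta(t) - theta(t - tau) is also
    sinusoidal, with amplitude 2*A*sin(pi*f*tau). Illustrative sketch."""
    return 2.0 * amplitude_deg * np.sin(np.pi * freq_hz * latency_s)

# 20 deg peak head oscillation at 1 Hz with 100 ms end-to-end lag:
print(peak_image_slip(20.0, 1.0, 0.100))  # about 12.4 deg of apparent slip
```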
Visual motion can be a cue to travel distance when the motion signals are integrated. Distance estimates from visually simulated self-motion are imprecise, however. Previous work in our labs has given conflicting results on the imprecision: experiments by Frenz and Lappe had suggested a general underestimation of travel distance, while results from Redlick, …
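For context, a leaky path-integration model of the general form discussed in this literature (a sketch with arbitrary parameters, not the model or fits reported in the paper) behaves as follows: perceived distance grows with some gain per unit of actual distance travelled, but also leaks in proportion to the distance already accumulated, so long distances saturate and are progressively underestimated.

```python
def leaky_path_integration(actual_distance, gain=1.0, leak=0.1, dx=0.01):
    """Leaky path integration of the general form discussed in this
    literature: perceived distance D grows with `gain` per unit of actual
    distance travelled but leaks at rate `leak` per unit distance,
    dD/dx = gain - leak * D.  (Parameter values are arbitrary; this is a
    sketch, not the fit reported in the paper.)"""
    steps = int(actual_distance / dx)
    D = 0.0
    for _ in range(steps):
        D += (gain - leak * D) * dx  # simple Euler step
    return D

for x in (2.0, 8.0, 32.0):
    print(x, round(leaky_path_integration(x), 2))
# Short distances are reproduced almost veridically; long ones saturate
# towards gain/leak (here 10), i.e. progressive underestimation.
```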