With every rapid gaze shift (saccade), our eyes experience a different view of the world. Stable perception of visual space requires that points in the new image are associated with corresponding points in the previous image. The brain may use an extraretinal eye position signal to compensate for gaze changes, or, alternatively, exploit the image contents …
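The extraretinal-compensation idea lends itself to a small worked example: if a copy of the eye movement command is available, a pre-saccadic retinal location can be remapped to its predicted post-saccadic location and compared against the new image, whereas the image-based alternative would instead search the new image for matching content. The sketch below only illustrates that arithmetic; the function name and coordinate convention are assumptions, not the study's model.

```python
# A minimal sketch of extraretinal compensation across a saccade: subtracting
# the saccade vector from a pre-saccadic retinal location predicts where the
# same point should appear after the eye movement. Names and the simple 2D
# retinal coordinate frame are illustrative assumptions.
import numpy as np

def remap_with_eye_signal(pre_saccadic_location, saccade_vector):
    """Predicted retinal location after the eye movement (retinal coordinates, deg)."""
    return np.asarray(pre_saccadic_location) - np.asarray(saccade_vector)

# A point seen 5 deg right of fixation, followed by a 10 deg rightward saccade,
# should land 5 deg left of the new fixation point.
print(remap_with_eye_signal([5.0, 0.0], [10.0, 0.0]))   # -> [-5.  0.]
```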
A vivid perception of the moving form of a human figure can be obtained from a few moving light points on the joints of the body. This is known as biological motion perception. It is commonly believed that the perception of biological motion rests on image motion signals. Curiously, however, some patients with lesions to motion processing areas of the …
How does the brain process visual information about self-motion? In monkey cortex, the analysis of visual motion is performed by successive areas specialized in different aspects of motion processing. Whereas neurons in the middle temporal (MT) area are direction-selective for local motion, neurons in the medial superior temporal (MST) area respond to …
Interest in the processing of optic flow has increased recently in both the neurophysiological and the psychophysical communities. We have designed a neural network model of the visual motion pathway in higher mammals that detects the direction of heading from optic flow. The model is a neural implementation of the subspace algorithm introduced by Heeger …
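The subspace approach can be illustrated numerically: for each candidate heading, one measures how much of the observed flow cannot be explained by any combination of depths and rotations, and picks the heading with the smallest residual. The sketch below is a plain least-squares version of that idea under a pinhole model with unit focal length; the helper names and the coarse grid search are assumptions and do not reflect the neural implementation described in the abstract.

```python
# A minimal sketch of a Heeger-and-Jepson-style subspace residual for heading
# recovery from optic flow. The functions and the grid search are illustrative
# assumptions, not the paper's network model.
import numpy as np

def flow_matrices(x, y):
    """Per-point translation (A) and rotation (B) flow templates, focal length = 1."""
    A = np.array([[-1.0, 0.0, x],
                  [0.0, -1.0, y]])
    B = np.array([[x * y, -(1.0 + x**2), y],
                  [1.0 + y**2, -x * y, -x]])
    return A, B

def heading_residual(T, points, flow):
    """Residual of the measured flow outside the subspace spanned by C(T).

    points: (N, 2) image coordinates; flow: (N, 2) measured flow vectors.
    """
    N = len(points)
    C = np.zeros((2 * N, N + 3))
    v = flow.reshape(-1)
    for i, (x, y) in enumerate(points):
        A, B = flow_matrices(x, y)
        C[2 * i:2 * i + 2, i] = A @ T          # unknown inverse depth scales this column
        C[2 * i:2 * i + 2, N:] = B             # unknown rotation scales these columns
    # Least-squares projection of v onto range(C); the residual is what cannot be
    # explained by *any* depth map or rotation, given this candidate heading.
    coeffs, *_ = np.linalg.lstsq(C, v, rcond=None)
    return np.sum((v - C @ coeffs) ** 2)

def estimate_heading(points, flow, n_grid=40):
    """Coarse grid search over candidate heading directions on the unit sphere."""
    best_T, best_E = None, np.inf
    for az in np.linspace(-np.pi / 4, np.pi / 4, n_grid):
        for el in np.linspace(-np.pi / 4, np.pi / 4, n_grid):
            T = np.array([np.sin(az), np.sin(el), np.cos(az) * np.cos(el)])
            T /= np.linalg.norm(T)
            E = heading_residual(T, points, flow)
            if E < best_E:
                best_T, best_E = T, E
    return best_T
```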
Biological motion perception is the compelling ability of the visual system to perceive complex human movements effortlessly and within a fraction of a second. Recent neuroimaging and neurophysiological studies have revealed that the visual perception of biological motion activates a widespread network of brain areas. The superior temporal sulcus has a …
We extend the local energy model of position detection to cope with temporally varying position signals and the perception of relative position. The extension entails two main components. First, a form of persistence for the position signal, based on the temporal impulse response function of the visual system. Second, we hypothesise that the perceived …
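The persistence component can be pictured as a temporal convolution: the physical position signal is smeared by the visual system's temporal impulse response, so recent positions continue to contribute to the current estimate. The sketch below assumes a gamma-shaped impulse response with arbitrary parameter values, purely for illustration.

```python
# A minimal sketch of position-signal persistence: a time-varying position is
# convolved with an assumed gamma-shaped temporal impulse response. The filter
# shape and parameters (n, tau) are assumptions, not the paper's fitted model.
import numpy as np

def temporal_impulse_response(t, n=3, tau=0.02):
    """Illustrative gamma-function impulse response, normalised to unit area."""
    h = (t ** (n - 1)) * np.exp(-t / tau)
    return h / h.sum()

dt = 0.001                                  # 1 ms time step
t = np.arange(0, 0.2, dt)                   # 200 ms of filter support
h = temporal_impulse_response(t)

# A position signal that jumps at t = 100 ms (e.g., a target displacement).
time = np.arange(0, 0.5, dt)
position = np.where(time < 0.1, 0.0, 1.0)

# The persisting (perceived) position signal is the convolution of the
# physical position with the temporal impulse response.
perceived = np.convolve(position, h)[:len(time)]
```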
In immersive virtual environments (IVEs), users can control their virtual viewpoint by moving their tracked head and walking through the real world. Usually, movements in the real world are mapped one-to-one to virtual camera motions. With redirection techniques, the virtual camera is manipulated by applying gains to user motion so that the virtual world …
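As a rough illustration of how such gains enter the mapping, the sketch below scales one frame of tracked head motion by a translation gain, a rotation gain, and an optional curvature term before applying it to the virtual camera. The gain values, the yaw-only rotation, and the 2D ground-plane treatment are simplifying assumptions, not the specific redirection technique evaluated in the study.

```python
# A minimal sketch of applying redirection gains to tracked user motion per
# frame. Poses are (x, z, yaw) on the ground plane; all names and defaults are
# illustrative assumptions.
import numpy as np

def redirect(prev_real, curr_real, virt_pos, virt_yaw,
             translation_gain=1.2, rotation_gain=0.9, curvature=0.0):
    """Map one frame of real head motion to virtual camera motion."""
    # Real-world deltas for this frame.
    dpos = np.array(curr_real[:2]) - np.array(prev_real[:2])
    dyaw = curr_real[2] - prev_real[2]

    # Scaled head rotation, plus an optional curvature term proportional to the
    # distance walked (injects rotation while the user walks straight).
    walked = np.linalg.norm(dpos)
    virt_yaw += rotation_gain * dyaw + curvature * walked

    # Translation scaled by the gain and re-expressed in the virtual heading.
    heading_offset = virt_yaw - curr_real[2]
    c, s = np.cos(heading_offset), np.sin(heading_offset)
    rot = np.array([[c, -s], [s, c]])
    virt_pos = np.array(virt_pos) + translation_gain * (rot @ dpos)
    return virt_pos, virt_yaw
```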
The visual perception of human movement from sparse point-light walkers is often believed to rely on local motion analysis. We investigated the role of local motion in the perception of human walking, viewed from the side, in different tasks. The motion signal was manipulated by varying point lifetime. We found the task of coherence discrimination, commonly …
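A point-lifetime manipulation can be sketched as follows: each displayed point stays attached to a sampled body location for a fixed number of frames and is then reassigned, so no single point carries a long local motion trajectory. The sampling scheme and parameter values below are assumptions for illustration, not the stimulus code used in the experiments.

```python
# A minimal sketch of a limited-lifetime point-light walker: each point lives
# for `lifetime` frames and is then reattached to a new random body location,
# which limits the local motion signal any single point carries.
import numpy as np

rng = np.random.default_rng(0)

def limited_lifetime_walker(body_positions, n_points=12, lifetime=2):
    """body_positions: (n_frames, n_body_samples, 2) sampled positions on the body."""
    n_frames, n_samples, _ = body_positions.shape
    # Which body sample each displayed point is attached to, and its age in frames.
    attach = rng.integers(0, n_samples, size=n_points)
    age = rng.integers(0, lifetime, size=n_points)
    frames = []
    for f in range(n_frames):
        expired = age >= lifetime
        attach[expired] = rng.integers(0, n_samples, size=expired.sum())
        age[expired] = 0
        frames.append(body_positions[f, attach].copy())
        age += 1
    return np.array(frames)    # (n_frames, n_points, 2) display coordinates
```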
Successful navigation through an environment requires precise monitoring of direction and distance traveled ("path integration" or "dead reckoning"). Previous studies in blindfolded human subjects showed that velocity information arising from vestibular and somatosensory signals can be used to reproduce passive linear displacements. In these studies, visual …
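Path integration from a velocity signal amounts to accumulating speed over time. The sketch below shows plain discrete-time integration of a perceived-speed trace, with an optional leaky term included only as one common way such models capture systematic distance errors; both the leak and the parameter values are assumptions, not the study's analysis.

```python
# A minimal sketch of path integration: traveled distance is the time integral
# of perceived forward speed, approximated here by discrete accumulation.
import numpy as np

def integrate_distance(speed, dt, leak=0.0):
    """speed: perceived forward speed per sample (m/s); dt: sample interval (s)."""
    distance = 0.0
    for v in speed:
        distance += (v - leak * distance) * dt   # leak = 0 gives exact integration
    return distance

# Example: reproduce a passively traveled 10 m displacement at 1 m/s.
dt = 0.01
speed = np.ones(1000)                        # 10 s at 1 m/s
print(integrate_distance(speed, dt))         # -> 10.0 with leak = 0
```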
In the postgenomic era, one of the most interesting and important challenges is to understand protein interactions on a large scale. The physical interactions between protein domains are fundamental to the workings of a cell: in multi-domain polypeptide chains, in multi-subunit proteins and in transient complexes between proteins that also exist …