Establishing a coherent internal reference frame for visuospatial representation and maintaining the integrity of this frame during eye movements are thought to be crucial for both perception and motor control. A stable headcentric representation could be constructed by internally comparing retinal signals with eye position. Alternatively, visual memory …
Our ability to recognize and manipulate objects relies on our haptic sense of the objects' geometry. But little is known about the acuity of haptic perception compared to other senses like sight and hearing. Here, we determined how accurately humans could sense various geometric features of objects across the workspace. Subjects gripped the handle of a …
The aim of this study was to: (1) quantify errors in open-loop pointing toward a spatially central (but retinally peripheral) visual target with gaze maintained in various eccentric horizontal, vertical, and oblique directions; and (2) determine the computational source of these errors. Eye and arm orientations were measured with the use of search coils …
Eye-hand coordination requires the brain to integrate visual information with the continuous changes in eye, head, and arm positions. This is a geometrically complex process because the eyes, head, and shoulder have different centers of rotation. As a result, head rotation causes the eye to translate with respect to the shoulder. The present study examines …
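The geometric point here — that a pure head rotation translates the eye relative to the shoulder, because the eye does not sit on the head's rotation axis — can be sketched in a few lines. This is a minimal 2-D (top-down) illustration with hypothetical offsets, not the study's measured geometry.

```python
import math

def rotate2d(p, theta):
    """Rotate point p = (x, y) about the origin by theta radians."""
    c, s = math.cos(theta), math.sin(theta)
    return (c * p[0] - s * p[1], s * p[0] + c * p[1])

# Hypothetical geometry (metres): the eye sits ~0.10 m in front of the head's
# vertical rotation axis, which sits ~0.25 m from the shoulder.
EYE_OFFSET_FROM_HEAD_AXIS = (0.10, 0.0)
HEAD_AXIS_RE_SHOULDER = (0.0, 0.25)

def eye_position_re_shoulder(head_yaw_deg):
    """Eye location in shoulder coordinates after a pure head rotation."""
    ex, ey = rotate2d(EYE_OFFSET_FROM_HEAD_AXIS, math.radians(head_yaw_deg))
    return (HEAD_AXIS_RE_SHOULDER[0] + ex, HEAD_AXIS_RE_SHOULDER[1] + ey)

straight = eye_position_re_shoulder(0.0)
turned = eye_position_re_shoulder(30.0)
# Translation of the eye caused purely by rotating the head 30 deg:
shift = math.dist(straight, turned)  # ≈ 0.052 m with these offsets
```

With these assumed offsets, a 30° head turn displaces the eye by about 5 cm relative to the shoulder, which is why a purely rotational eye-plus-head gaze signal cannot by itself localize targets in shoulder coordinates.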
This study addressed the question of how the three-dimensional (3-D) control strategy for the upper arm depends on what the forearm is doing. Subjects were instructed to point a laser, attached in line with the upper arm, toward various visual targets, such that two-dimensional (2-D) pointing directions of the upper arm were held constant across different …
The saccade generator updates memorized target representations for saccades during eye and head movements. Here, we tested whether proprioceptive feedback from the arm can also update handheld object locations for saccades, and which intrinsic coordinate system(s) are used in this transformation. We measured radial saccades beginning from a central light-emitting …
Eye-hand coordination is geometrically complex. To compute the location of a visual target relative to the hand, the brain must consider every anatomical link in the chain from retinas to fingertips. Here we focus on the first three links, studying how the brain handles information about the angles of the two eyes and the head. It is known that people, even …
This review surveys results from a new approach to the problem of haptic sensing, in which subjects use primarily proximal arm movements to explore the shapes of virtual objects. These shapes are generated using a robotically controlled manipulandum. We begin by summarizing distortions of simple geometric properties (such as the length and orientation of …
Most models of spatial vision and visuomotor control reconstruct visual space by adding a vector representing the site of retinal stimulation to another vector representing gaze angle. However, this scheme fails to account for the curvatures in retinal projection produced by rotatory displacements in eye orientation. In particular, our simulations …
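The failure of the vector-addition scheme follows from the fact that 3-D rotations do not commute with componentwise angle addition. A minimal numerical sketch (angles and the Fick-style rotation order are illustrative choices, not the paper's simulation): compare adding gaze and retinal angles against properly rotating the retinal direction by the eye's orientation.

```python
import math

def dir_from_angles(az_deg, el_deg):
    """Unit direction for Fick-style azimuth (rightward +) and elevation (upward +)."""
    az, el = math.radians(az_deg), math.radians(el_deg)
    return (math.cos(el) * math.sin(az), math.sin(el), math.cos(el) * math.cos(az))

def rotate_by_gaze(gaze_az_deg, gaze_el_deg, v):
    """Apply the eye-in-head rotation (elevation about the horizontal axis,
    then azimuth about the vertical axis) to a direction in eye coordinates."""
    az, el = math.radians(gaze_az_deg), math.radians(gaze_el_deg)
    x, y, z = v
    y, z = y * math.cos(el) + z * math.sin(el), -y * math.sin(el) + z * math.cos(el)
    x, z = x * math.cos(az) + z * math.sin(az), -x * math.sin(az) + z * math.cos(az)
    return (x, y, z)

# Retinal stimulus 20 deg right of the fovea; eye rotated 30 deg right and 30 deg up.
retinal = dir_from_angles(20.0, 0.0)
rotational = rotate_by_gaze(30.0, 30.0, retinal)      # full 3-D rotation
additive = dir_from_angles(30.0 + 20.0, 30.0 + 0.0)   # vector-addition model

dot = sum(a * b for a, b in zip(rotational, additive))
error_deg = math.degrees(math.acos(max(-1.0, min(1.0, dot))))  # ≈ 3 deg discrepancy
```

Even at these modest eccentricities the two schemes disagree by roughly three degrees, and the discrepancy grows with oblique eye positions — the "curvature" the additive model cannot capture.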
Haptic perception of shape is based on kinesthetic and tactile information synthesized across space and time. We studied this process by having subjects move along the edges of multisided shapes and then remember and reproduce the shapes. With eyes closed, subjects moved a robot manipulandum whose force field was programmed to simulate a quadrilateral …
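The basic building block of such a force field is a "virtual wall": when the handle penetrates an edge of the simulated shape, the robot pushes back with a spring-like force proportional to penetration depth. This is a generic sketch of that idea with a hypothetical stiffness value, not the study's actual controller.

```python
import math

STIFFNESS = 500.0  # N/m, hypothetical wall stiffness

def wall_force(handle_xy, p0, p1):
    """Spring force pushing the handle back out of a wall along the line
    through p0 -> p1; free space lies on the left of that direction."""
    ex, ey = p1[0] - p0[0], p1[1] - p0[1]
    length = math.hypot(ex, ey)
    nx, ny = -ey / length, ex / length            # left-hand normal of the edge
    # Signed distance of the handle from the wall line (negative = penetrating).
    d = (handle_xy[0] - p0[0]) * nx + (handle_xy[1] - p0[1]) * ny
    if d >= 0.0:
        return (0.0, 0.0)                          # free space: no force
    return (-STIFFNESS * d * nx, -STIFFNESS * d * ny)  # restoring force along normal

# Handle 1 cm inside a horizontal wall: pushed straight back out.
fx, fy = wall_force((0.5, -0.01), (0.0, 0.0), (1.0, 0.0))
```

Rendering a quadrilateral then amounts to evaluating one such wall per edge at the servo rate and summing the forces; the subject feels rigid edges wherever the penetration force ramps up.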