Motor adaptation in response to a visuomotor distortion arises when the usual motor command no longer results in the predicted sensory output. In this study, we examined whether exposure to a sensory discrepancy was sufficient on its own to produce changes in reaches and to recalibrate the sense of felt hand position in the absence of any voluntary movements.
The aim of this study was twofold: (1) to quantify errors in open-loop pointing toward a spatially central (but retinally peripheral) visual target with gaze maintained in various eccentric horizontal, vertical, and oblique directions; and (2) to determine the computational source of these errors. Eye and arm orientations were measured using search coils.
Goal-directed reaches are rapidly adapted following exposure to misaligned visual feedback of the hand. It has been suggested that these changes in reaches result in sensory recalibration (i.e., realigning proprioceptive estimates of hand position to match the visual estimates). In the current study, we tested whether visuomotor adaptation results in …
Our ability to recognize and manipulate objects relies on our haptic sense of the objects' geometry. But little is known about the acuity of haptic perception compared to other senses like sight and hearing. Here, we determined how accurately humans could sense various geometric features of objects across the workspace. Subjects gripped the handle of a …
The aim of this study was to further understand how the brain represents spatial information for shaping aiming movements to targets. Both behavioral and neurophysiological studies have shown that the brain represents spatial memory for reaching targets in an eye-fixed frame. To date, these studies have only shown how the brain stores and updates target …
The present study examined the accuracy of proprioceptive localization of the hand using two paradigms. In our proprioceptive estimation paradigm, participants judged the position of a target hand relative to visual references or to their body’s midline. Placement of the target hand was active (participants pushed a robot manipulandum along a constrained …).
Eye–hand coordination is geometrically complex. To compute the location of a visual target relative to the hand, the brain must consider every anatomical link in the chain from retinas to fingertips. Here we focus on the first three links, studying how the brain handles information about the angles of the two eyes and the head. It is known that people, even …
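
As an editorial aside, the transformation chain described in this abstract can be made concrete with a toy reference-frame computation. The Python sketch below uses hypothetical angles and coordinates (it is not the paper's model): a retinally coded target is rotated through eye-in-head and head-on-body links before being compared with hand position.

    import numpy as np

    def rot(theta_deg):
        # 2D rotation matrix, counterclockwise, angle in degrees
        t = np.radians(theta_deg)
        return np.array([[np.cos(t), -np.sin(t)],
                         [np.sin(t),  np.cos(t)]])

    # Hypothetical link angles (horizontal plane only)
    eye_in_head_deg = 15.0    # where the eye points within the head
    head_on_body_deg = -10.0  # where the head points on the trunk

    # Target expressed in eye-centered (retinal) coordinates
    target_in_eye = np.array([0.2, 0.6])

    # Chain the links: eye frame -> head frame -> body frame
    target_in_head = rot(eye_in_head_deg) @ target_in_eye
    target_in_body = rot(head_on_body_deg) @ target_in_head

    # Once target and hand share a body-fixed frame, the required
    # reach is a simple vector difference
    hand_in_body = np.array([0.1, 0.3])
    reach_vector = target_in_body - hand_in_body
    print(reach_vector)

An error in any one link estimate (eye angle, head angle) propagates into the final reach vector, which is the kind of source such studies try to isolate.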
Haptic perception of shape is based on kinesthetic and tactile information synthesized across space and time. We studied this process by having subjects move along the edges of multisided shapes and then remember and reproduce the shapes. With eyes closed, subjects moved a robot manipulandum whose force field was programmed to simulate a quadrilateral …
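
A brief aside on the rendering idea: a robot manipulandum can simulate a rigid edge with a stiff virtual spring that pushes back in proportion to penetration depth. The fragment below is a generic virtual-wall sketch in Python (stiffness value and geometry are made up; the paper's actual controller is not specified here).

    import numpy as np

    def wall_force(pos, wall_y=0.0, stiffness=500.0):
        # Virtual wall along y = wall_y: free motion below it,
        # spring-like restoring force proportional to penetration above it
        depth = pos[1] - wall_y
        if depth <= 0.0:
            return np.zeros(2)                      # free space: no force
        return np.array([0.0, -stiffness * depth])  # push back (N)

    print(wall_force(np.array([0.10, 0.002])))  # slight penetration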
We examined the effect of gaze direction relative to target location on reach endpoint errors made to proprioceptive and multisensory targets. We also explored whether and how visual and proprioceptive information about target location is integrated to guide reaches. Participants reached to their unseen left hand in one of three target locations (left of body …).
Remembered object locations are stored in an eye-fixed reference frame, so that every time the eyes move, spatial representations must be updated for the arm-motor system to reflect the target's new relative position. To date, studies have not investigated how the brain updates these spatial representations during other types of eye movements, such as …
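
To illustrate the updating step this abstract describes (an editorial sketch, not the study's model): a location stored in an eye-fixed frame must be counter-rotated by each eye movement so that it keeps pointing at the same external position. In Python, with arbitrary numbers:

    import numpy as np

    def rot(theta_deg):
        # 2D rotation matrix, counterclockwise, angle in degrees
        t = np.radians(theta_deg)
        return np.array([[np.cos(t), -np.sin(t)],
                         [np.sin(t),  np.cos(t)]])

    # Remembered target, coded relative to the current gaze direction
    target_eye_fixed = np.array([1.0, 0.0])

    # A 20-degree saccade rotates the eye; the world does not move, so
    # the stored eye-fixed vector is counter-rotated to stay on target
    saccade_deg = 20.0
    target_after_saccade = rot(-saccade_deg) @ target_eye_fixed
    print(target_after_saccade)

If the assumed rotation were wrong for some class of eye movement, the stored representation would end up misaligned, which is the kind of question such updating studies probe.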