Denise Y. P. Henriques

Establishing a coherent internal reference frame for visuospatial representation and maintaining the integrity of this frame during eye movements are thought to be crucial for both perception and motor control. A stable head-centric representation could be constructed by internally comparing retinal signals with eye position. Alternatively, visual memory…
Motor adaptation in response to a visuomotor distortion arises when the usual motor command no longer results in the predicted sensory output. In this study, we examined whether exposure to a sensory discrepancy was sufficient on its own to produce changes in reaches and to recalibrate the sense of felt hand position in the absence of any voluntary movements. …
The present study examined the accuracy of proprioceptive localization of the hand using two paradigms. In our proprioceptive estimation paradigm, participants judged the position of a target hand relative to visual references, or their body's midline. Placement of the target hand was active (participants pushed a robot manipulandum along a constrained…
Goal-directed reaches are rapidly adapted following exposure to misaligned visual feedback of the hand. It has been suggested that these changes in reaches result in sensory recalibration (i.e., realigning proprioceptive estimates of hand position to match the visual estimates). In the current study we tested whether visuomotor adaptation results in…
The aim of this study was to: (1) quantify errors in open-loop pointing toward a spatially central (but retinally peripheral) visual target with gaze maintained in various eccentric horizontal, vertical, and oblique directions; and (2) determine the computational source of these errors. Eye and arm orientations were measured using search coils…
Our ability to recognize and manipulate objects relies on our haptic sense of the objects' geometry. But little is known about the acuity of haptic perception compared to other senses like sight and hearing. Here, we determined how accurately humans could sense various geometric features of objects across the workspace. Subjects gripped the handle of a…
The aim of this study was to further understand how the brain represents spatial information for shaping aiming movements to targets. Both behavioral and neurophysiological studies have shown that the brain represents spatial memory for reaching targets in an eye-fixed frame. To date, these studies have only shown how the brain stores and updates target…
Eye-hand coordination requires the brain to integrate visual information with the continuous changes in eye, head, and arm positions. This is a geometrically complex process because the eyes, head, and shoulder have different centers of rotation. As a result, head rotation causes the eye to translate with respect to the shoulder. The present study examines…
Previous studies have shown that both young and older subjects adapt their reaches in response to a visuomotor distortion. It has been suggested that one's continued ability to adapt to a visuomotor distortion with advancing age is due to the preservation of implicit learning mechanisms, which include processes that realign…
We examined the effect of gaze direction relative to target location on reach endpoint errors made to proprioceptive and multisensory targets. We also explored whether and how visual and proprioceptive information about target location are integrated to guide reaches. Participants reached to their unseen left hand in one of three target locations (left of body…