To look at or reach for what we see, spatial information from the visual system must be transformed into a motor plan. The posterior parietal cortex (PPC) is well placed to perform this function, because it lies between visual areas, which encode spatial information, and motor cortical areas. The PPC contains several subdivisions, which are generally …
The traditional approach to studying brain function is to measure physiological responses to controlled sensory, motor and cognitive paradigms. However, most of the brain's energy consumption is devoted to ongoing metabolic activity not clearly associated with any particular stimulus or behaviour. Functional magnetic resonance imaging studies in humans …
Neural responses are typically characterized by computing the mean firing rate, but response variability can exist across trials. Many studies have examined the effect of a stimulus on the mean response, but few have examined the effect on response variability. We measured neural variability in 13 extracellularly recorded datasets and one intracellularly …
Behaviors such as sensing an object and then moving your eyes or your hand toward it require that sensory information be used to help generate a motor command, a process known as a sensorimotor transformation. Here we review models of sensorimotor transformations that use a flexible intermediate representation that relies on basis functions. The use of …
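The basis-function idea summarized above can be illustrated with a small numerical sketch. The hidden units below are hypothetical (our construction, not taken from the paper): each multiplies a Gaussian tuning curve over retinal target position by a sigmoid of eye position, and a linear readout of that layer recovers the head-centered target location (retinal position plus eye position). All parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Grid of retinal target positions and eye positions (degrees), all combinations.
ret = np.linspace(-20, 20, 21)
eye = np.linspace(-20, 20, 21)
R, E = np.meshgrid(ret, eye)
head = (R + E).ravel()  # head-centered target = retinal position + eye position

# Hypothetical basis layer: Gaussian retinal tuning scaled by a sigmoid
# eye-position gain (a "gain field"); centers and thresholds drawn at random.
centers = rng.uniform(-20, 20, 200)
thresholds = rng.uniform(-20, 20, 200)

def unit(r, e, c, t, sigma=6.0):
    return np.exp(-(r - c) ** 2 / (2 * sigma ** 2)) / (1.0 + np.exp(-(e - t) / 6.0))

H = np.stack([unit(R.ravel(), E.ravel(), c, t)
              for c, t in zip(centers, thresholds)], axis=1)

# A single linear readout of the basis responses approximates the
# head-centered target position for every eye-position/retinal-position pair.
w, *_ = np.linalg.lstsq(H, head, rcond=None)
pred = H @ w
print("max readout error (deg):", np.abs(pred - head).max())
```

The point of the sketch is the flexibility the review emphasizes: the same fixed basis layer supports different linear readouts (e.g., eye-centered or head-centered), so one intermediate representation can serve multiple motor frames.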
Recent experiments are reviewed that indicate that sensory signals from many modalities, as well as efference copy signals from motor structures, converge in the posterior parietal cortex in order to code the spatial locations of goals for movement. These signals are combined using a specific gain mechanism that enables the different coordinate frames of …
In order to direct a movement towards a visual stimulus, visual spatial information must be combined with postural information. For example, directing gaze (eye plus head) towards a visible target requires the combination of retinal image location with eye and head position to determine the location of the target relative to the body. Similarly, …
The mechanism for object location in the environment, and the perception of the external world as stable when eyes, head and body are moved, have long been thought to be centred on the posterior parietal cortex. However, head position signals, and their integration with visual and eye position signals to form a representation of space referenced to the …
In previous experiments, we showed that cells in the parietal reach region (PRR) in monkey posterior parietal cortex code intended reaching movements in an eye-centered frame of reference. These cells are more active when an arm movement, rather than an eye movement, is being planned. Despite this clear preference for arm movements, we now report that PRR neurons …
A gain field, the scaling of a tuned neuronal response by a postural signal, may help support neuronal computation. Here, we characterize eye and hand position gain fields in the parietal reach region (PRR). Eye and hand gain fields in individual PRR neurons are similar in magnitude but opposite in sign to one another. This systematic arrangement produces a …
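The equal-and-opposite arrangement described above has a simple algebraic consequence, which the following hedged sketch makes concrete. The model neuron and its parameter values are ours, not fit to the recorded data: a Gaussian target tuning curve is scaled by a planar gain with equal magnitude and opposite sign for eye and hand position, so the gain depends only on the difference (eye minus hand), i.e., hand position in eye coordinates.

```python
import numpy as np

def prr_rate(target, eye, hand, pref=0.0, sigma=10.0, k=0.01, base=20.0):
    """Illustrative PRR-like unit: Gaussian target tuning times a planar gain
    that is equal in magnitude and opposite in sign for eye vs. hand position."""
    tuning = np.exp(-(target - pref) ** 2 / (2 * sigma ** 2))
    gain = 1.0 + k * eye - k * hand  # depends only on (eye - hand)
    return base * tuning * gain

# Shifting eye and hand position together leaves the response unchanged,
# because the two gain fields cancel; only the eye-hand difference matters.
r1 = prr_rate(target=5.0, eye=10.0, hand=-5.0)
r2 = prr_rate(target=5.0, eye=20.0, hand=5.0)  # both postures shifted by +10
print(r1, r2, np.isclose(r1, r2))
```

Run as written, `r1` and `r2` are equal, illustrating how opposite-sign gain fields can implement a coordinate transformation at the population level.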