We investigated changes in regional activation in the frontal cortices, as assessed by changes in hemoglobin oxygenation, during walking at 3 and 5 km/h and running at 9 km/h on a treadmill, using a near-infrared spectroscopic (NIRS) imaging technique. During the acceleration periods immediately before the steady walking or running speed was reached, the levels…
It has been proposed that motor imagery contains an element of sensory experiences (kinesthetic sensations), which is a substitute for the sensory feedback that would normally arise from the overt action. No evidence has been provided about whether kinesthetic sensation is centrally simulated during motor imagery. We psychophysically tested whether motor…
Neuroimaging studies have shown activity in the amygdala in response to facial expressions of emotion, but the specific role of the amygdala remains unknown. We hypothesized that the amygdala is involved in emotional but not basic sensory processing for facial expressions. To test this hypothesis, we manipulated the face directions of emotional expressions…
Task-related motion is a major source of noise in functional magnetic-resonance imaging (fMRI) time series. The motion effect usually persists even after perfect spatial realignment is achieved. Here, we propose a new method to remove a certain type of task-related motion effect that persists after realignment. The procedure consists of the following: the…
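The abstract is truncated before the procedure itself is described. As a general illustration only (not the authors' specific method), residual motion effects in fMRI are commonly reduced by regressing the realignment parameters out of each voxel time series; a minimal sketch, assuming six motion regressors and NumPy:

```python
import numpy as np

def regress_out_motion(ts, motion):
    """Generic nuisance regression: remove variance explained by
    motion regressors from a voxel time series via ordinary least
    squares. This is an illustration, not the paper's procedure."""
    # Design matrix: intercept column plus the motion parameters
    X = np.column_stack([np.ones(len(ts)), motion])
    beta, *_ = np.linalg.lstsq(X, ts, rcond=None)
    # Residuals = time series with motion-correlated variance removed
    return ts - X @ beta

# Toy example: 100 time points, 6 hypothetical realignment parameters
rng = np.random.default_rng(0)
motion = rng.standard_normal((100, 6))
signal = rng.standard_normal(100)
contaminated = signal + motion @ np.array([0.5, -0.3, 0.2, 0.1, -0.1, 0.4])
cleaned = regress_out_motion(contaminated, motion)
```

By construction, the residuals are orthogonal to the motion regressors, so any component of the signal that covaries with head motion is removed along with the artifact; this is the usual trade-off of simple nuisance regression.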
Neuroimaging studies suggest that the amygdala integrates emotional expression and gaze direction, but the findings are inconsistent. We hypothesized that dynamic facial expressions, which are more salient stimuli than static facial expressions, would reveal the integration of emotional expression and gaze direction in amygdala activity. To test…
Individuals can experience negative emotions (e.g., embarrassment) accompanying self-evaluation immediately after recognizing their own facial image, especially if it deviates strongly from their mental representation of ideals or standards. The aim of this study was to identify the cortical regions involved in self-recognition and self-evaluation along…
To investigate the hypothesis that early visual processing of stimuli might be boosted by signals of emotionality, we analyzed event-related potentials (ERPs) of twelve right-handed normal subjects. Gray-scale still images of faces with emotional (fearful and happy) or neutral expressions were presented randomly while the subjects performed gender…
Sign language activates the auditory cortex of deaf subjects, which is evidence of cross-modal plasticity. Lip-reading (visual phonetics), which involves audio-visual integration, activates the auditory cortex of hearing subjects. To test whether audio-visual cross-modal plasticity occurs within areas involved in cross-modal integration, we used functional…
Neuroimaging studies have reported greater activation of the human amygdala in response to emotional facial expressions, especially for fear. However, little is known about how fast this activation occurs. We investigated this issue by recording the intracranial field potentials of the amygdala in subjects undergoing pre-neurosurgical assessment (n=6). The…
To investigate connectivity between the primary somatosensory area (S1) and striate cortex (V1) in the blind, we used dynamic causal modeling of functional MRI acquired while 15 blind (9 early-onset and 6 late-onset) and 24 sighted subjects performed a tactile Braille discrimination task with their right hand. Five regions of interest were selected from either…