We investigated changes of regional activation in the frontal cortices, as assessed by changes in hemoglobin oxygenation, during walking at 3 and 5 km/h and running at 9 km/h on a treadmill, using a near-infrared spectroscopic (NIRS) imaging technique. During the acceleration periods immediately preceding the steady walking or running speed, the levels …
Neuroimaging studies have shown activity in the amygdala in response to facial expressions of emotion, but the specific role of the amygdala remains unknown. We hypothesized that the amygdala is involved in emotional but not basic sensory processing for facial expressions. To test this hypothesis, we manipulated the face directions of emotional expressions …
It has been proposed that motor imagery contains an element of sensory experience (kinesthetic sensation), which substitutes for the sensory feedback that would normally arise from the overt action. No evidence has been provided as to whether kinesthetic sensation is centrally simulated during motor imagery. We psychophysically tested whether motor …
Neuroimaging studies suggest that the amygdala integrates emotional expression and gaze direction, but the findings are inconsistent. We hypothesized that dynamic facial expressions, which are more salient stimuli than static facial expressions, would reveal the integration of emotional expression and gaze direction in amygdala activity. To test …
BACKGROUND The eye gaze of other individuals conveys important social information and can trigger multiple psychological activities, some of which, such as emotional reactions and attention orienting, occur very rapidly. Although some neuroscientific evidence has suggested that the amygdala may be involved in such rapid gaze processing, no evidence has been …
Individuals can experience negative emotions (e.g., embarrassment) accompanying self-evaluation immediately after recognizing their own facial image, especially if it deviates strongly from their mental representation of ideals or standards. The aim of this study was to identify the cortical regions involved in self-recognition and self-evaluation along …
Dynamic facial expressions of emotion constitute natural and powerful media of communication between individuals. However, little is known about the neural substrate underlying the processing of dynamic facial expressions of emotion. We identified the brain areas involved using fMRI in 22 right-handed healthy subjects. The facial expressions are dynamically …
We evaluated the neural substrates of cross-modal binding and divided attention during audio-visual speech integration using functional magnetic resonance imaging. The subjects (n = 17) were exposed to phonemically concordant or discordant auditory and visual speech stimuli. Three different matching tasks were performed: auditory-auditory (AA), …
Sign language activates the auditory cortex of deaf subjects, which is evidence of cross-modal plasticity. Lip-reading (visual phonetics), which involves audio-visual integration, activates the auditory cortex of hearing subjects. To test whether audio-visual cross-modal plasticity occurs within areas involved in cross-modal integration, we used functional …
Humans can judge grating orientation by touch. Previous studies indicate that the extrastriate cortex is involved in tactile orientation judgments, suggesting that this area is related to visual imagery. However, it has remained unclear which neural mechanisms are crucial for the tactile processing of orientation, because visual imagery is not always required …