Takanori Kochiyama

Dynamic facial expressions of emotion constitute natural and powerful media of communication between individuals. However, little is known about the neural substrate underlying the processing of dynamic facial expressions of emotion. We identified the brain areas involved using fMRI with 22 right-handed healthy subjects. The facial expressions are dynamically …
We investigated changes of regional activation in the frontal cortices, as assessed by changes of hemoglobin oxygenation, during walking at 3 and 5 km/h and running at 9 km/h on a treadmill, using a near-infrared spectroscopic (NIRS) imaging technique. During the acceleration periods immediately preceding the attainment of steady walking or running speed, the levels …
To investigate the hypothesis that early visual processing of stimuli might be boosted by signals of emotionality, we analyzed event-related potentials (ERPs) of twelve right-handed normal subjects. Gray-scale still images of faces with emotional (fearful and happy) or neutral expressions were presented randomly while the subjects performed gender …
Individuals can experience negative emotions (e.g., embarrassment) accompanying self-evaluation immediately after recognizing their own facial image, especially if it deviates strongly from their mental representation of ideals or standards. The aim of this study was to identify the cortical regions involved in self-recognition and self-evaluation along …
It has been proposed that motor imagery contains an element of sensory experience (kinesthetic sensations), which substitutes for the sensory feedback that would normally arise from the overt action. No evidence has been provided as to whether kinesthetic sensation is centrally simulated during motor imagery. We psychophysically tested whether motor …
Neuroimaging studies have shown activity in the amygdala in response to facial expressions of emotion, but the specific role of the amygdala remains unknown. We hypothesized that the amygdala is involved in emotional but not basic sensory processing of facial expressions. To test this hypothesis, we manipulated the face directions of emotional expressions …
Humans can judge grating orientation by touch. Previous studies indicate that the extrastriate cortex is involved in tactile orientation judgments, suggesting that this area is related to visual imagery. However, it has been unclear which neural mechanisms are crucial for the tactile processing of orientation, because visual imagery is not always required …
Sign language activates the auditory cortex of deaf subjects, which is evidence of cross-modal plasticity. Lip-reading (visual phonetics), which involves audio-visual integration, activates the auditory cortex of hearing subjects. To test whether audio-visual cross-modal plasticity occurs within areas involved in cross-modal integration, we used functional …
Task-related motion is a major source of noise in functional magnetic resonance imaging (fMRI) time series. The motion effect usually persists even after perfect spatial realignment is achieved. Here, we propose a new method to remove a certain type of task-related motion effect that persists after realignment. The procedure consists of the following: the …
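The abstract above is truncated, so the specific procedure it proposes is not recoverable here. As a generic illustration of the underlying idea, residual motion-related variance is commonly reduced by regressing the realignment parameters (and, optionally, their temporal derivatives) out of each voxel time series; the sketch below shows that standard nuisance-regression step with NumPy, not the authors' specific method. All function and variable names are hypothetical.

```python
import numpy as np

def regress_out_motion(ts, motion, add_derivatives=True):
    """Remove motion-related variance from one voxel time series by
    ordinary least squares regression on realignment parameters.

    ts     : (T,) voxel time series
    motion : (T, 6) realignment parameters (3 translations, 3 rotations)
    """
    X = motion - motion.mean(axis=0)               # demean nuisance regressors
    if add_derivatives:
        # temporal derivatives help capture spin-history-like effects
        d = np.vstack([np.zeros((1, X.shape[1])), np.diff(X, axis=0)])
        X = np.hstack([X, d])
    X = np.hstack([X, np.ones((X.shape[0], 1))])   # intercept column
    beta, *_ = np.linalg.lstsq(X, ts, rcond=None)  # OLS fit of nuisance model
    resid = ts - X @ beta                          # motion-cleaned residuals
    return resid + ts.mean()                       # restore mean signal level
```

After the regression, the cleaned series is orthogonal to the motion regressors, so any signal component linearly explained by head motion has been removed; in practice this is applied voxel-wise across the whole volume.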
The metabolic change that occurs during early development of the human brain was studied with functional magnetic resonance imaging (fMRI), in which the signal change reflects the balance between the supply and the demand of oxygen during stimulus-related neuronal activation. The subjects were 16 infants, aged < 1 year. They were sedated with pentobarbital, …