Touch activates human auditory cortex

Martin Schürmann, Gina Caetano, Yevhen Hlushchuk, Veikko Jousmäki and Riitta Hari
Integration of sight, hearing and touch in human cerebral cortex
This thesis describes studies of the interactions between somatosensation, vision and audition using functional Magnetic Resonance Imaging of normal human subjects as the primary method, and investigates a possible homologue to a macaque multisensory area that integrates visual, auditory and tactile information.
BOLD Responses to Tactile Stimuli in Visual and Auditory Cortex Depend on the Frequency Content of Stimulation
Although some brain areas preferentially process information from a particular sensory modality, these areas can also respond to other modalities. Here we used fMRI to show that such responsiveness
Lateralized enhancement of auditory cortex activity and increased sensitivity to self-generated sounds.
Functional magnetic resonance imaging is used to demonstrate bilateral enhancement in the auditory cortex to self-generated versus externally generated sounds, and it is found that this enhancement is stronger when the sound-producing hand is contralateral to the auditory cortex.
Auditory Frequency Representations in Human Somatosensory Cortex
The results demonstrate that auditory frequency representations can be distributed over brain regions traditionally considered to be dedicated to somatosensation, and reveal a number of candidate brain areas that could support general temporal frequency processing and mediate the extensive and robust perceptual interactions between audition and touch.
Sound enhances touch perception
The results indicate that auditory information influences touch perception in highly systematic ways and suggest that similar coding mechanisms may underlie the processing of information from these different sensory modalities.
Enhanced Auditory Evoked Activity to Self-Generated Sounds Is Mediated by Primary and Supplementary Motor Cortices
Accumulating evidence demonstrates that responses in auditory cortex to auditory consequences of self-generated actions are modified relative to the responses evoked by identical sounds generated by
Temporal factors affecting somatosensory–auditory interactions in speech processing
A dynamic modulation of somatosensory–auditory convergence is demonstrated, suggesting that the contribution of somatosensory information to speech processing depends on the specific temporal order of sensory inputs in speech production.


Magnetoencephalographic Correlates of Audiotactile Interaction
The relatively high interindividual variability of the observed interaction could indicate that subjects differ in how they perceive the simultaneous presentation of auditory and tactile stimuli; the interaction itself represents a potential neural substrate for multisensory integration.
Somatosensory input to auditory association cortex in the macaque monkey.
The convergence of somatosensory and auditory inputs within subregions of macaque auditory cortex is investigated, suggesting a potential neural substrate for multisensory integration at an early stage of auditory cortical processing.
Nonauditory Events of a Behavioral Procedure Activate Auditory Cortex of Highly Trained Monkeys
It is speculated that the multimodal corepresentation in the auditory cortex has arisen from the intensive practice of the subjects with the behavioral procedure and that it facilitates the performance of audiomotor tasks in proficient subjects.
Hands help hearing: facilitatory audiotactile interaction at low sound-intensity levels.
The subjects chose on average 12% lower intensities for the probe tone when they touched the tube, suggesting facilitatory interaction between auditory and tactile senses in normal-hearing subjects.
Auditory-somatosensory multisensory processing in auditory association cortex: an fMRI study.
It is demonstrated that auditory and somatosensory inputs converge in a subregion of human auditory cortex along the superior temporal gyrus, and the multisensory region identified in the present investigation may be the human homologue of CM.
The role of tactile aids in providing information about acoustic stimuli.
A framework is outlined for describing normal listening situations as a hierarchy of tasks requiring increasingly complex analysis of the acoustic waveform, including sound detection, environmental sound identification, syllable rhythm and stress categorization, phoneme and word identification, and comprehension of connected speech.