Left Auditory Cortex and Amygdala, but Right Insula Dominance for Human Laughing and Crying

@article{Sander2005LeftAC,
  title={Left Auditory Cortex and Amygdala, but Right Insula Dominance for Human Laughing and Crying},
  author={K. Sander and H. Scheich},
  journal={Journal of Cognitive Neuroscience},
  year={2005},
  volume={17},
  pages={1519--1531}
}
Evidence suggests that in animals their own species-specific communication sounds are processed predominantly in the left hemisphere. In contrast, processing linguistic aspects of human speech involves the left hemisphere, whereas processing some prosodic aspects of speech as well as other not yet well-defined attributes of human voices predominantly involves the right hemisphere. This leaves open the question of hemispheric processing of universal (species-specific) human vocalizations that…
Hearing emotional sounds: category representation in the human amygdala
Results showed that the amygdala responded more to human vocalizations than to animal vocalizations and sounds of inanimate objects in both negative and neutral valences, and more to animal sounds than to object sounds in the neutral condition.
An Auditory Region in the Primate Insular Cortex Responding Preferentially to Vocal Communication Sounds
These findings characterize the caudal insula as a selectively responding auditory region, possibly part of a processing stream involved in the representation of communication sounds, and uncover a basis for a supposed role of the insula in processing vocal communication sounds such as speech.
FMRI activations of amygdala, cingulate cortex, and auditory cortex by infant laughing and crying
The gender‐dependent correlates of neural activity in amygdala and ACC may reflect neural predispositions in women for responses to preverbal infant vocalizations, whereas the gender‐independent similarity of activation patterns in PCC and AC may reflect more sensory‐based and cognitive levels of neural processing.
How the brain laughs: Comparative evidence from behavioral, electrophysiological and neuroimaging studies in human and monkey
The evidence indicates that a densely intertwined network of auditory and (pre-)motor functions subserves perceptive and expressive aspects of human laughter, and suggests the existence of equivalent brain representations of emotional tone in humans and great apes.
Early-latency categorical speech sound representations in the left inferior frontal gyrus
These novel findings suggest that when humans attend to speech, the left POp mediates phonetic categorization through integration of auditory and motor information via the dorsal auditory stream.
fMRI evidence for the effect of verbal complexity on lateralisation of the neural response associated with decoding prosodic emotion
The results indicate that the likelihood of observing a notable left temporal lobe response in functional neuroimaging studies of emotional prosody comprehension depends on the verbal complexity of the prosodic emotion stimuli.
Semiotic aspects of human nonverbal vocalizations: a functional imaging study
Using a decision task, hemodynamic responses to affective bursts, vegetative sounds, and vocal gestures were measured by means of functional magnetic resonance imaging; the results suggested a left-hemisphere temporo-parietal ‘auditory-to-meaning interface’ related to the mechanisms of speech processing.
Revealing the functions of supra-temporal and insular auditory responsive areas in humans
These findings suggest that even though the InsP has basic auditory response properties similar to those of the primary and non-primary auditory cortex, it may not directly participate in the formation of auditory perception.
Common and differential brain responses in men and women to nonverbal emotional vocalizations by the same and opposite sex
Nonverbal emotional vocalizations are one of the most elementary ways of communicating in humans, and brain responses to crying of the opposite sex seem to reflect men's efforts at emotional regulation and women's empathic concerns.
Functional But Not Structural Networks of the Human Laryngeal Motor Cortex Show Left Hemispheric Lateralization during Syllable But Not Breathing Production
Bilateral organization of functional LMC networks during controlled breathing supports its indispensable role in all types of laryngeal behaviors, and significant left-hemispheric lateralization of functional networks during simple but highly learned voice production suggests the readiness of the LMC network for production of a complex voluntary behavior, such as human speech.

References

Showing 1–10 of 62 references
Audition of laughing and crying leads to right amygdala activation in a low-noise fMRI setting.
The results show that amygdala activation by emotionally meaningful sounds like laughing and crying is independent of emotional involvement, suggesting that the pattern-recognition aspect of these sounds is crucial for this activation.
Auditory perception of laughing and crying activates human amygdala regardless of attentional state.
The amygdala results seem to accord with the right-hemisphere hypothesis of emotion processing, which may not apply as strongly at the level of the auditory cortex or insula.
Human temporal lobe activation by speech and nonspeech sounds.
Recording of blood oxygenation signals from the temporal lobes of normal volunteers using functional magnetic resonance imaging indicates functional subdivision of the human lateral temporal cortex and provides a preliminary framework for understanding the cortical processing of speech sounds.
Defining a left-lateralized response specific to intelligible speech using fMRI.
The results demonstrate that there are neural responses to intelligible speech along the length of the left lateral temporal neocortex, although the precise processing roles of the anterior and posterior regions cannot be determined from this study.
Left hemisphere dominance for processing vocalizations in adult, but not infant, rhesus monkeys: field experiments.
  • M. Hauser, K. Andersson
  • Biology, Psychology
  • Proceedings of the National Academy of Sciences of the United States of America
  • 1994
Like humans, adult rhesus monkeys also show left hemisphere dominance for processing species-specific vocalizations; however, the emergence of such asymmetry may depend on both differential maturation of the two hemispheres and experience with the species-typical vocal repertoire.
Temporal lobe regions engaged during normal speech comprehension.
Implicit comprehension of simple narrative speech was contrasted with listening to reversed versions of the narratives, demonstrating that normal comprehension, free of task demands that do not form part of everyday discourse, engages regions distributed between the two temporal lobes, more widely on the left.
Species-specific calls evoke asymmetric activity in the monkey's temporal poles
To investigate the pattern of neural activity that might underlie this particular form of functional asymmetry in monkeys, local cerebral metabolic activity was measured while the animals listened passively to species-specific calls compared with a variety of other classes of sound.
Left hemisphere advantage in the mouse brain for recognizing ultrasonic communication calls
In the house mouse, which has a much less elaborate forebrain than man or macaque monkey, the ultrasonic calls emitted by young mice to evoke maternal caring behaviour are preferentially recognized by the left hemisphere, suggesting that lateralization of this function evolved early in mammals.
Adaptation to speaker's voice in right anterior temporal lobe
Only one cortical region, located in the anterior part of the right superior temporal sulcus (STS), responded differently to the two conditions: activation relative to the silent baseline was significantly reduced when syllables were spoken by a single voice rather than by different voices.
Dynamic Brain Activation during Processing of Emotional Intonation: Influence of Acoustic Parameters, Emotional Valence, and Sex
Functional magnetic resonance imaging was used to disentangle brain activation associated with extraction of specific acoustic cues and detection of specific emotional states, and might prove helpful in reconciling the controversial previous clinical and experimental data.