Shared neural mechanisms for processing emotions in music and vocalizations

  • Alice Mado Proverbio, Francesco de Benedetto, Martina Guazzone
  • European Journal of Neuroscience
  • pp. 1987–2007
The neural mechanisms involved in processing vocalizations and music were compared in order to identify possible similarities in the encoding of their emotional content. The stimuli were positive and negative emotional vocalizations (e.g. laughing, crying) and violin pieces digitally extracted from them, sharing the same melodic profile and main pitch/frequency characteristics. Participants listened to the vocalizations or music while detecting rare auditory targets (bird… 

Common neural bases for processing speech prosody and music: An integrated model

An integrated model is proposed depicting a possible common circuit for processing the emotional content of music, vocalizations and speech, which might explain some universal and relatively innate brain reaction to music.

Multimodal recognition of emotions in music and language

We investigated through electrophysiological recordings how music-induced emotions are recognized and combined with the emotional content of written sentences. Twenty-four sad, joyful, and

ERP Markers of Valence Coding in Emotional Speech Processing

Enhanced salience of musical sounds in singers and instrumentalists.

Music training has been linked to facilitated processing of emotional sounds. However, most studies have focused on speech, and less is known about musicians' brain responses to other emotional

Multimodal Recognition of Emotions in Music and Facial Expressions

The study investigated the neural processing of congruent vs. incongruent affective audiovisual information by means of ERPs, and showed the crucial role of Inferior and Superior Temporal Gyri in the multimodal representation of emotional information extracted from faces and music.

The neural basis of authenticity recognition in laughter and crying

Overall, this work provides the first electroencephalographic examination of authenticity discrimination and proposes that authenticity processing of others’ vocalisations is initiated early, alongside that of their emotional content or category, attesting to its evolutionary relevance for trust and bond formation.

Using machine learning analysis to interpret the relationship between music emotion and lyric features

The performance of random forest regressions reveals that, for models of perceived valence, adding lyric features significantly improves prediction over models using audio features alone. In contrast to the arousal recognition model, where lyric features contributed little, several lyric features played important roles in the valence recognition model.
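The comparison described above (audio-only versus audio-plus-lyrics random forest regression of valence, followed by feature-importance inspection) can be sketched as follows. This is a minimal illustration on synthetic data, not the paper's pipeline; the feature names and data are hypothetical, and the lyric features are deliberately constructed to carry valence signal.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_songs = 200

# Hypothetical feature matrices: 8 audio descriptors (e.g. tempo, timbre, mode)
# and 4 lyric descriptors (e.g. sentiment scores, emotion word counts).
audio_feats = rng.normal(size=(n_songs, 8))
lyric_feats = rng.normal(size=(n_songs, 4))

# Synthetic valence that depends on both an audio and a lyric feature.
valence = audio_feats[:, 0] + lyric_feats[:, 0] + rng.normal(scale=0.3, size=n_songs)

model = RandomForestRegressor(n_estimators=200, random_state=0)

# Cross-validated R^2 with audio features only vs. audio + lyric features.
r2_audio = cross_val_score(model, audio_feats, valence, cv=5, scoring="r2").mean()
combined = np.hstack([audio_feats, lyric_feats])
r2_both = cross_val_score(model, combined, valence, cv=5, scoring="r2").mean()
print(f"audio only R^2: {r2_audio:.2f}, audio+lyrics R^2: {r2_both:.2f}")

# Feature importances indicate which lyric features matter for valence.
model.fit(combined, valence)
print("lyric feature importances:", model.feature_importances_[8:])
```

On data constructed this way, the combined model outperforms the audio-only model, mirroring the reported result for valence.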

Spatial Connectivity and Temporal Dynamic Functional Network Connectivity of Musical Emotions Evoked by Dynamically Changing Tempo

A paired t-test showed that music with a decreasing tempo evokes stronger activation of independent components within the default mode network (DMN) and sensorimotor network (SMN) than music with an increasing tempo, suggesting that initially faster music continues to engage listeners’ emotions through multifunctional brain activity even as the tempo slows.

Research on Multimodal Music Emotion Recognition Method Based on Image Sequence

  • Zhao Yu
  • Computer Science, Sci. Program.
  • 2021
A multimodal music emotion recognition method based on image sequences is studied; the classification results agree well with ground truth, and recognition accuracy is high.




Emotional expressions in voice and music: Same code, same effect?

Results indicate that similar mechanisms support emotional inferences from vocalizations and music and that these mechanisms tap on a general system involved in social cognition.

Brain processing of consonance/dissonance in musicians and controls: a hemispheric asymmetry revisited

Overall, the data show a finer and more tuned neural representation of pitch intervals in musicians, linked to a marked specialization of their left temporal cortex (BA21/38).

Neural Correlates of Consonance, Dissonance, and the Hierarchy of Musical Pitch in the Human Brainstem

The results suggest that brainstem neural mechanisms mediating pitch processing show preferential encoding of consonant musical relationships and, furthermore, preserve the hierarchical pitch relationships found in music, even for individuals without formal musical training.

Music and emotions: from enchantment to entrainment

The similarities and differences in the neural substrates underlying these “complex” music‐evoked emotions relative to other more “basic” emotional experiences are reviewed, suggesting that these emotions emerge through a combination of activation in emotional and motivational brain systems that confers valence on the music.

The voice of emotion: an FMRI study of neural responses to angry and happy vocal expressions.

The results identify a network of regions implicated in the processing of vocal emotion, and suggest a particularly salient role for vocal expressions of happiness.

Musical chords and emotion: Major and minor triads are processed for emotion

The early processing stages involved suggest that major and minor chords carry deeply rooted emotional meanings rather than superficially attributed ones, with minor triads possessing negative emotional connotations and major triads positive ones.

Cross‐classification of musical and vocal emotions in the auditory cortex

It is found that local fMRI patterns in the bilateral auditory cortex and upper premotor regions support above‐chance emotion classification even when training and testing sets are drawn from different timbre categories, providing evidence for a shared neural code for processing musical and vocal emotions.
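The cross-classification logic behind this finding (train a decoder on one timbre category, test it on the other; above-chance accuracy implies a timbre-invariant emotion code) can be sketched with a linear classifier on synthetic voxel patterns. Everything here is illustrative: the patterns are simulated, not real fMRI data, and the shared emotion axis is an assumption built into the simulation.

```python
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(1)
n_trials, n_voxels = 100, 50

emotion = rng.integers(0, 2, size=n_trials)   # 0 = sad trial, 1 = happy trial
shared_axis = rng.normal(size=n_voxels)       # emotion code common to both timbres

def simulate(timbre_offset):
    """Voxel patterns: shared emotion signal + timbre-specific shift + noise."""
    base = np.outer(emotion * 2 - 1, shared_axis)
    return base + timbre_offset + rng.normal(scale=2.0, size=(n_trials, n_voxels))

music_patterns = simulate(0.5)    # e.g. violin trials
voice_patterns = simulate(-0.5)   # e.g. vocalization trials

clf = LinearSVC(max_iter=10000)
clf.fit(music_patterns, emotion)                       # train on music trials only
acc = (clf.predict(voice_patterns) == emotion).mean()  # test on vocal trials
print(f"cross-timbre decoding accuracy: {acc:.2f}")
```

Because the simulated emotion signal lies on an axis shared across timbres, the classifier generalizes from music to voice well above the 0.5 chance level, which is the signature the cross-classification analysis tests for.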

The sound and the fury: Late positive potential is sensitive to sound affect.

It is suggested that neural responses to affective visual stimuli are enhanced by concurrent affective sounds, paving the way toward an understanding of the construction of multimodal affective experience.

Mapping Aesthetic Musical Emotions in the Brain

Functional neuroimaging with parametric analyses based on the intensity of felt emotions reveals a differentiated recruitment across emotions of networks involved in reward, memory, self-reflective, and sensorimotor processes, which may account for the unique richness of musical emotions.