Music Mood Representations from Social Tags

Abstract

This paper presents findings about mood representations. We aim to analyze how people tag music by mood, to create representations based on these data, and to study the agreement between experts and a large community. For this purpose, we create a semantic mood space from last.fm tags using Latent Semantic Analysis. With an unsupervised clustering approach, we derive from this space an ideal categorical representation. We compare our community-based semantic space with expert representations from Hevner and the clusters from the MIREX Audio Mood Classification task. Using dimensionality reduction with a Self-Organizing Map, we obtain a 2D representation that we compare with Russell's dimensional model. We also present a tree diagram of the mood tags obtained with a hierarchical clustering approach. All these results show consistency between the community and the experts, as well as some limitations of current expert models. This study demonstrates the particular relevance of the basic emotions model, with four mood clusters that can be summarized as happy, sad, angry, and tender. This outcome can help to create better ground truth and to design more realistic mood classification algorithms. Furthermore, this method can be applied to other types of representations to build better computational models.
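The pipeline described in the abstract (Latent Semantic Analysis on a tag space, followed by unsupervised clustering into four mood groups) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the tag-by-track matrix here is synthetic, whereas the paper uses real last.fm tag data, and the choice of truncated SVD plus k-means is one common way to realize LSA and clustering.

```python
import numpy as np
from sklearn.decomposition import TruncatedSVD
from sklearn.cluster import KMeans

# Hypothetical tag-by-track count matrix: entry (i, j) is how often tag i
# was applied to track j. Invented for illustration; the paper builds this
# from last.fm social tags.
tags = ["happy", "joyful", "sad", "melancholy",
        "angry", "aggressive", "tender", "calm"]
rng = np.random.default_rng(0)
X = np.zeros((8, 40))
# Make tags from the same mood family co-occur on the same tracks.
for k in range(4):
    X[2 * k:2 * k + 2, 10 * k:10 * k + 10] = rng.poisson(5, (2, 10))

# LSA: project tags into a low-rank semantic space via truncated SVD.
lsa = TruncatedSVD(n_components=4, random_state=0)
tag_vectors = lsa.fit_transform(X)

# Unsupervised clustering of the semantic space; the paper's finding
# suggests four basic-emotion clusters (happy, sad, angry, tender).
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(tag_vectors)
for tag, lab in zip(tags, labels):
    print(tag, lab)
```

With co-occurring tag pairs, the clustering recovers one cluster per mood family; on real tag data the cluster count and contents are an empirical question, which is what the paper investigates.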

References

Music and Emotion: Theory and Research

  • P N Juslin, J A Sloboda
  • 2001

A circumplex model of affect

  • J A Russell
  • 1980

Automatic Detection of Emotion in Music: Interaction with Emotionally Sensitive Machines

  • C Laurier, P Herrera
  • 2009

Clustering

  • R Xu, D Wunsch
  • 2009