Corpus ID: 141602138

Automatic Classification of musical mood by content-based analysis

Author: Cyril Laurier
Music in digital format is part of our lives. Automating the organization of these data is a great challenge. In this thesis, we focus on the automatic classification of music based on detecting the emotion it communicates. To achieve this, we propose models using information extracted from the audio signal by means of signal processing, machine learning, and information retrieval techniques. First, we study how the members of a social network… 
From music similarity to music recommendation : computational approaches based on audio features and metadata
The amount of music available in digital format is growing at a tremendous pace. Vast quantities of music are available to listeners, and dealing with them requires organization and filtering
Algorithms and representations for supporting online music creation with large-scale audio databases
The rapid adoption of the Internet and of web technologies has created an opportunity for collaborative music making through the online exchange of information. However, the applications
Labeling data and developing supervised framework for hindi music mood analysis
The present article addresses the issue of Hindi music mood classification by considering important issues like taxonomy development, annotation and automated mood classification, and reveals that features like timbre, rhythm, and intensity are associated with enhanced classification accuracy.
Emotion-based Analysis and Classification of Audio Music
Doctoral thesis in Information Sciences and Technologies, presented to the Departamento de Engenharia Informatica of the Faculdade de Ciencias e Tecnologia, Universidade de Coimbra
The role of emotion and context in musical preference
This thesis compiles a Western popular music emotion dataset based on online social tags, presents a music emotion classification system using audio features corresponding to four different musical dimensions, and investigates emotional uses of music in different music-listening situational contexts.
A robust music genre classification approach for global and regional music datasets evaluation
A set of features for classifying music genres, obtained by a methodical selection of important features from the Music Information Retrieval and Music Emotion Recognition literature, is proposed, together with a new music dataset called BMD (Brazilian Music Dataset).
Hybrid Approach of Structural Lyric and Audio Segments for Detecting Song Emotion
This research proposes a method for detecting song emotion that integrates song lyrics and audio, using a sum-of-matrices and majority-voting scheme to combine audio and lyric features in emotion detection.
Musical Texture and Expressivity Features for Music Emotion Recognition
A set of novel emotionally relevant audio features is presented to help improve the classification of emotions in audio music, together with a set of new algorithms that capture information related to musical texture and expressive techniques, the two concepts most lacking in existing feature sets.
Active Learning for User-Tailored Refined Music Mood Detection
This thesis, which is built on top of the work by Cyril Laurier and Perfecto Herrera in the Music Technology Group, deals with the need to expand current mood tags to more specific and complex emotions, and explores the use of active learning techniques.
Fusion of musical contents, brain activity and short term physiological signals for music- emotion recognition
Results show that certain features from skin conductance and heart rate variability were found effective in the emotion classification task, thus confirming the role of the activation of the autonomic nervous system in emotion recognition.
Audio content processing for automatic music genre classification : descriptors, databases, and classifiers
This thesis studies the automatic classification of musical genres based on content analysis of the audio signal, setting out its problems and proposing solutions. A study is proposed
Situated, perceptual, emotive and cognitive music systems: a psychologically grounded approach to interactive music composition
This thesis introduces a new situated, interactive composition system called SMuSe (for Situated Music Server). The system is based on principles drawn from cognitive science
A mood-based music classification and exploration system
Mood classification of music is an emerging domain of music information retrieval. In the approach presented here features extracted from an audio file are used in combination with the affective
Automatic Mood Classification Using TF*IDF Based on Lyrics
The results show that word-oriented metrics provide a valuable source of information for automatic mood classification of music based on lyrics only, and that there is no large difference in mood prediction across the mood divisions.
Music Mood Representations from Social Tags
This study demonstrates a particular relevance of the basic emotions model with four mood clusters that can be summarized as: happy, sad, angry, and tender.
Improving mood classification in music digital libraries by combining lyrics and audio
The results show that combining lyrics and audio significantly outperformed systems using audio-only features, and that the hybrid lyric + audio system needed fewer training samples to achieve the same or better classification accuracies than systems using lyrics or audio alone.
Music Mood and Theme Classification - a Hybrid Approach
This paper develops algorithms for classifying songs by mood and theme, extending existing approaches to also consider the songs’ thematic dimensions and to use social data from the Last.fm music portal in support of the classification tasks.
Content-based mood classification for photos and music: a generic multi-modal classification framework and evaluation approach
This paper presents a generic multi-modal mood classification framework that uses various audio-visual features and multiple classifiers, and introduces a novel music and photo mood classification reference set for evaluation.
Indexing music by mood: design and integration of an automatic content-based annotator
A user evaluation in the context of the PHAROS search engine is reported, asking people about the utility, interest, and innovation of this technology in real-world use cases, and demonstrating the usability of this tool for annotating large-scale databases.
Exploring Mood Metadata: Relationships with Genre, Artist and Usage Metadata
The relationships of mood with genre, artist, and usage metadata are explored, and a cluster-based approach is recommended that overcomes specific term-related problems by creating a relatively small set of data-derived “mood spaces” that could form the ground truth for a proposed MIREX “Automated Mood Classification” task.