Gathering a Dataset of Multi-Modal Mood-Dependent Perceptual Responses to Music

Abstract

This paper presents a new dataset that captures the effect of mood on the visual and auditory perception of music. Through an online survey, we collected over 6,600 responses recording users' mood, the emotions evoked and expressed by music, and the perception of color in relation to emotions and music. We describe the methodology used to gather the responses and present two new models for capturing users' emotional states: the MoodGraph and the MoodStripe. We also discuss general research questions and goals, as well as possible future applications of the collected dataset.
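To make the shape of the collected data concrete, a single survey response combining the dimensions named above (mood, evoked and expressed emotions, and an associated color) might be modeled as follows. This is a minimal sketch; the field names and types are illustrative assumptions, not the dataset's actual schema.

```python
from dataclasses import dataclass

@dataclass
class SurveyResponse:
    # Hypothetical record structure; names are assumptions for illustration.
    mood: str                      # participant's self-reported current mood
    induced_emotions: list[str]    # emotions the music evoked in the listener
    expressed_emotions: list[str]  # emotions the listener felt the music expressed
    color: str                     # color the participant associated with the music

# Example instance of one (fabricated) response:
response = SurveyResponse(
    mood="calm",
    induced_emotions=["joy", "relaxation"],
    expressed_emotions=["happiness"],
    color="#ffd700",
)
print(response.mood)  # → calm
```

A flat record like this makes it straightforward to aggregate responses per mood or per color when analyzing the dataset.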

Cite this paper

@inproceedings{Pesek2014GatheringAD,
  title     = {Gathering a Dataset of Multi-Modal Mood-Dependent Perceptual Responses to Music},
  author    = {Matevz Pesek and Primoz Godec and Mojca Poredos and Gregor Strle and Joze Guna and Emilija Stojmenova Duh and Matevz Pogacnik and Matija Marolt},
  booktitle = {UMAP Workshops},
  year      = {2014}
}