Introducing a Dataset of Emotional and Color Responses to Music
This paper presents a new dataset capturing the effect of mood on the visual and auditory perception of music. Through an online survey, we collected over 6600 responses describing participants' moods, the emotions evoked and expressed by music, and the perception of color in relation to emotions and music. We describe the methodology used to gather the responses and present two new models for capturing users' emotional states: the MoodGraph and the MoodStripe. We also discuss general research questions and goals, as well as possible future applications of the collected dataset.