Toward Studying Music Cognition with Information Retrieval Techniques: Lessons Learned from the OpenMIIR Initiative

@article{Stober2017TowardSM,
  title={Toward Studying Music Cognition with Information Retrieval Techniques: Lessons Learned from the OpenMIIR Initiative},
  author={Sebastian Stober},
  journal={Frontiers in Psychology},
  year={2017},
  volume={8}
}
  • S. Stober
  • Published 3 August 2017
  • Psychology
  • Frontiers in Psychology
As an emerging sub-field of music information retrieval (MIR), music imagery information retrieval (MIIR) aims to retrieve information from brain activity recorded during music cognition, such as listening to or imagining music pieces. This is a highly interdisciplinary endeavor that requires expertise in MIR as well as cognitive neuroscience and psychology. The OpenMIIR initiative strives to foster collaborations between these fields to advance the state of the art in MIIR. As a first step… 

A Statistical Inference Framework for Understanding Music-Related Brain Activity

TLDR
The electroencephalographic modality was chosen to capture brain activity because it is more widely available: off-the-shelf devices that record such responses are already affordable, unlike more expensive brain imaging techniques.

Unsupervised Spectral Clustering of Music-Related Brain Activity

  • S. Ntalampiras
  • Computer Science
    2019 15th International Conference on Signal-Image Technology & Internet-Based Systems (SITIS)
  • 2019
TLDR
A suitably designed unsupervised spectral clustering scheme highlights the connection between the responses and the audio structure of the music classes corresponding to the three tasks, and shows that there is a strong connection with respect to the stimulus identification and meter classification tasks.
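For readers looking for a concrete starting point, the following is a minimal, illustrative sketch of unsupervised spectral clustering applied to per-trial EEG feature vectors. The feature representation, array shapes, and the choice of three clusters are assumptions made for the example; this is not the cited paper's actual pipeline.

```python
# Illustrative sketch: spectral clustering of EEG trial features.
# All shapes and the synthetic data are assumptions for demonstration.
import numpy as np
from sklearn.cluster import SpectralClustering

rng = np.random.default_rng(0)

# Assumed input: one feature vector per EEG trial, e.g. log band power
# per channel and frequency band (here 60 trials x 64 features).
trial_features = rng.normal(size=(60, 64))

# Spectral clustering on an RBF affinity between trials; the number of
# clusters would correspond to the number of stimulus classes under study.
clustering = SpectralClustering(
    n_clusters=3,          # e.g. three stimulus / task classes (assumed)
    affinity="rbf",
    assign_labels="kmeans",
    random_state=0,
)
labels = clustering.fit_predict(trial_features)
print(labels)  # cluster index per trial, to be compared with stimulus labels
```

The resulting cluster labels would then be compared against the known stimulus or task labels to assess how well the unsupervised structure matches the experimental conditions.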

Extraction and classification of tempo stimuli from electroencephalography recordings using convolutional recurrent attention model

TLDR
A novel convolutional recurrent attention model (CRAM) is proposed to extract and classify features corresponding to tempo stimuli from EEG recordings of listeners who attended to the tempo of music pieces; it achieves promising results, outperforming recent methods.

On the encoding of natural music in computational models and human brains

TLDR
This work highlights how system identification techniques and computational models of music have advanced our understanding of how the human brain processes the textures and structures of music, and how the processed information evokes emotions.

EEG-based decoding and recognition of imagined music

TLDR
The results of this study demonstrate the feasibility of decoding imagined music, thereby setting the stage for new neuroscientific experiments in this area as well as for new types of brain-computer interfaces based on music imagination.

Decoding Music in the Human Brain Using EEG Data

TLDR
This work proposes studying the relationship between the EEG data of subjects listening to audio and audio feature vectors modeled after semantic vectors in computational linguistics, providing new insight into how the brain processes and understands music.

An Analytical Framework of Tonal and Rhythmic Hierarchy in Natural Music Using the Multivariate Temporal Response Function

TLDR
An analytical framework for linearized encoding analysis, based on a set of music information retrieval features, is presented to investigate the rapid cortical encoding of tonal and rhythmic hierarchies in natural music; it can uncover associations between ongoing brain activity and relevant musical features.
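As background for the technique named here, the sketch below shows one common way to fit a linearized encoding model (a multivariate temporal response function): ridge regression from time-lagged stimulus features to multichannel EEG. All array shapes, the lag range, and the regularization strength are illustrative assumptions, not the settings of the cited study.

```python
# Illustrative sketch: multivariate temporal response function (mTRF)
# via ridge regression from time-lagged stimulus features to EEG.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_samples, n_features, n_channels = 5000, 4, 32          # assumed sizes
stimulus = rng.normal(size=(n_samples, n_features))      # e.g. MIR features over time
eeg = rng.normal(size=(n_samples, n_channels))           # EEG sampled at the same rate

def lagged_design(x, lags):
    """Stack time-lagged copies of the stimulus features column-wise."""
    cols = []
    for lag in lags:
        shifted = np.roll(x, lag, axis=0)
        shifted[:lag] = 0.0   # zero out samples wrapped around by np.roll
        cols.append(shifted)
    return np.concatenate(cols, axis=1)

lags = range(0, 40)           # assumed lag range in samples
X = lagged_design(stimulus, lags)

# One ridge model predicts all EEG channels; its coefficients, reshaped to
# (n_lags, n_features, n_channels), form the temporal response functions.
model = Ridge(alpha=1.0).fit(X, eeg)
trf = model.coef_.T.reshape(len(lags), n_features, n_channels)
print(trf.shape)
```

In practice the regularization strength and lag window would be chosen by cross-validation, and model quality assessed by the correlation between predicted and held-out EEG.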

Classifying Songs with EEG

TLDR
This research investigates how resonance in the EEG response correlates with individual aesthetic enjoyment, and hypothesizes that the intensity of an aesthetic experience depends on the degree to which a participant's EEG entrains to the perceptual input.

A Cognitive Information Retrieval Using POP Inference Engine Approaches

TLDR
The chapter presents the framework of a cognitive rule-based engine in which preference queries are handled with the user's intention, performance, and optimization in mind.

EVO* 2019 - Late-Breaking Abstracts Volume

This volume contains the Late-Breaking Abstracts submitted to the EVO* 2019 Conference, which took place in Leipzig from 24 to 26 April. These papers were presented as short talks and also at the

References

SHOWING 1-10 OF 63 REFERENCES

Towards Music Imagery Information Retrieval: Introducing the OpenMIIR Dataset of EEG Recordings from Music Perception and Imagination

TLDR
A public domain dataset of electroencephalography recordings taken during music perception and imagination is presented, enabling music information retrieval researchers interested in these new MIIR challenges to easily test and adapt their existing approaches for music analysis, such as fingerprinting, beat tracking, or tempo estimation, on EEG data.

Bringing the Song on Your Mind Back to Your Ears

TLDR
This paper aims to stimulate research in the field by laying out a roadmap for future work that includes music imagery, possibly forming a sub-discipline that could be called music imagery information retrieval (MIIR).

Neuroimaging Methods for Music Information Retrieval: Current Findings and Future Prospects

TLDR
It is shown that certain approaches currently used in a neuroscientific setting align with those used in MIR research, and implications for potential areas of future research are discussed.

Supplementary Material for the paper "Tempo Estimation from the EEG signal during Perception and Imagination of Music"

TLDR
Electroencephalography recordings taken during the perception and imagination of music contain information that can be used to estimate the tempo of a musical piece; future directions involving convolutional neural networks (CNNs) are proposed that could allow these results to be applied in building a brain-computer interface.

A Simple Method to Determine if a Music Information Retrieval System is a “Horse”

  • Bob L. Sturm
  • Computer Science
    IEEE Transactions on Multimedia
  • 2014
We propose and demonstrate a simple method to explain the figure of merit (FoM) of a music information retrieval (MIR) system evaluated in a dataset, specifically, whether the FoM comes from the

Shared processing of perception and imagery of music in decomposed EEG

Brain Beats: Tempo Extraction from EEG Data

TLDR
A case study applying beat tracking techniques to extract the tempo from electroencephalography (EEG) recordings of people listening to music stimuli demonstrates how tempo extraction from EEG signals can be stabilized by applying different fusion approaches to mid-level tempogram features.
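To illustrate the kind of mid-level tempogram processing and channel fusion described here, the following sketch computes a tempogram per EEG channel with librosa, averages (fuses) them, and reads off a dominant tempo. The preprocessing, sampling rate, and fusion by simple averaging are assumptions for the example, not the cited study's method.

```python
# Illustrative sketch: tempogram features per EEG channel with late fusion.
# Sampling rate, channel count, and preprocessing are assumptions.
import numpy as np
import librosa

sr, hop = 64, 1                       # assumed EEG sampling rate and hop (samples)
rng = np.random.default_rng(0)
eeg = rng.normal(size=(8, 60 * sr))   # assumed: 8 channels, 60 s of EEG

tempograms = []
for channel in eeg:
    # Treat the rectified channel as an onset-strength-like envelope.
    envelope = np.abs(channel)
    tempograms.append(
        librosa.feature.tempogram(onset_envelope=envelope, sr=sr, hop_length=hop)
    )

# Late fusion: average tempograms over channels and time, then pick the
# dominant tempo bin (skipping the zero-lag bin, which maps to infinite BPM).
fused = np.mean(tempograms, axis=0).mean(axis=1)
bpm_axis = librosa.tempo_frequencies(len(fused), sr=sr, hop_length=hop)
print("estimated tempo (BPM):", bpm_axis[np.argmax(fused[1:]) + 1])
```

Averaging tempograms before peak picking is one simple fusion strategy; other options would be to fuse per-channel tempo estimates or to weight channels by signal quality.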

Learning music similarity from relative user ratings

TLDR
It is shown that music similarity measures learnt from relative ratings can be significantly better than a standard Euclidean metric, depending on the choice of learning algorithm, feature sets, and application scenario.

Adaptive Multimodal Exploration of Music Collections

TLDR
An immersive multimodal exploration environment extends the presentation of a song collection in a video-game-like virtual 3-D landscape with carefully adjusted, spatialized playback of songs; user studies demonstrate the importance of auditory feedback for music exploration and show that the system can adapt to different notions of similarity.
...