Corpus ID: 14683934

Towards Music Imagery Information Retrieval: Introducing the OpenMIIR Dataset of EEG Recordings from Music Perception and Imagination

@inproceedings{Stober2015TowardsMI,
  title={Towards Music Imagery Information Retrieval: Introducing the OpenMIIR Dataset of EEG Recordings from Music Perception and Imagination},
  author={Sebastian Stober and Avital Sternin and Adrian Mark Owen and Jessica A. Grahn},
  booktitle={ISMIR},
  year={2015}
}
Music imagery information retrieval (MIIR) systems may one day be able to recognize a song from only our thoughts. As a step towards such technology, we are presenting a public domain dataset of electroencephalography (EEG) recordings taken during music perception and imagination. We acquired this data during an ongoing study that so far comprises 10 subjects listening to and imagining 12 short music fragments – each 7–16s long – taken from well-known pieces. These stimuli were selected from…
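The dataset described above consists of multi-channel EEG recordings per subject. As a hedged illustration (not part of the paper), the sketch below shows how such recordings could be loaded and epoched with MNE-Python, assuming the data are available locally as MNE-compatible .fif files; the file name and trigger-channel name are assumptions.

```python
# Minimal sketch: loading one subject's EEG recording with MNE-Python.
# Assumes the raw data has been downloaded locally as an MNE-compatible .fif file;
# the file name and trigger-channel name below are hypothetical.
import mne

raw = mne.io.read_raw_fif("P01-raw.fif", preload=True)  # hypothetical path
print(raw.info)  # channel names, sampling rate, etc.

# Band-pass filter to a range typically used for cortical responses to music.
raw.filter(l_freq=0.5, h_freq=30.0)

# Extract stimulus events from the trigger channel and epoch around trial onsets.
events = mne.find_events(raw, stim_channel="STI 014")  # channel name is an assumption
epochs = mne.Epochs(raw, events, tmin=0.0, tmax=7.0, baseline=None, preload=True)
X = epochs.get_data()  # shape: (n_trials, n_channels, n_samples)
```

From the resulting (trials, channels, samples) array, perception and imagination trials could then be selected by their event codes.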
Toward Studying Music Cognition with Information Retrieval Techniques: Lessons Learned from the OpenMIIR Initiative
  • S. Stober
  • Psychology, Medicine
  • Front. Psychol.
  • 2017
TLDR: First results of MIIR experiments using the OpenMIIR datasets are reported, and it is pointed out how these findings could drive new research in cognitive neuroscience.
Unsupervised Spectral Clustering of Music-Related Brain Activity
  • S. Ntalampiras
  • Computer Science
  • 2019 15th International Conference on Signal-Image Technology & Internet-Based Systems (SITIS)
  • 2019
TLDR: A suitably designed unsupervised spectral clustering scheme highlights the connection between the responses and the audio structure of the music classes corresponding to the three tasks, and shows a strong connection with respect to the stimulus identification and meter classification tasks.
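As a rough illustration of the general technique named in this summary (not the authors' exact pipeline), the sketch below applies scikit-learn's spectral clustering to simple per-trial band-power features; the feature choice, placeholder data, and cluster count are assumptions.

```python
# Minimal sketch of unsupervised spectral clustering of EEG trials.
# Assumes an array X of shape (n_trials, n_channels, n_samples) as produced by epoching.
import numpy as np
from sklearn.cluster import SpectralClustering

rng = np.random.default_rng(0)
X = rng.standard_normal((120, 64, 512))  # placeholder EEG trials

# Simple per-trial feature: log power per channel (variance of the signal).
features = np.log(X.var(axis=2) + 1e-12)  # shape: (n_trials, n_channels)

clustering = SpectralClustering(n_clusters=3, affinity="nearest_neighbors",
                                n_neighbors=10, random_state=0)
labels = clustering.fit_predict(features)
print(np.bincount(labels))  # cluster sizes
```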
A Statistical Inference Framework for Understanding Music-Related Brain Activity
TLDR: The electroencephalographic modality was chosen to capture brain activity because it is more widely available: off-the-shelf devices recording such responses are already affordable, unlike more expensive brain imaging techniques.
Shared Generative Representation of Auditory Concepts and EEG to Reconstruct Perceived and Imagined Music
TLDR: Using a multi-view deep generative model, the feasibility of learning a shared latent representation of brain activity and auditory concepts, such as rhythmical motifs appearing across different instrumentations, is demonstrated.
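Purely as an illustration of the idea of a shared latent space across views (not the authors' generative model), the toy sketch below aligns an EEG-feature encoder and an audio-feature encoder in a common latent space with a simple alignment loss; all dimensions and the loss are assumptions.

```python
# Toy two-view encoder mapping EEG features and audio features into a shared latent
# space. Architecture, dimensions, and loss are illustrative assumptions only.
import torch
import torch.nn as nn

class ViewEncoder(nn.Module):
    def __init__(self, in_dim: int, latent_dim: int = 32):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 128), nn.ReLU(),
                                 nn.Linear(128, latent_dim))

    def forward(self, x):
        return self.net(x)

eeg_enc, audio_enc = ViewEncoder(in_dim=64), ViewEncoder(in_dim=40)
opt = torch.optim.Adam(list(eeg_enc.parameters()) + list(audio_enc.parameters()), lr=1e-3)

eeg = torch.randn(256, 64)    # placeholder EEG feature vectors
audio = torch.randn(256, 40)  # placeholder audio feature vectors (paired trials)

for _ in range(100):
    z_eeg, z_audio = eeg_enc(eeg), audio_enc(audio)
    loss = ((z_eeg - z_audio) ** 2).mean()  # align paired views in the shared space
    opt.zero_grad(); loss.backward(); opt.step()
```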
MAD-EEG: an EEG dataset for decoding auditory attention to a target instrument in polyphonic music
TLDR: Preliminary results obtained with a state-of-the-art stimulus reconstruction algorithm commonly used for speech stimuli show that the audio representation reconstructed from the EEG response is more strongly correlated with that of the attended source than with that of the unattended source, demonstrating that the dataset is suitable for such studies.
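For context, here is a minimal sketch of the general envelope-reconstruction ("backward model") family of methods referred to above, not the paper's exact algorithm: EEG with time lags is regressed onto the attended audio envelope, and the reconstruction is compared with the attended and unattended envelopes by correlation. All data here are placeholders.

```python
# Minimal backward-model sketch: reconstruct an audio envelope from lagged EEG.
import numpy as np
from sklearn.linear_model import Ridge

def lagged(eeg, n_lags):
    """Stack time-lagged copies of the EEG channels as regression features."""
    n_samples, n_channels = eeg.shape
    X = np.zeros((n_samples, n_channels * n_lags))
    for lag in range(n_lags):
        X[lag:, lag * n_channels:(lag + 1) * n_channels] = eeg[:n_samples - lag]
    return X

rng = np.random.default_rng(0)
eeg = rng.standard_normal((5000, 64))   # placeholder EEG (samples x channels)
attended = rng.standard_normal(5000)    # placeholder attended-source envelope
unattended = rng.standard_normal(5000)  # placeholder unattended-source envelope

X = lagged(eeg, n_lags=16)
model = Ridge(alpha=1.0).fit(X[:4000], attended[:4000])  # train on the first part
recon = model.predict(X[4000:])                          # reconstruct held-out envelope

r_att = np.corrcoef(recon, attended[4000:])[0, 1]
r_unatt = np.corrcoef(recon, unattended[4000:])[0, 1]
print(f"attended r={r_att:.3f}, unattended r={r_unatt:.3f}")
```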
NMED-T: A Tempo-Focused Dataset of Cortical and Behavioral Responses to Naturalistic Music
TLDR: The Naturalistic Music EEG Dataset—Tempo (NMED-T), an open dataset of electrophysiological and behavioral responses collected from 20 participants who heard a set of 10 commercially available musical works, is introduced.
Classifying music perception and imagination using EEG
This study explored whether we could accurately classify perceived and imagined musical stimuli from EEG data. Successful EEG-based classification of what an individual is imagining could pave the…
Mind the Beat: Detecting Audio Onsets from EEG Recordings of Music Listening
TLDR: A deep learning approach to predicting audio event onsets from electroencephalogram (EEG) recordings of users listening to music is proposed; the RNN was found to generalize better than the other methods evaluated.
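As a hedged illustration of the kind of sequence model described (layer sizes and the GRU choice are assumptions, not the paper's architecture), here is a toy RNN mapping EEG frames to per-frame onset probabilities.

```python
# Toy RNN producing per-frame onset probabilities from EEG segments.
import torch
import torch.nn as nn

class OnsetRNN(nn.Module):
    def __init__(self, n_channels: int = 64, hidden: int = 64):
        super().__init__()
        self.rnn = nn.GRU(n_channels, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):  # x: (batch, time, channels)
        h, _ = self.rnn(x)
        return torch.sigmoid(self.head(h)).squeeze(-1)  # (batch, time) onset probabilities

model = OnsetRNN()
eeg = torch.randn(8, 500, 64)                  # placeholder batch of EEG segments
onsets = (torch.rand(8, 500) < 0.05).float()   # placeholder onset labels

loss = nn.functional.binary_cross_entropy(model(eeg), onsets)
loss.backward()
print(float(loss))
```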
Information Retrieval from Neurophysiological Signals
TLDR: This thesis shows the possibility of creating a user-centric music/movie recommender system from neurophysiological signals, and addresses one of the fundamental goals in brain decoding: reconstructing the external stimuli using only brain features.
Naturalistic Music EEG Dataset—Hindi (NMED-H) 2.0: New Release and Cross-Dataset Compatibility
Recent neuroscience research in Music Information Retrieval has been facilitated by publicly available electroencephalography (EEG) datasets such as the OpenMIIR dataset [7] and the Naturalistic…

References

Showing 1–10 of 34 references
Name that tune: Decoding music from the listening brain
TLDR: Electroencephalography is used to detect heard music from the brain signal, hypothesizing that the time structure in music makes it especially suitable for decoding perception from EEG signals.
Measuring the mind’s ear: EEG of music imagery
In the current study we use electroencephalography (EEG) to detect heard music from the brain signal, hypothesizing that the time structure in music makes it especially suitable for decoding…
Shared processing of perception and imagery of music in decomposed EEG
TLDR: By decomposing the data with principal component analysis (PCA), similar component distributions are found to explain most of the variance in each experiment, and it is shown that the frontal and central components have multiple parts that are differentially active during perception and imagination.
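A minimal sketch of this style of analysis, with placeholder data: PCA is fitted separately to perception and imagination EEG and the resulting component structures are compared. The specific similarity measure is an illustrative choice, not the paper's analysis.

```python
# Sketch: PCA of EEG for two conditions and a crude comparison of spatial components.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# Placeholder continuous EEG for two conditions: (samples, channels)
perception = rng.standard_normal((10000, 64))
imagination = rng.standard_normal((10000, 64))

pca_p = PCA(n_components=10).fit(perception)
pca_i = PCA(n_components=10).fit(imagination)

print("variance explained (perception):", pca_p.explained_variance_ratio_.round(3))
print("variance explained (imagination):", pca_i.explained_variance_ratio_.round(3))

# Crude similarity of spatial patterns: absolute cosine similarity of the
# leading components from the two conditions.
sim = np.abs(pca_p.components_ @ pca_i.components_.T)
print("component similarity (diagonal):", np.diag(sim).round(2))
```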
Bringing the Song on Your Mind Back to Your Ears
Most existing Music Information Retrieval (MIR) technologies require a user to use a query interface to search for a musical document. The mental image of the desired music is likely much richer than…
An Emotion Model for Music Using Brain Waves
TLDR: This work uses an electroencephalograph to record the subject's reaction to music, and an emotion spectrum analysis method is used to analyse the electric potentials and provide continuous-valued annotations of four emotional states for different segments of the music.
Using Convolutional Neural Networks to Recognize Rhythm Stimuli from Electroencephalography Recordings
TLDR: Convolutional neural networks are applied to analyze and classify EEG data recorded within a rhythm perception study in Kigali, Rwanda, which comprises 12 East African and 12 Western rhythmic stimuli, each presented in a loop for 32 seconds to 13 participants.
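As a rough sketch of this kind of classifier (architecture and input shapes are assumptions, not the paper's network), here is a small 1D convolutional network over raw EEG segments with 24 output classes for the 12 East African and 12 Western stimuli.

```python
# Toy 1D CNN over raw EEG segments; architecture and shapes are illustrative only.
import torch
import torch.nn as nn

class EEGConvNet(nn.Module):
    def __init__(self, n_channels: int = 64, n_classes: int = 24):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=7, stride=2), nn.ReLU(),
            nn.Conv1d(32, 32, kernel_size=7, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x):  # x: (batch, channels, time)
        return self.classifier(self.features(x).squeeze(-1))

model = EEGConvNet()
eeg = torch.randn(16, 64, 1024)  # placeholder EEG segments
logits = model(eeg)
print(logits.shape)              # (16, 24) class scores
```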
Attentional Selection in a Cocktail Party Environment Can Be Decoded from Single-Trial EEG.
TLDR: It is shown that single-trial, unaveraged EEG data can be decoded to determine attentional selection in a naturalistic multi-speaker environment, and that the EEG-based measure of attention correlates significantly with performance on a high-level attention task.
Shared mechanisms in perception and imagery of auditory accents
TLDR: Results show that detection of imagined accents is possible and reveal similarity in the brain signatures relevant to distinguishing accents from non-accents in perception and imagery.
Auditory imagery: empirical findings.
  • T. Hubbard
  • Psychology, Medicine
  • Psychological bulletin
  • 2010
The empirical literature on auditory imagery is reviewed. Data on (a) imagery for auditory features (pitch, timbre, loudness), (b) imagery for complex nonverbal auditory stimuli (musical contour,…
Neuroimaging Methods for Music Information Retrieval: Current Findings and Future Prospects
TLDR: It is shown that certain approaches currently used in a neuroscientific setting align with those used in MIR research, and implications for potential areas of future research are discussed.