Identifying music-induced emotions from EEG for use in brain-computer music interfacing

  title={Identifying music-induced emotions from EEG for use in brain-computer music interfacing},
  author={Ian Daly and Asad Malik and James Weaver and Faustina Hwang and Slawomir Jaroslaw Nasuto and Duncan A. H. Williams and Alexis Kirke and Eduardo Reck Miranda},
  journal={2015 International Conference on Affective Computing and Intelligent Interaction (ACII)},
  • Published 21 September 2015
  • Computer Science
Brain-computer music interfaces (BCMI) provide a method to modulate an individual's affective state via the selection or generation of music according to their current affective state. Potential applications of such systems include entertainment and therapeutic applications. We outline a proposed design for such a BCMI and seek a method for automatically differentiating different music-induced affective states. Band-power features are explored for use in automatically identifying music…
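The band-power features the abstract refers to are conventionally computed by integrating a power spectral density estimate over canonical EEG frequency bands. The following is a minimal, hypothetical sketch of that idea (not the authors' implementation; band edges and the Welch parameters are common defaults, not taken from the paper):

```python
import numpy as np
from scipy.signal import welch

# Canonical EEG bands in Hz (assumed here; the paper may use different edges).
BANDS = {
    "delta": (1, 4),
    "theta": (4, 8),
    "alpha": (8, 13),
    "beta": (13, 30),
}

def band_power_features(eeg, fs=256.0):
    """Return per-channel band power for a (channels, samples) EEG array."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(fs * 2), axis=-1)
    features = {}
    for name, (lo, hi) in BANDS.items():
        idx = (freqs >= lo) & (freqs < hi)
        # Integrate the PSD over the band with the trapezoidal rule.
        features[name] = np.trapz(psd[..., idx], freqs[idx], axis=-1)
    return features

# Example: two channels of synthetic 10 Hz (alpha-band) activity plus noise.
rng = np.random.default_rng(0)
t = np.arange(0, 4, 1 / 256.0)
sig = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
eeg = np.stack([sig, sig])
feats = band_power_features(eeg)
print(max(feats, key=lambda b: float(feats[b].mean())))  # prints "alpha"
```

A feature vector for a classifier would then typically be the concatenation of these band powers (often log-transformed) across all electrodes.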

Figures and Tables from this paper

Affective brain–computer music interfacing

This system represents one of the first demonstrations of an online aBCMI that is able to accurately detect and respond to a user's affective states; possible applications include use in music therapy and entertainment.

Affective brain Computer Music Interfacing: a Case Study of Use by an Individual with Huntington's disease

An affective brain-computer music interface (aBCMI), developed for use as an aid to music therapy, is trialled in a case study with an individual with Huntington’s disease, demonstrating that there is some potential for aBCMI systems to be used by individuals with Huntington's disease.

Assessment of Emotional States Through Physiological Signals and Its Application in Music Therapy for Disabled People

The device has generated considerable interest among educators, since it outperforms state-of-the-art techniques and enables the inclusion of the most severely affected users: it removes the barrier posed by uncontrolled movements when assessing a person's emotional state during an activity.

BEAMERS: Brain-Engaged, Active Music-based Emotion Regulation System

A novel music-based emotion regulation system using a commercial EEG device is designed for daily use without employing deterministic emotion recognition models; users find it easier to report variations in their emotion than absolute emotional states.

Brain Computer Interface based Emotion Recognition Using Fuzzy Logic

  • Computer Science
  • 2019
The intention of the project is to develop a robot that can assist disabled people with everyday tasks, reducing their dependence on others.

AI-Based Affective Music Generation Systems: A Review of Methods, and Challenges

The main building blocks of an AI-AMG system are discussed, and existing systems are formally categorized based on the core algorithm used for music generation.

Entropy-Assisted Multi-Modal Emotion Recognition Framework Based on Physiological Signals

This paper aims to improve emotion-recognition performance through complexity analysis of physiological signals: several entropy-domain features are extracted from ECG and GSR signals, and an XGBoost model is selected to predict emotion, achieving 68% accuracy for arousal and 84% for valence.
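One common entropy-domain feature of the kind mentioned above is normalized spectral entropy, which measures how evenly a signal's power is spread across frequencies. A minimal sketch (a generic illustration, not the paper's exact feature set or parameters):

```python
import numpy as np
from scipy.signal import welch

def spectral_entropy(x, fs):
    """Normalized spectral entropy in [0, 1] of a 1-D signal,
    computed from a Welch power spectral density estimate."""
    _, psd = welch(x, fs=fs, nperseg=min(len(x), 256))
    p = psd / psd.sum()
    p = p[p > 0]  # drop empty bins so log2 is defined
    return float(-np.sum(p * np.log2(p)) / np.log2(psd.size))

# A pure tone concentrates spectral power (low entropy), while white
# noise spreads it across all frequencies (entropy close to 1).
fs = 250.0
t = np.arange(0, 8, 1 / fs)
rng = np.random.default_rng(1)
tone = np.sin(2 * np.pi * 5 * t)
noise = rng.standard_normal(t.size)
print(spectral_entropy(tone, fs) < spectral_entropy(noise, fs))  # prints True
```

Features like this, computed per signal (e.g. ECG, GSR) and per window, would then feed a gradient-boosted classifier such as XGBoost.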

A Systematic Review for Human EEG Brain Signals Based Emotion Classification, Feature Extraction, Brain Condition, Group Comparison

A systematic review of academic articles within this scope maps the research landscape of EEG-based human emotion recognition into a taxonomy and identifies the main characteristics of this promising area of science.

“Hello Computer, How Am I Feeling?”, Case Studies of Neural Technology to Measure Emotions

Emotion is a core part of the human experience. Many artistic and creative applications attempt to produce particular emotional experiences, for example, films, games, music, dance, and other visual…

EEG artifact correction strategies for online trial-by-trial analysis

The results show that ASR and EMD are both able to reveal a significant MMN and its modulation by predictability, and even appear more sensitive than the offline analysis when comparing alternative models of perception underlying auditory evoked responses.



EEG-Based Emotion Recognition in Music Listening

This study applied machine-learning algorithms to categorize EEG dynamics according to subjects' self-reported emotional states during music listening, identified 30 subject-independent features most relevant to emotional processing across subjects, and explored the feasibility of using fewer electrodes to characterize EEG dynamics during music listening.

Brain-computer music interface for composition and performance

A new brain-computer interface (BCI) system that uses electroencephalogram (EEG) information to steer generative rules to compose and perform music, employing its own adapted version of a machine-learning technique based on ATNs for computer replication of musical styles.

EEG-based classification of positive and negative affective states

This study aimed to identify the neurophysiological correlates of two primary aroused affective states related to positive and negative emotions, and to create a classification model for each second

Comparison of linear, nonlinear, and feature selection methods for EEG signal classification

The results of one linear classifier (linear discriminant analysis) and two nonlinear classifiers applied to the classification of spontaneous EEG during five mental tasks are reported, showing that nonlinear classifiers produce only slightly better classification results.

Affective state recognition from EEG with deep belief networks

  • Kang Li, Xiaoyi Li, Yuan Zhang, A. Zhang
  • Computer Science
  • 2013 IEEE International Conference on Bioinformatics and Biomedicine
  • 2013
A novel Deep Belief Networks (DBN) based model for affective state recognition from EEG signals that can successfully handle the aforementioned two challenges and significantly outperform the baselines is proposed.

Continuous emotion detection using EEG signals and facial expressions

For the first time, this paper continuously detects valence from electroencephalogram (EEG) signals and facial expressions in response to videos, and presents the results of multimodal fusion between facial expressions and EEG signals.

Brain-Computer Music Interfacing (BCMI): From Basic Research to the Real World of Special Needs

This work proves the concept that such a BCMI system is cost-effective to build, viable, and useful; however, ergonomic and design aspects of the system require further refinement to make it more practical for clinical use.

Valence, arousal and dominance in the EEG during game play

An investigation of traces of naturally occurring emotions in electrical brain signals, which can be used to build interfaces that respond to the user's emotional state, confirms a number of known affective correlates for valence, arousal, and dominance in a realistic, uncontrolled environment.

A Bioelectric Controller for Computer Music Applications

A special-purpose signal processing computer is described that acquires low-level neuroelectric and myoelectric signals, performs feature extraction on these signals, and then maps the desired features to MIDI commands, all in real time.

FORCe: Fully Online and Automated Artifact Removal for Brain-Computer Interfacing

The method outperforms the state-of-the-art automated artifact removal methods Lagged Auto-Mutual Information Clustering and Fully Automated Statistical Thresholding for EEG artifact Rejection (FASTER), and is able to remove a wide range of artifact types including blink, electromyogram (EMG), and electrooculogram (EOG) artifacts.