Corpus ID: 238743938

Exploring multiple scales and bi-hemispheric asymmetric EEG features for emotion recognition

Authors: Yihan Wu, Min Xia, Huan Cai, Li Nie, Yangsong Zhang
In recent years, emotion recognition based on electroencephalography (EEG) has received growing interest in the brain-computer interaction (BCI) field. Neuroscience research indicates that the left and right brain hemispheres respond differently during different emotional activities, which is an important principle for designing deep learning (DL) models for emotion recognition. In addition, the neural activities of emotions may occur at different time scales and durations, so it is beneficial…
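The hemispheric-asymmetry principle mentioned in the abstract is often operationalized as differential and rational asymmetry features between symmetric left/right electrode pairs. The sketch below illustrates that general idea only; the pairing indices and the toy band-power inputs are assumptions for illustration, not this paper's actual feature design.

```python
import numpy as np

def asymmetry_features(band_power, left_idx, right_idx):
    """Compute asymmetry features between symmetric electrode pairs.

    band_power: (n_channels,) per-channel band power (e.g., alpha power).
    left_idx / right_idx: indices of paired left/right electrodes
    (e.g., F3/F4, C3/C4) -- illustrative, not from the paper.
    """
    left = band_power[left_idx]
    right = band_power[right_idx]
    dasm = left - right   # differential asymmetry (left minus right)
    rasm = left / right   # rational asymmetry (left over right)
    return dasm, rasm

# Toy band powers for 4 channels forming 2 left/right pairs.
power = np.array([4.0, 2.0, 3.0, 6.0])
dasm, rasm = asymmetry_features(power, left_idx=[0, 2], right_idx=[1, 3])
```

A classifier would then consume `dasm`/`rasm` alongside the raw per-channel features, so left-right differences are represented explicitly.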



EEG-Based Emotion Recognition by Convolutional Neural Network with Multi-Scale Kernels
A 2D CNN is proposed in which convolutional layers with different kernel sizes are assembled into a convolution block, combining features distributed over small and large regions and feature categories that reflect the intrinsic properties of an individual signal or a group of signals.
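The multi-scale idea above can be sketched minimally: filter the same signal with kernels of several sizes and stack the results, so small and large receptive fields sit side by side. This is an illustrative numpy sketch with toy averaging kernels, not the cited paper's 2D CNN architecture.

```python
import numpy as np

def multi_scale_block(signal, kernel_sizes=(3, 5, 7)):
    """Apply several kernel sizes to one 1D signal and stack the outputs.

    Toy averaging kernels stand in for learned convolution filters.
    Returns an array of shape (n_scales, signal_len).
    """
    outputs = []
    for k in kernel_sizes:
        kernel = np.ones(k) / k  # uniform kernel of width k
        outputs.append(np.convolve(signal, kernel, mode="same"))
    return np.stack(outputs)

x = np.arange(8, dtype=float)      # toy 1D "EEG" segment
feats = multi_scale_block(x)       # one row per kernel size
```

In a trained CNN the stacked maps would be concatenated along the channel axis and passed to the next layer; here the stack simply makes the per-scale responses explicit.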
Emotion Recognition from Multi-Channel EEG through Parallel Convolutional Recurrent Neural Network
Results indicate that the proposed pre-processing method can increase emotion recognition accuracy by approximately 32%, and the model achieves high performance with mean accuracies of 90.80% and 91.03% on the valence and arousal classification tasks, respectively.
A Bi-Hemisphere Domain Adversarial Neural Network Model for EEG Emotion Recognition
A novel neural network model, the bi-hemisphere domain adversarial neural network (BiDANN), is inspired by the neuroscience finding that the left and right hemispheres of the human brain respond asymmetrically to emotion; an improved version, denoted BiDANN-S, addresses the subject-independent EEG emotion recognition problem.
A novel convolutional neural networks for emotion recognition based on EEG signal
An end-to-end model based on Convolutional Neural Networks (CNNs) is proposed; to better represent the EEG signals, the original EEG channels are first rearranged by Pearson Correlation Coefficient, and the rearranged EEGs are fed into the CNN.
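One plausible reading of "rearranged by Pearson Correlation Coefficient" is reordering channels so that highly correlated channels become spatially adjacent before convolution. The greedy ordering below (start from channel 0, repeatedly append the most correlated unused channel) is an assumed scheme for illustration only, not the cited paper's exact method.

```python
import numpy as np

def rearrange_by_correlation(eeg):
    """Greedily reorder EEG channels so correlated channels are adjacent.

    eeg: (n_channels, n_samples) array.
    Returns the reordered array and the channel order used.
    """
    corr = np.corrcoef(eeg)  # Pearson correlation matrix between channels
    order = [0]              # assumed starting channel (illustrative choice)
    remaining = set(range(1, eeg.shape[0]))
    while remaining:
        last = order[-1]
        # Append the unused channel most correlated with the last one.
        nxt = max(remaining, key=lambda c: corr[last, c])
        order.append(nxt)
        remaining.remove(nxt)
    return eeg[order], order

# Toy data: channel 2 is a scaled copy of channel 0, channel 1 is its negation.
demo = np.vstack([np.arange(4.0), -np.arange(4.0), 2 * np.arange(4.0)])
rearranged, order = rearrange_by_correlation(demo)
```

After reordering, a small convolution kernel sliding over the channel axis sees related channels together, which is the intuition behind rearranging before the CNN.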
EEG Based Emotion Recognition by Combining Functional Connectivity Network and Local Activations
Both information propagation patterns and activation differences in the brain were fused to improve emotion recognition performance, toward effective human–computer interaction systems that adapt to human emotions in real-world applications.
Multi-Scale Neural Network for EEG Representation Learning in BCI
A novel deep multi-scale neural network is proposed that discovers feature representations in multiple frequency/time ranges and extracts relationships among electrodes, i.e., spatial representations, for subject intention/condition identification.
Correlated Attention Networks for Multimodal Emotion Recognition
In experiments on three real-world datasets, the model significantly improves emotion classification accuracy when higher correlation is acquired and outperforms state-of-the-art methods.
From Regional to Global Brain: A Novel Hierarchical Spatial-Temporal Neural Network Model for EEG Emotion Recognition
The proposed method, denoted R2G-STNN, consists of spatial and temporal neural network models with a regional-to-global hierarchical feature learning process to learn discriminative spatial-temporal EEG features.
Multimodal Physiological Signal Emotion Recognition Based on Convolutional Recurrent Neural Network
The experimental results show that the proposed convolutional recurrent neural network efficiently extracts multi-modal physiological signal features to improve emotion recognition performance.