Corpus ID: 239024404

Positional-Spectral-Temporal Attention in 3D Convolutional Neural Networks for EEG Emotion Recognition

Jiyao Liu, Yanxin Zhao, Hao Wu, Dongmei Jiang
Recognizing human emotions plays a critical role in daily communication. Neuroscience has demonstrated that different emotional states produce different degrees of activation across brain regions, EEG frequency bands, and time intervals. In this paper, we propose a novel structure to explore informative EEG features for emotion recognition. The proposed module, denoted PST-Attention, consists of Positional, Spectral and Temporal Attention modules to explore more…
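The abstract names three axis-wise attention branches but is truncated before the details. As a rough illustration only, the NumPy sketch below shows one plausible reading: a 3D EEG feature tensor reweighted along its electrode, frequency-band, and time axes. The tensor shape and the hand-crafted attention scores are assumptions for the sketch, not the authors' learned modules.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

# Hypothetical 3D EEG feature tensor: (electrodes, frequency bands, time steps).
rng = np.random.default_rng(0)
feat = rng.standard_normal((62, 5, 128))

# Each attention branch yields a normalized weight vector along one axis.
# Mean activations stand in for scores the paper's modules would learn.
pos_w  = softmax(feat.mean(axis=(1, 2)))   # Positional: per-electrode weights
spec_w = softmax(feat.mean(axis=(0, 2)))   # Spectral: per-band weights
temp_w = softmax(feat.mean(axis=(0, 1)))   # Temporal: per-time-step weights

# Reweight the tensor along all three axes via broadcasting.
attended = feat * pos_w[:, None, None] * spec_w[None, :, None] * temp_w[None, None, :]

print(attended.shape)  # (62, 5, 128)
```

Multiplicative axis-wise reweighting preserves the tensor shape, so such a module could be dropped between 3D-CNN layers without altering the rest of the network.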
1 Citation

Transformers for EEG Emotion Recognition
This paper presents a novel approach to EEG emotion recognition built exclusively on self-attention over the spectrum, space, and time dimensions to explore the contribution of different EEG electrodes and temporal slices to specific emotional states.


References

4D attention-based neural network for EEG emotion recognition
This paper presents a novel method, the four-dimensional attention-based neural network (4D-aNN), for EEG emotion recognition, which achieves state-of-the-art performance on the DEAP, SEED, and SEED-IV datasets under intra-subject splitting.
A Hierarchical Bidirectional GRU Model With Attention for EEG-Based Emotion Classification
The proposed attention-based GRU network for human emotion classification from continuous electroencephalogram (EEG) signals shows more robust classification performance than baseline models; it effectively reduces the impact of long-term non-stationarity in EEG sequences and improves the accuracy and robustness of EEG-based emotion classification.
EEG-Based Emotion Recognition using 3D Convolutional Neural Networks
This approach investigates the use of 3-dimensional convolutional neural networks (3D-CNN) on multi-channel EEG data for emotion recognition and finds that the proposed method achieves recognition accuracies that outperform state-of-the-art methods.
Hierarchical Convolutional Neural Networks for EEG-Based Emotion Recognition
Benefiting from its strong representation-learning capacity in two-dimensional space, HCNN is effective for emotion recognition, especially on the Beta and Gamma bands.
Sparse Graphic Attention LSTM for EEG Emotion Recognition
A novel multichannel EEG emotion recognition method based on sparse graphic attention long short-term memory (SGA-LSTM) is proposed and shown to be superior to state-of-the-art methods.
EEG-Based Emotion Recognition Using Regularized Graph Neural Networks
A regularized graph neural network for EEG-based emotion recognition is proposed that considers the biological topology among brain regions to capture both local and global relations among EEG channels. Ablation studies show that the proposed adjacency matrix and two regularizers contribute consistent and significant gains to model performance.
Emotion Recognition based on EEG using LSTM Recurrent Neural Network
A deep learning method is proposed to recognize emotion from raw EEG signals using long short-term memory (LSTM); a dense layer classifies the learned features into low/high arousal, valence, and liking.
EEG-Based Emotion Classification Using Long Short-Term Memory Network with Attention Mechanism
A long short-term memory network is proposed to consider changes in emotion over time, with an attention mechanism that assigns weights to the emotional states appearing at specific moments, based on the peak–end rule in psychology.
EEG Emotion Recognition Using Dynamical Graph Convolutional Neural Networks
The proposed DGCNN method dynamically learns the intrinsic relationships between electroencephalogram (EEG) channels by training a neural network, enabling more discriminative EEG feature extraction.
Investigating EEG-Based Functional Connectivity Patterns for Multimodal Emotion Recognition
A novel emotion-relevant critical-subnetwork selection algorithm is proposed, and three EEG functional-connectivity network features are investigated: strength, clustering coefficient, and eigenvector centrality. The results reveal distinct functional connectivity patterns for the five emotions of disgust, fear, sadness, happiness, and neutrality.