A Transformer Architecture for Stress Detection from ECG

@inproceedings{Behinaein2021ATA,
  title={A Transformer Architecture for Stress Detection from ECG},
  author={Behnam Behinaein and Anubha Bhatti and Dirk Rodenburg and Paul C. Hungler and Ali Etemad},
  booktitle={2021 International Symposium on Wearable Computers},
  year={2021}
}
The electrocardiogram (ECG) has been widely used for emotion recognition. This paper presents a deep neural network based on convolutional layers and a transformer mechanism to detect stress using ECG signals. We perform leave-one-subject-out experiments on two publicly available datasets, WESAD and SWELL-KW, to evaluate our method. Our experiments show that the proposed model achieves strong results, comparable to or better than the state-of-the-art models for ECG-based stress detection on these two… 

Citations

Transformer-Based Self-Supervised Learning for Emotion Recognition
TLDR
This work proposes a Transformer-based model that processes electrocardiogram (ECG) signals for emotion recognition, achieves state-of-the-art performance on AMIGOS, and shows that transformers and pre-training are promising strategies for emotion recognition with physiological signals.
Mobile Emotion Recognition via Multiple Physiological Signals using Convolution-augmented Transformer
TLDR
A novel end-to-end emotion recognition system is proposed based on a convolution-augmented transformer architecture, which recognises users' emotions along the arousal and valence dimensions by learning both global and local fine-grained associations and dependencies within and across multimodal physiological data.
AttX: Attentive Cross-Connections for Fusion of Wearable Signals in Emotion Recognition
TLDR
This work proposes cross-modal attentive connections, a new dynamic technique for multimodal representation learning from wearable data that can effectively regulate and share information between different modalities to learn better representations.
GeoECG: Data Augmentation via Wasserstein Geodesic Perturbation for Robust Electrocardiogram Prediction
TLDR
A physiologically-inspired data augmentation method improves performance and increases the robustness of ECG-based heart disease detection, demonstrating gains in both accuracy and robustness.
Optimal Transport based Data Augmentation for Heart Disease Diagnosis and Prediction
TLDR
A MultiFeature Transformer (MF-Transformer) is built as a classification model in which features are extracted from both the time and frequency domains to diagnose heart conditions; it is able to distinguish five categories of cardiac conditions.

References

Showing 1-10 of 26 references
Constrained transformer network for ECG signal processing and arrhythmia classification
TLDR
An end-to-end deep learning framework based on a convolutional neural network is proposed for ECG signal processing and arrhythmia classification, which can help cardiologists perform assisted diagnosis of heart disease and improve the efficiency of healthcare delivery.
Self-Supervised Learning for ECG-Based Emotion Recognition
  • Pritam Sarkar, A. Etemad
  • ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2020
TLDR
The proposed method outperforms the state-of-the-art in ECG-based emotion recognition with two publicly available datasets, SWELL and AMIGOS, and highlights the advantage of the self-supervised approach in requiring significantly less data to achieve acceptable results.
Self-supervised ECG Representation Learning for Emotion Recognition
TLDR
The weights of the self-supervised network are transferred to an emotion recognition network, where the convolutional layers are kept frozen and the dense layers are trained with labelled ECG data (a minimal sketch of this freeze-and-fine-tune pattern appears after the reference list); the proposed solution is shown to considerably improve performance compared to a fully-supervised network.
Deep ECGNet: An Optimal Deep Learning Framework for Monitoring Mental Stress Using Ultra Short-Term ECG Signals.
TLDR
This study proposes a novel approach to stress recognition using electrocardiogram signals that avoids the intractable long-term heart rate variability (HRV) parameter extraction process.
Stress Detection in Working People
Comparing features from ECG pattern and HRV analysis for emotion recognition system
TLDR
The proposed features represent the statistical distribution of dominant frequencies, calculated using spectrogram analysis of the intrinsic mode functions obtained by applying bivariate empirical mode decomposition to the ECG, and offer a promising approach to emotion recognition based on short ECG signals.
Introducing WESAD, a Multimodal Dataset for Wearable Stress and Affect Detection
TLDR
This work introduces WESAD, a new publicly available dataset for wearable stress and affect detection that bridges the gap between previous lab studies on stress and emotions, by containing three different affective states (neutral, stress, amusement).
Heart Rate Variability Signal Features for Emotion Recognition by Using Principal Component Analysis and Support Vectors Machine
TLDR
The results showed the feasibility of daily emotion monitoring using extracted HRV features and an SVM classifier, with performance slightly lower than in other studies.
Emotion Assessment Using Feature Fusion and Decision Fusion Classification Based on Physiological Data: Are We There Yet?
TLDR
The experimental results showed that FF is the most competitive technique in terms of classification accuracy and computational complexity.
An Explainable Deep Fusion Network for Affect Recognition Using Physiological Signals
TLDR
A deep learning model is proposed to process multimodal-multisensory bio-signals for affect recognition that supports batch training of signals with different sampling rates at the same time, and the results show significant improvement compared to the state of the art.
...