DREAMER: A Database for Emotion Recognition Through EEG and ECG Signals From Wireless Low-cost Off-the-Shelf Devices

@article{Katsigiannis2018DREAMERAD,
  title={DREAMER: A Database for Emotion Recognition Through EEG and ECG Signals From Wireless Low-cost Off-the-Shelf Devices},
  author={Stamos Katsigiannis and Naeem Ramzan},
  journal={IEEE Journal of Biomedical and Health Informatics},
  year={2018},
  volume={22},
  pages={98--107}
}
In this paper, we present DREAMER, a multimodal database consisting of electroencephalogram (EEG) and electrocardiogram (ECG) signals recorded during affect elicitation by means of audio-visual stimuli. [...] All the signals were captured using portable, wearable, wireless, low-cost, and off-the-shelf equipment that has the potential to allow the use of affective computing methods in everyday applications.
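As a rough illustration of the kind of affective-computing pipeline such low-cost EEG recordings feed into, the sketch below extracts classic band-power features from one EEG window. The 128 Hz sampling rate matches the Emotiv EPOC headset used in DREAMER, but the band edges, window length, and FFT-based power estimate are illustrative conventions, not the paper's exact method.

```python
import numpy as np

FS = 128  # Emotiv EPOC sampling rate used in DREAMER (Hz)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_power_features(window, fs=FS):
    """Per-channel mean spectral power in classic EEG bands (illustrative).

    window: array of shape (n_channels, n_samples).
    Returns a flat feature vector of length n_channels * n_bands.
    """
    freqs = np.fft.rfftfreq(window.shape[-1], d=1.0 / fs)
    psd = np.abs(np.fft.rfft(window, axis=-1)) ** 2 / window.shape[-1]
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(psd[..., mask].mean(axis=-1))
    return np.concatenate(feats, axis=-1)

# Toy usage: one 2-second window from the EPOC's 14 EEG channels.
rng = np.random.default_rng(0)
window = rng.standard_normal((14, 2 * FS))
x = band_power_features(window)
print(x.shape)  # (42,)
```

Such per-band features are the typical input to the classifiers evaluated across the studies listed below.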
EDOSE: Emotion Datasets from Open Source EEG with a Real-Time Bracelet Sensor
TLDR
The experimental results on the proposed dataset, EDOSE, outperformed those from state-of-the-art EEG datasets in affective computing, namely DEAP, MAHNOB-HCI, DECAF and DREAMER.
Consumer Grade Brain Sensing for Emotion Recognition
TLDR
OpenBCI is evaluated by first comparing its performance to a research-grade EEG system, employing the same algorithms that were applied on benchmark datasets, and a novel method to facilitate the selection of audio-visual stimuli of high/low valence and arousal is proposed for emotion classification.
Deep Neural Network for Electroencephalogram based Emotion Recognition
Emotion recognition using electroencephalogram (EEG) signals is an aspect of affective computing. EEG records the brain's responses as electrical signals while external stimuli are shown to the subject.
CNN and LSTM-Based Emotion Charting Using Physiological Signals
TLDR
A strong correlation between spectral and hidden-layer feature analysis and classification performance suggests the efficacy of the proposed method for significant feature extraction and improved emotion elicitation performance in broader, less constrained environments.
MPED: A Multi-Modal Physiological Emotion Database for Discrete Emotion Recognition
TLDR
A multi-modal physiological emotion database is designed and built, collecting four modalities of physiological signals, i.e., electroencephalogram (EEG), galvanic skin response, respiration, and electrocardiogram (ECG), along with a novel attention-based long short-term memory model (A-LSTM), which strengthens the effectiveness of useful sequences to extract more discriminative features.
Decoding the Neural Signatures of Valence and Arousal From Portable EEG Headset
TLDR
This paper focuses on classifying emotions on the valence-arousal plane using various feature extraction, feature selection and machine learning techniques, and proposes the optimal set of features and electrodes for emotion recognition.
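Before classification on the valence-arousal plane, studies like these typically binarize self-assessment ratings into low/high classes. A minimal sketch, assuming a 1-5 SAM scale as used in DREAMER and a conventional midpoint threshold (the exact cutoff varies by paper):

```python
import numpy as np

def binarize_ratings(ratings, threshold=3):
    """Map 1-5 self-assessment ratings to low (0) / high (1) classes.

    The midpoint threshold is a common convention in DEAP/DREAMER-style
    studies, not a value taken from any single paper here.
    """
    return (np.asarray(ratings) > threshold).astype(int)

labels = binarize_ratings([1, 3, 4, 5])
print(labels)  # [0 0 1 1]
```

The resulting binary labels are what the valence and arousal classifiers reported throughout these papers are trained against.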
A review of recent approaches for emotion classification using electrocardiography and electrodermography signals
TLDR
The main objective of this paper is to analyze the current trends in terms of how signals including heart rate and skin conductance can be used as training features for machine learning classifiers to perform the emotion classification task.
Affective recognition from EEG signals: an integrated data-mining approach
TLDR
Investigating the accuracy and applicability of previous affective recognition methods on data collected with an Emotiv headset while participants used a personal computer to fulfill several tasks demonstrates that such methods can be used to accurately detect emotions using a small EEG headset during a normal daily activity.
Recognizing Emotional States With Wearables While Playing a Serious Game
TLDR
The results suggest that EEG and EOG biosignals, as well as kinematic motion data acquired using off-the-shelf wearable sensors in combination with machine-learning techniques such as EC, can be used to classify emotional states, while the individuals were playing the Whack-a-Mole game.
An EEG Database and Its Initial Benchmark Emotion Classification Performance
TLDR
A database consisting of EEG signals from 44 volunteers covering four types of emotions is presented, along with initial benchmark classification performance based on discrete wavelet transform and extreme learning machine (ELM).

References

Showing 1-10 of 44 references
DEAP: A Database for Emotion Analysis Using Physiological Signals
TLDR
A multimodal data set for the analysis of human affective states was presented and a novel method for stimuli selection is proposed using retrieval by affective tags from the last.fm website, video highlight detection, and an online assessment tool.
DECAF: MEG-Based Multimodal Database for Decoding Affective Physiological Responses
TLDR
DECAF is presented, a detailed analysis of the correlations between participants' self-assessments and their physiological responses and single-trial classification results for valence, arousal and dominance are presented, with performance evaluation against existing data sets.
Multimedia implicit tagging using EEG signals
  • M. Soleymani, M. Pantic
  • Computer Science
    2013 IEEE International Conference on Multimedia and Expo (ICME)
  • 2013
TLDR
Electroencephalogram (EEG) signals reflect brain activities associated with emotional and cognitive processes and can be used to find tags for multimedia content without users' direct input.
From Physiological Signals to Emotions: Implementing and Comparing Selected Methods for Feature Extraction and Classification
TLDR
This paper discussed the most important stages of a fully implemented emotion recognition system including data analysis and classification, and used a music induction method which elicits natural emotional reactions from the subject.
A Multimodal Database for Affect Recognition and Implicit Tagging
TLDR
Results show the potential uses of the recorded modalities and the significance of the emotion elicitation protocol; single-modality and modality-fusion results are reported for both emotion recognition and implicit tagging experiments.
Analysis of EEG Signals and Facial Expressions for Continuous Emotion Detection
TLDR
The effect of the contamination of EEG signals by facial muscle activity is analyzed, and it is found that most of the emotionally valuable content in EEG features is a result of this contamination; however, statistical analysis showed that EEG signals still carry complementary information in the presence of facial expressions.
EEG-based Emotion Recognition The Influence of Visual and Auditory Stimuli
Making the computer more empathic to the user is one of the aspects of affective computing. With EEG-based emotion recognition, the computer can actually take a look inside the user’s head to observe
A 3-D Audio-Visual Corpus of Affective Communication
TLDR
This work presents a new audio-visual corpus for possibly the two most important modalities used by humans to communicate their emotional states, namely speech and facial expression in the form of dense dynamic 3-D face geometries.
Validation of the Emotiv EPOC® EEG gaming system for measuring research quality auditory ERPs
TLDR
The gaming EEG system may prove a valid alternative to laboratory ERP systems for recording reliable late auditory ERPs over the frontal cortices, and also less reliable ERPs such as the MMN, if the reliability of such ERPs can be boosted to the same level as late auditory ERPs.
Classifying Affective States Using Thermal Infrared Imaging of the Human Face
  • B. Nhan, T. Chau
  • Psychology, Computer Science
    IEEE Transactions on Biomedical Engineering
  • 2010
TLDR
The results of this study suggest that classification of facial thermal infrared imaging data coupled with affect models can be used to provide information about an individual's affective state for potential use as a passive communication pathway.