DREAMER: A Database for Emotion Recognition Through EEG and ECG Signals From Wireless Low-cost Off-the-Shelf Devices

@article{Katsigiannis2018DREAMERAD,
  title={DREAMER: A Database for Emotion Recognition Through EEG and ECG Signals From Wireless Low-cost Off-the-Shelf Devices},
  author={Stamos Katsigiannis and Naeem Ramzan},
  journal={IEEE Journal of Biomedical and Health Informatics},
  year={2018},
  volume={22},
  pages={98-107}
}
In this paper, we present DREAMER, a multimodal database consisting of electroencephalogram (EEG) and electrocardiogram (ECG) signals recorded during affect elicitation by means of audio-visual stimuli. All the signals were captured using portable, wearable, wireless, low-cost, and off-the-shelf equipment that has the potential to allow the use of affective computing methods in everyday applications.
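
As a concrete illustration of how such portable EEG and ECG recordings could be turned into inputs for affective computing, the sketch below extracts simple EEG band-power and ECG heart-rate features from a single trial. The sampling rates, frequency bands, channel count, and helper names (`EEG_FS`, `BANDS`, `trial_features`) are illustrative assumptions, not the exact pipeline of the DREAMER paper.

```python
# Minimal sketch (assumptions, not the paper's exact method): per-trial EEG
# band-power and ECG heart-rate features from a low-cost wearable recording.
import numpy as np
from scipy.signal import welch, find_peaks

EEG_FS = 128   # assumed EEG sampling rate (Hz)
ECG_FS = 256   # assumed ECG sampling rate (Hz)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def eeg_band_powers(eeg, fs=EEG_FS):
    """eeg: array of shape (n_channels, n_samples); mean PSD per band per channel."""
    freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2, axis=-1)
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(psd[:, mask].mean(axis=-1))
    return np.concatenate(feats)               # shape: (n_bands * n_channels,)

def ecg_heart_rate(ecg, fs=ECG_FS):
    """Very rough mean heart rate (bpm) from R-peak spacing of a 1-D ECG lead."""
    peaks, _ = find_peaks(ecg, distance=int(0.4 * fs), prominence=np.std(ecg))
    rr = np.diff(peaks) / fs                    # R-R intervals in seconds
    return np.array([60.0 / rr.mean()]) if len(rr) else np.array([np.nan])

def trial_features(eeg, ecg):
    return np.concatenate([eeg_band_powers(eeg), ecg_heart_rate(ecg)])

# Example with synthetic data: 14 EEG channels, 10-second trial (sizes assumed).
rng = np.random.default_rng(0)
feat = trial_features(rng.normal(size=(14, EEG_FS * 10)), rng.normal(size=ECG_FS * 10))
print(feat.shape)   # (3 bands * 14 channels + 1 heart-rate feature,) = (43,)
```

Feature vectors built this way can then be paired with the self-assessed valence and arousal ratings and passed to any standard classifier.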

Citing Papers

EDOSE: Emotion Datasets from Open Source EEG with a Real-Time Bracelet Sensor

The experimental results for the proposed dataset, EDOSE, outperformed those from state-of-the-art EEG datasets for affective computing, namely DEAP, MAHNOB-HCI, DECAF, and DREAMER.

Consumer Grade Brain Sensing for Emotion Recognition

OpenBCI is evaluated by first comparing its performance to a research-grade EEG system using the same algorithms applied to benchmark datasets, and a novel method is proposed to facilitate the selection of audio-visual stimuli of high/low valence and arousal for emotion classification.

High-wearable EEG-Based Detection of Emotional Valence for Scientific Measurement of Emotions

An emotional-valence detection method for a highly wearable EEG-based system is proposed. Valence detection occurs along the interval scale theorized by the circumplex model of emotions.

Decoding the Neural Signatures of Valence and Arousal From Portable EEG Headset

This paper focuses on classifying emotions on the valence-arousal plane using various feature extraction, feature selection, and machine learning techniques, and proposes an optimal set of features and electrodes for emotion recognition.
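
In the spirit of the summary above, the sketch below shows a generic feature-selection-plus-classification pipeline for binary valence labels using scikit-learn. The data shapes, the number of selected features, and the SVM settings are assumptions for illustration, not the cited paper's exact method.

```python
# Illustrative valence classifier: standardize -> keep k best features by
# ANOVA F-score -> RBF SVM. X and y_valence below are synthetic placeholders.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def valence_pipeline(k_best=20):
    return make_pipeline(StandardScaler(),
                         SelectKBest(f_classif, k=k_best),
                         SVC(kernel="rbf", C=1.0))

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 64))            # 120 trials x 64 EEG features (assumed)
y_valence = rng.integers(0, 2, size=120)  # high/low valence labels (placeholder)
scores = cross_val_score(valence_pipeline(), X, y_valence, cv=5)
print("CV accuracy:", scores.mean())
```

The features surviving `SelectKBest` can also be traced back to their electrodes, which is one simple way to reason about which channels carry the most valence-related information.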

Deep Neural Network for Electroencephalogram based Emotion Recognition

This paper proposes predicting valence, arousal, dominance, and liking from EEG signals using a deep neural network (DNN); the combination of features with the leaky ReLU activation is found to perform best.
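
A rough sketch of such a DNN with leaky ReLU activations is given below, formulated here as regression of the four ratings from an EEG feature vector. The layer widths, negative slope, loss, and rating scale are assumptions, not the cited paper's exact architecture.

```python
# Hypothetical DNN with leaky ReLU predicting valence, arousal, dominance, liking.
import torch
import torch.nn as nn

class EmotionDNN(nn.Module):
    def __init__(self, n_features: int, n_targets: int = 4, slope: float = 0.01):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, 256), nn.LeakyReLU(slope),
            nn.Linear(256, 128),        nn.LeakyReLU(slope),
            nn.Linear(128, 64),         nn.LeakyReLU(slope),
            nn.Linear(64, n_targets),   # valence, arousal, dominance, liking
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

# One training step on placeholder data.
model = EmotionDNN(n_features=160)
optim = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(32, 160)          # batch of 32 feature vectors (assumed size)
y = torch.rand(32, 4) * 8 + 1     # ratings on a 1-9 scale (assumption)
loss = nn.functional.mse_loss(model(x), y)
optim.zero_grad(); loss.backward(); optim.step()
```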

A Survey on EEG-Based Solutions for Emotion Recognition With a Low Number of Channels

This paper presents a review of studies using fewer than 16 channels for electroencephalography (EEG)-based emotion recognition, and considers the most informative channels for the valence dimension according to both data-driven and neurophysiological prior-knowledge approaches.

Emotion Recognition Using a Reduced Set of EEG Channels Based on Holographic Feature Maps

Holographic features are implemented, electrode selection is investigated, and the channel-selection methods significantly improve emotion recognition rates, with an accuracy of 90.76% for valence, 92.92% for arousal, and 92.97% for dominance.

CNN and LSTM-Based Emotion Charting Using Physiological Signals

A strong correlation between spectral and hidden-layer feature analysis and classification performance suggests the efficacy of the proposed method for significant feature extraction and higher emotion elicitation performance in a broader context of less constrained environments.

MPED: A Multi-Modal Physiological Emotion Database for Discrete Emotion Recognition

A multi-modal physiological emotion database is designed and built, collecting four modalities of physiological signals, i.e., electroencephalogram (EEG), galvanic skin response, respiration, and electrocardiogram (ECG), and a novel attention-long short-term memory (A-LSTM) is proposed, which strengthens the effectiveness of useful sequences to extract more discriminative features.
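
To make the attention idea concrete, the sketch below applies a generic attention mechanism over LSTM outputs so that more useful time steps receive larger weights before classification. The dimensions, number of emotion classes, and the particular attention form are assumptions; this is not the MPED paper's exact A-LSTM formulation.

```python
# Generic attention-over-LSTM sketch for sequence-level emotion classification.
import torch
import torch.nn as nn

class AttentionLSTM(nn.Module):
    def __init__(self, n_inputs: int, n_hidden: int = 64, n_classes: int = 7):
        super().__init__()
        self.lstm = nn.LSTM(n_inputs, n_hidden, batch_first=True)
        self.attn = nn.Linear(n_hidden, 1)        # scores each time step
        self.head = nn.Linear(n_hidden, n_classes)

    def forward(self, x):                         # x: (batch, time, n_inputs)
        h, _ = self.lstm(x)                       # (batch, time, n_hidden)
        w = torch.softmax(self.attn(h), dim=1)    # attention weights over time
        context = (w * h).sum(dim=1)              # weighted sum of hidden states
        return self.head(context)                 # class logits

logits = AttentionLSTM(n_inputs=32)(torch.randn(8, 100, 32))
print(logits.shape)   # torch.Size([8, 7])
```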

On the benefits of using Hidden Markov Models to predict emotions

This work describes a technique to predict emotional states in Russell's two-dimensional emotion space (valence and arousal) from electroencephalography, electrocardiography, and electromyography signals using Hidden Markov Models.
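
A minimal sketch of the HMM idea, assuming the hmmlearn library, is shown below: a Gaussian HMM is fit to a stream of physiological feature windows and its hidden states are read as coarse regions of the valence-arousal plane. The feature content, number of states, and the state-to-quadrant mapping are assumptions, not the cited work's exact setup.

```python
# Hypothetical Gaussian-HMM sketch over physiological feature windows.
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))   # 500 time windows x 6 EEG/ECG/EMG features (placeholder)

hmm = GaussianHMM(n_components=4, covariance_type="diag", n_iter=100, random_state=0)
hmm.fit(X)                      # unsupervised fit of transition and emission parameters
states = hmm.predict(X)         # most likely hidden state per time window (Viterbi)

# A hypothetical mapping from hidden states to valence-arousal quadrants:
quadrants = {0: "high-V/high-A", 1: "high-V/low-A", 2: "low-V/high-A", 3: "low-V/low-A"}
print([quadrants[int(s)] for s in states[:10]])
```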
...

References

Showing 1-10 of 44 references

DEAP: A Database for Emotion Analysis Using Physiological Signals

A multimodal data set for the analysis of human affective states is presented, and a novel method for stimulus selection is proposed using retrieval by affective tags from the last.fm website, video highlight detection, and an online assessment tool.

DECAF: MEG-Based Multimodal Database for Decoding Affective Physiological Responses

DECAF is presented together with a detailed analysis of the correlations between participants' self-assessments and their physiological responses, and single-trial classification results for valence, arousal, and dominance, with performance evaluation against existing data sets.

Multimedia implicit tagging using EEG signals

M. Soleymani and M. Pantic, 2013 IEEE International Conference on Multimedia and Expo (ICME), 2013
Electroencephalogram (EEG) signals reflect brain activities associated with emotional and cognitive processes and can be used to find tags for multimedia content without users' direct input.

From Physiological Signals to Emotions: Implementing and Comparing Selected Methods for Feature Extraction and Classification

This paper discusses the most important stages of a fully implemented emotion recognition system, including data analysis and classification, and uses a music induction method that elicits natural emotional reactions from the subject.

A Multimodal Database for Affect Recognition and Implicit Tagging

Results show the potential uses of the recorded modalities and the significance of the emotion elicitation protocol, and single-modality and modality-fusion results are reported for both emotion recognition and implicit tagging experiments.

Analysis of EEG Signals and Facial Expressions for Continuous Emotion Detection

The effect of contamination by facial muscle activity on EEG signals is analyzed, and it is found that most of the emotionally valuable content in EEG features results from this contamination; however, the statistical analysis shows that EEG signals still carry complementary information in the presence of facial expressions.

EEG-based Emotion Recognition: The Influence of Visual and Auditory Stimuli

A research project to recognize emotion from brain signals measured with the BraInquiry EEG PET device is described, establishing a suitable approach and determining the optimal placement of a limited number of electrodes for emotion recognition.

A 3-D Audio-Visual Corpus of Affective Communication

This work presents a new audio-visual corpus for possibly the two most important modalities used by humans to communicate their emotional states, namely speech and facial expression in the form of dense dynamic 3-D face geometries.

Validation of the Emotiv EPOC® EEG gaming system for measuring research quality auditory ERPs

The gaming EEG system may prove a valid alternative to laboratory ERP systems for recording reliable late auditory ERPs over the frontal cortices, and less reliable ERPs, such as the MMN, if the reliability of such ERPs can be boosted to the same level as late auditory ERPs.

Classifying Affective States Using Thermal Infrared Imaging of the Human Face

B. Nhan and T. Chau, IEEE Transactions on Biomedical Engineering, 2010
The results of this study suggest that classification of facial thermal infrared imaging data coupled with affect models can be used to provide information about an individual's affective state for potential use as a passive communication pathway.