In recent years, there have been many great successes in using deep architectures for unsupervised feature learning from data, especially for images and speech. In this paper, we apply recent advanced deep learning models to classify two emotional categories (positive and negative) from EEG data. We train a deep belief network (DBN) with differential entropy …
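The differential entropy (DE) feature mentioned here has a simple closed form when a band-filtered EEG segment is modeled as Gaussian: h = 0.5 ln(2πe σ²). The snippet below is a minimal sketch of that computation (the function name and per-segment usage are illustrative, not taken from the paper):

```python
import numpy as np

def differential_entropy(segment):
    """DE of a band-filtered EEG segment, under a Gaussian assumption:
    h = 0.5 * ln(2 * pi * e * var)."""
    var = np.var(segment)  # population variance of the segment
    return 0.5 * np.log(2 * np.pi * np.e * var)
```

In practice the signal would first be band-pass filtered into the usual EEG bands (delta, theta, alpha, beta, gamma) and DE computed per band and channel.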
This paper presents a new emotion recognition method that combines electroencephalograph (EEG) signals and pupillary response collected from an eye tracker. We select 15 emotional film clips of 3 categories (positive, neutral, and negative). The EEG signals and eye-tracking data of five participants are recorded simultaneously while they watch these videos. We …
This study aims to find the relationship between EEG signals and human emotional states. Movie clips are used as stimuli to evoke positive, neutral, and negative emotions in subjects. We introduce a new effective classifier named discriminative graph regularized extreme learning machine (GELM) for EEG-based emotion recognition. The average classification …
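GELM builds on the extreme learning machine (ELM), which fixes a random hidden layer and solves the output weights in closed form. A minimal plain-ELM sketch is below; the graph-regularization term that distinguishes GELM is omitted, and all names and parameters are illustrative:

```python
import numpy as np

def train_elm(X, Y, hidden=20, seed=0):
    # Basic ELM: random input weights, closed-form least-squares output weights.
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], hidden))  # fixed random input weights
    b = rng.standard_normal(hidden)                # fixed random biases
    H = np.tanh(X @ W + b)                         # hidden-layer activations
    beta = np.linalg.pinv(H) @ Y                   # min-norm least-squares fit
    return W, b, beta

def predict_elm(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta
```

For classification, Y is one-hot and the predicted class is the argmax of the output.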
Fatigue is a state of human brain activity, and driving fatigue detection is a topic of great interest all over the world. In this paper, we propose a measure of fatigue produced by eye-tracking glasses and use it as the ground truth to evaluate driving fatigue detection algorithms. In particular, PERCLOS, which is the percentage of eye closure over …
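PERCLOS over a time window can be computed as the fraction of frames in which the eye is closed. A minimal sketch, assuming per-frame closure flags are already available from the eye-tracking glasses:

```python
def perclos(closed_flags):
    # closed_flags: sequence of 0/1 per video frame (1 = eye closed).
    # PERCLOS = fraction of the window during which the eye is closed.
    return sum(closed_flags) / len(closed_flags)
```

Example: `perclos([0, 0, 1, 1, 1, 0, 0, 0, 0, 0])` → 0.3, i.e. the eye was closed for 30% of the window.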
In this paper, we first build an electroencephalogram (EEG)-based driving fatigue detection system and then propose a subject transfer framework for this system via component analysis. We apply a subspace projection approach called transfer component analysis (TCA) for subject transfer. The main idea is to learn a set of transfer components underlying …
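The idea behind TCA can be sketched with a linear kernel: find components that shrink the distribution gap (maximum mean discrepancy, MMD) between source and target subjects while preserving data variance. The rough sketch below follows the standard TCA eigenproblem under those assumptions; it is not the authors' implementation, and `mu` is an illustrative regularizer:

```python
import numpy as np

def tca_components(Xs, Xt, dim=2, mu=1.0):
    # Linear-kernel TCA sketch: solve (K L K + mu I)^{-1} K H K for its
    # top eigenvectors, where L encodes the MMD and H centers the kernel.
    X = np.vstack([Xs, Xt])
    ns, nt = len(Xs), len(Xt)
    n = ns + nt
    K = X @ X.T                                   # linear kernel matrix
    e = np.vstack([np.full((ns, 1), 1.0 / ns),
                   np.full((nt, 1), -1.0 / nt)])
    L = e @ e.T                                   # MMD coefficient matrix
    H = np.eye(n) - np.ones((n, n)) / n           # centering matrix
    M = np.linalg.solve(K @ L @ K + mu * np.eye(n), K @ H @ K)
    vals, vecs = np.linalg.eig(M)
    idx = np.argsort(-vals.real)[:dim]            # leading transfer components
    W = vecs[:, idx].real
    return K @ W                                  # embedded source+target rows
```

Source and target samples are embedded jointly, so a classifier trained on the embedded source rows can be applied directly to the embedded target rows.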
In this paper, we adopt a multimodal emotion recognition framework by combining eye movements and electroencephalography (EEG) to enhance emotion recognition. The main contributions of this paper are twofold. a) We investigate sixteen eye movements related to emotions and identify the intrinsic patterns of these eye movements for three emotional states: …
Addressing the structural and functional variability between subjects for robust affective brain-computer interface (aBCI) is challenging but of great importance, since the calibration phase for aBCI is time-consuming. In this paper, we propose a subject transfer framework for electroencephalogram (EEG)-based emotion recognition via component analysis. We …
Various studies have shown that traditional electrooculograms (EOGs) are effective for driving fatigue detection. However, the electrodes of the traditional EOG recording method are placed around the eyes, which may disturb the subjects' activities and is not convenient for practical applications. To deal with this problem, we propose a novel electrode …
EEG signals, which record the electrical activity along the scalp, provide researchers with a reliable channel for investigating human emotional states. In this paper, a new algorithm, manifold regularized extreme learning machine (MRELM), is proposed for recognizing human emotional states (positive, neutral, and negative) from EEG data, which were previously …