| Literature DB >> 34902609 |
Rui Li, Chao Ren, Xiaowei Zhang, Bin Hu.
Abstract
Emotion recognition is a vital but challenging step in creating passive brain-computer interface applications. In recent years, many studies on electroencephalogram (EEG)-based emotion recognition have been conducted. Ensemble learning has been widely used in emotion recognition because of its superior accuracy and generalization. In this study, we proposed a novel ensemble learning method based on multiple objective particle swarm optimization for subject-independent EEG-based emotion recognition. First, we used a 4 s sliding time window with a 2 s overlap to extract 13 different features from EEG signals and construct a feature vector. Then, we employed L1 regularization to select effective features. Second, a model selection method was applied to choose the optimal basic analysis submodels. Afterward, we proposed an ensemble operator that converts the classification results of a single model from discrete values to continuous values to better characterize the classification results. Subsequently, multiple objective particle swarm optimization was adopted to confirm the optimal parameters of the ensemble learning model. Finally, we conducted extensive experiments on two public datasets: DEAP and SEED. Considering the generalization of the model, we applied leave-one-subject-out cross-validation to evaluate the performance of the model. The experimental results demonstrate that the proposed method achieves a better recognition performance than single methods, commonly used ensemble learning methods, and state-of-the-art methods. The average accuracies for arousal and valence are 65.70% and 64.22%, respectively, on the DEAP database, and the average accuracy on the SEED database is 84.44%.Entities:
Keywords: EEG; Emotion recognition; Ensemble learning; Multiple objective particle swarm optimization
Year: 2021 PMID: 34902609 DOI: 10.1016/j.compbiomed.2021.105080
Source DB: PubMed Journal: Comput Biol Med ISSN: 0010-4825 Impact factor: 4.589