Xin Chai, Qisong Wang, Yongping Zhao, Yongqiang Li, Dan Liu, Xin Liu, Ou Bai.
Abstract
Electroencephalography (EEG)-based emotion recognition is an important element in psychiatric health diagnosis. However, the underlying EEG sensor signals are non-stationary when sampled from different experimental sessions or subjects, which degrades classification performance. Domain adaptation methods offer an effective way to reduce the discrepancy between marginal distributions. For EEG sensor signals, however, both the marginal and the conditional distributions may be mismatched, and existing domain adaptation strategies typically require substantial additional computation. To address this problem, a novel strategy named adaptive subspace feature matching (ASFM) is proposed in this paper to integrate both the marginal and conditional distributions within a unified framework, without any labeled samples from target subjects. Specifically, we develop a linear transformation function which matches the marginal distributions of the source and target subspaces without a regularization term, which significantly decreases the time complexity of the domain adaptation procedure. As a result, both marginal and conditional distribution discrepancies between the source domain and the unlabeled target domain can be reduced, and logistic regression (LR) can be trained on the aligned source domain to obtain a classifier for the target domain, since the aligned source domain follows a distribution similar to that of the target domain. We compare ASFM with six typical approaches on a public EEG dataset with three affective states: positive, neutral, and negative. Both offline and online evaluations were performed.
The subject-to-subject offline experimental results demonstrate that our method achieves a mean accuracy and standard deviation of 80.46% and 6.84%, respectively, compared with 77.88% and 7.33% for a state-of-the-art method, the subspace alignment auto-encoder (SAAE). In the online analysis, the average classification accuracy and standard deviation of ASFM in the subject-to-subject evaluation over all 15 subjects in the dataset were 75.11% and 7.65%, respectively, a significant improvement over the best baseline, LR, which achieved 56.38% and 7.48%. The experimental results confirm the effectiveness of the proposed method relative to state-of-the-art methods. Moreover, the computational efficiency of ASFM is much better than that of standard domain adaptation; if the numbers of training and test samples are kept within a certain range, the method is suitable for real-time classification. It can be concluded that ASFM is a useful and effective tool for decreasing domain discrepancy and reducing performance degradation across subjects and sessions in the field of EEG-based emotion recognition.
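The marginal-adaptation step the abstract describes can be illustrated with a minimal sketch. This is not the authors' implementation: the SVD-based PCA, the toy data, and the function names are assumptions. It shows the general subspace-alignment idea the abstract outlines: project source and target data into their PCA subspaces, rotate the source basis onto the target's with a closed-form matrix (no regularization term, hence no iterative optimization), and train LR on the aligned source only.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def pca_basis(X, dim):
    """Top-`dim` principal directions of X (rows are samples)."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Vt[:dim].T                      # shape: (n_features, dim)

def align_source_to_target(Xs, Xt, dim):
    """Subspace-alignment-style marginal adaptation: project the source
    into its PCA subspace, then rotate that subspace onto the target's
    with the closed-form matrix M = Ps.T @ Pt (no regularizer)."""
    Ps, Pt = pca_basis(Xs, dim), pca_basis(Xt, dim)
    M = Ps.T @ Pt                          # linear alignment matrix
    return Xs @ Ps @ M, Xt @ Pt            # aligned source, projected target

# Toy demo: the "target subject" is a linearly distorted copy of the source.
rng = np.random.default_rng(0)
Xs = rng.normal(size=(200, 10))
Xs[:, 0] *= 3.0                            # make one direction dominant
ys = (Xs[:, 0] > 0).astype(int)
Xt = Xs @ np.diag(rng.uniform(0.8, 1.2, 10)) + 0.1
Zs, Zt = align_source_to_target(Xs, Xt, dim=5)
clf = LogisticRegression(max_iter=1000).fit(Zs, ys)  # aligned source only
acc = (clf.predict(Zt) == ys).mean()       # target labels used for eval only
```

The closed-form rotation is what keeps this step cheap relative to optimization-based adaptation: it costs two truncated SVDs and a few matrix products.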
Keywords: Electroencephalography (EEG); domain adaptation; emotion recognition
Year: 2017 PMID: 28467371 PMCID: PMC5469537 DOI: 10.3390/s17051014
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Figure 1. The flowchart of the electroencephalography (EEG) emotion recognition system integrating the proposed domain adaptation method.
Figure 2. Protocol for the electroencephalography-based emotion recognition experiment (adapted from [13]).
Figure 3. Architecture of the fast marginal distribution adaptation strategy. PCA: principal component analysis.
Details of parameters used in the different methods. SVM: support vector machine; LR: logistic regression; AE: auto-encoder; TSC: transfer sparse coding; TCA: transfer component analysis; TJM: transfer joint matching.
| Method | Parameter Details |
|---|---|
| SVM | Linear kernel; |
| LR | L2-regularized logistic regression; |
| AE | Structure with 2 hidden layers; |
| TSC | Number of basis vectors was 150; |
| TCA | Subspace bases was set as 80; |
| TJM | Subspace bases was set as 80; |
| ASFM | The dimension of the PCA subspace is set to all; |
Average classification accuracy and standard deviation (%) of the 15 subjects for each experimental session using the leave-one-subject-out cross-validation method. Reported results from the original papers are marked with *. SAAE: subspace alignment auto-encoder; SFM: subspace feature matching. Bold numbers indicate the best results.
| Method | Experimental Session 1 | Experimental Session 2 | Experimental Session 3 | Average |
|---|---|---|---|---|
| SVM | 60.31/9.53 | 53.37/12.97 | 58.20/12.40 | 57.29/7.42 |
| LR | 59.84/10.12 | 54.32/14.42 | 58.12/12.58 | 57.43/7.95 |
| AE | 65.64/10.44 | 55.82/11.52 | 63.67/12.31 | 61.71/8.09 |
| TSC | 73.33/7.61 | 72.16/9.08 | 67.03/9.79 | 70.84/6.44 |
| TCA | 75.91/11.52 | 74.19/12.26 | 75.87/6.87 | 75.32/11.07 |
| TJM | 77.62/10.90 | 74.30/12.06 | 76.79/5.69 | 76.24/10.38 |
| SAAE * | 80.22/8.00 | 74.68/12.76 | 78.73/12.96 | 77.88/7.33 |
| SFM | 80.34/6.13 | 74.81/10.51 | 77.74/10.15 | 77.63/5.84 |
| ASFM | | | | 80.46/6.84 |
Average classification accuracy and standard deviation (%) of training and test data from different sessions. Reported results from the original papers are marked with *. GELM: graph-regularized extreme learning machine. Bold numbers indicate the best results.
| Method | Task Group 1 | Task Group 2 | Task Group 3 | Task Group 4 | Task Group 5 | Task Group 6 | Average |
|---|---|---|---|---|---|---|---|
| SVM | 67.63/12.73 | 64.21/13.00 | 67.62/14.73 | 70.37/19.48 | 72.69/14.21 | 66.64/14.38 | 68.19/11.41 |
| LR | 60.64/17.38 | 60.42/14.61 | 58.82/13.97 | 65.30/11.37 | 63.89/18.44 | 63.85/15.79 | 62.15/9.51 |
| GELM * | 72.55/10.29 | 67.22/10.42 | 75.86/7.71 | 76.62/15.34 | 76.28/11.47 | 78.17/13.41 | 74.45/8.20 |
| AE | 76.66/8.92 | 75.30/10.83 | 77.47/11.54 | 77.20/15.35 | 77.02/12.81 | 78.21/13.15 | 76.98/9.52 |
| TSC | 79.85/12.12 | 80.71/10.70 | 82.07/8.08 | 80.24/10.32 | 79.92/7.71 | 77.99/11.36 | 80.13/8.52 |
| TCA | 81.56/11.52 | 79.35/12.26 | 81.56/6.87 | 82.83/11.07 | 80.84/8.00 | 77.97/13.90 | 80.68/7.70 |
| TJM | 82.43/10.90 | 80.89/12.06 | 82.36/5.69 | 84.04/10.38 | 80.57/9.97 | 79.09/11.69 | 81.56/7.47 |
| SAAE * | 80.04/13.03 | 84.31/7.21 | 83.09/9.98 | 80.20/9.99 | 78.77/10.49 | 81.81/7.56 | |
| SFM | 83.09/10.12 | 81.61/10.42 | 84.11/6.23 | 84.21/8.19 | 84.62/9.59 | 82.16/11.06 | 83.30/7.12 |
| ASFM | 84.34/10.18 | | | | | | |
Figure 4. The flow of the online experiment.
Details of parameters used in different methods.
| Method | Parameter Details |
|---|---|
| SVM | Linear kernel; |
| LR | L2-regularized logistic regression; |
| SFM | The dimension of the PCA subspace is set to all. |
| ASFM | The dimension of the PCA subspace is set to all; |
Figure 5. Average classification accuracy and standard deviation (%) of each subject using the leave-one-subject-out cross-validation method for the online evaluation.
Figure 6. Average classification accuracy and standard deviation (%) of training and test data from different sessions for the online evaluation.
Time complexity of all baseline methods. The reported results from the original papers are marked with *.
| Method | SVM | LR | AE | TSC | TCA | TJM | SAAE * | SFM | ASFM |
|---|---|---|---|---|---|---|---|---|---|
| Training time | 0.20 | 0.29 | 90.14 | 246.39 | 58.08 | 164.92 | 121.81 | 0.39 | 1.39 |
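The order-of-magnitude gap between the subspace methods (SVM, LR, SFM, ASFM) and the optimization-based baselines is consistent with the abstract's point that ASFM's alignment is a closed-form linear step. A hedged sketch of how such a training-time measurement could be taken; the data sizes and the 100-dimensional subspace are illustrative assumptions, not the paper's benchmark setup.

```python
import time
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 310))   # feature dimension chosen for illustration
y = rng.integers(0, 3, size=1000)  # three affective states

t0 = time.perf_counter()
# Closed-form subspace step: one SVD plus matrix products,
# no iterative optimization and no regularization hyperparameter.
_, _, Vt = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)
P = Vt[:100].T                     # 100-dimensional PCA basis
Z = X @ P                          # projected features
LogisticRegression(max_iter=1000).fit(Z, y)
elapsed = time.perf_counter() - t0
```

On data of this scale the whole step completes in well under a second on commodity hardware, which is the regime the abstract calls suitable for real-time classification.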
Figure 7. Average classification accuracy of the ASFM method with varying subspace dimension.
Time complexity of ASFM with varying subspace dimension.
| Subspace Dimension | 10 | 30 | 50 | 70 | 90 | 110 | 130 | 150 | 250 | 325 |
|---|---|---|---|---|---|---|---|---|---|---|
| Training time | 0.19 | 0.26 | 0.30 | 0.37 | 0.45 | 0.50 | 0.58 | 0.63 | 1.01 | 1.39 |
Figure 8. Feature distributions of the first two dimensions between training and testing sessions for Subject 1.
Figure 9. The profile of accuracy varying with time in the session-to-session and subject-to-subject experiments for Subject 1.
Classification accuracy (%) of ASFM using different values of the threshold.
| Threshold | 0.1 | 0.2 | 0.3 | 0.4 | 0.45 | 0.5 | 0.6 | 0.7 | 0.8 | 0.9 |
|---|---|---|---|---|---|---|---|---|---|---|
| Classification accuracy (%) | 77.45 | 77.48 | 78.23 | 79.49 | 80.46 | 79.73 | 78.31 | 77.95 | 77.82 | 77.71 |
Classification accuracy (%) of ASFM using different numbers of iterations.
| Number of Iterations | 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 |
|---|---|---|---|---|---|---|---|---|
| Classification accuracy (%) | 77.63 | 80.46 | 80.69 | 80.90 | 81.05 | 81.08 | 81.09 | 81.09 |
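The threshold and iteration sweeps above correspond to the conditional-distribution step, in which target samples predicted with high confidence are pseudo-labeled and fed back into training; accuracy rises with the first few iterations and then plateaus. The paper's exact selection rule is not reproduced here, so the loop below is a generic self-training sketch under that interpretation: the function name, the toy data, and the stopping behavior are assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def adapt_conditional(Xs, ys, Xt, threshold=0.45, n_iter=5):
    """Generic self-training loop: after marginal alignment, assign
    pseudo-labels to target samples whose predicted-class probability
    exceeds `threshold`, then retrain on source + confident target."""
    clf = LogisticRegression(max_iter=1000).fit(Xs, ys)
    for _ in range(n_iter):
        proba = clf.predict_proba(Xt)
        keep = proba.max(axis=1) >= threshold   # confident target samples
        if not keep.any():
            break
        pseudo = proba.argmax(axis=1)           # pseudo-labels
        X_aug = np.vstack([Xs, Xt[keep]])
        y_aug = np.concatenate([ys, pseudo[keep]])
        clf = LogisticRegression(max_iter=1000).fit(X_aug, y_aug)
    return clf

# Toy demo: the target is a shifted copy of the source.
rng = np.random.default_rng(1)
Xs = rng.normal(size=(150, 4))
ys = (Xs[:, 0] > 0).astype(int)
Xt = Xs + 0.2
model = adapt_conditional(Xs, ys, Xt, threshold=0.45, n_iter=5)
acc = (model.predict(Xt) == ys).mean()
```

The threshold trades off pseudo-label quantity against quality, which matches the single peak at 0.45 in the table above: too low admits noisy labels, too high discards useful target samples.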