Yuliang Ma¹, Xiaohui Ding¹, Qingshan She¹, Zhizeng Luo¹, Thomas Potter², Yingchun Zhang³.
Abstract
Support vector machines are powerful tools used to solve the small sample and nonlinear classification problems, but their ultimate classification performance depends heavily upon the selection of appropriate kernel and penalty parameters. In this study, we propose using a particle swarm optimization algorithm to optimize the selection of both the kernel and penalty parameters in order to improve the classification performance of support vector machines. The performance of the optimized classifier was evaluated with motor imagery EEG signals in terms of both classification and prediction. Results show that the optimized classifier can significantly improve the classification accuracy of motor imagery EEG signals.Entities:
Year: 2016 PMID: 27313656 PMCID: PMC4904086 DOI: 10.1155/2016/4941235
Source DB: PubMed Journal: Comput Math Methods Med ISSN: 1748-670X Impact factor: 2.238
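The abstract describes tuning the SVM penalty parameter C and the kernel parameter γ with particle swarm optimization, using classification accuracy as the fitness. A minimal sketch of that idea, assuming an RBF-kernel SVM on synthetic data; the swarm size, inertia weight w, acceleration coefficients c1/c2, and search ranges are illustrative choices, not values from the paper:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

def fitness(params):
    """Mean cross-validated accuracy for a (log10 C, log10 gamma) pair."""
    C, gamma = 10.0 ** params  # search in log10 space to keep C, gamma > 0
    return cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=3).mean()

# Search bounds: log10(C) in [-2, 3], log10(gamma) in [-4, 1].
n_particles, n_iter = 10, 15
lo, hi = np.array([-2.0, -4.0]), np.array([3.0, 1.0])
pos = rng.uniform(lo, hi, size=(n_particles, 2))   # particle positions
vel = np.zeros_like(pos)                           # particle velocities
pbest = pos.copy()                                 # per-particle best positions
pbest_fit = np.array([fitness(p) for p in pos])
gbest = pbest[pbest_fit.argmax()].copy()           # global best position
gbest_fit = pbest_fit.max()

w, c1, c2 = 0.7, 1.5, 1.5  # inertia and acceleration coefficients
for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, 2))
    # Standard PSO velocity update: inertia + cognitive + social terms.
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    fits = np.array([fitness(p) for p in pos])
    better = fits > pbest_fit
    pbest[better], pbest_fit[better] = pos[better], fits[better]
    if fits.max() > gbest_fit:
        gbest_fit = fits.max()
        gbest = pos[fits.argmax()].copy()

best_C, best_gamma = 10.0 ** gbest
print(f"best C={best_C:.3g}, gamma={best_gamma:.3g}, CV accuracy={gbest_fit:.3f}")
```

Cross-validated accuracy serves as the fitness function, so the swarm directly maximizes the quantity the paper reports; searching in log10 space keeps both parameters positive and lets the swarm cover several orders of magnitude.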
Figure 1: The timing chart of a single motor imagery experiment.
The number of training and test trials for each of the five subjects.
| Subject | Training trials | Test trials |
|---|---|---|
| aa | 168 | 112 |
| al | 224 | 56 |
| av | 84 | 196 |
| aw | 56 | 224 |
| ay | 28 | 252 |
Figure 2: The timing chart of a single motor imagery experiment.
Figure 3: The flowchart of PSO-optimized SVM parameter selection.
Figure 4: The fitness curve of the particle swarm optimization.
Figure 5: Classification accuracy before parameter optimization.
Figure 6: Classification accuracy after parameter optimization.
The classification results of PSO-SVM and traditional methods on the 2005 Dataset IVa.
| Accuracy (%) | aa (100) Max | aa (100) Avg | al (200) Max | al (200) Avg | av (80) Max | av (80) Avg | aw (56) Max | aw (56) Avg |
|---|---|---|---|---|---|---|---|---|
| Decision tree | 82.5 | 79.8 | 74.6 | 71.5 | 74.2 | 71.2 | 75.4 | 72.8 |
| BP | 71.4 | 70.2 | 85.8 | 84.2 | 81.6 | 79.4 | 72.2 | 70.6 |
| KNN | 94.6 | 92.5 | 87.6 | 85.8 | 82.4 | 80.1 | 89.2 | 87.8 |
| LDA | 96.2 | 95.4 | 92.7 | 90.2 | 82.8 | 80.2 | 91.6 | 90.2 |
| SVM | 97.7 | 96.4 | 92.5 | 89.4 | 81.5 | 77.6 | 90.5 | 88.9 |
| PSO-SVM | 98.1 | 97.0 | 93.9 | 91.7 | 82.0 | 80.5 | 92.3 | 91.6 |
Figure 7: Classification accuracy of PSO-SVM compared with traditional methods on the 2005 Dataset IVa.
The classification results of PSO-SVM and traditional methods for 2008 Dataset 1.
| Accuracy (%) | a Max | a Avg | b Max | b Avg | f Max | f Avg | e Max | e Avg |
|---|---|---|---|---|---|---|---|---|
| Decision tree | 84.5 | 81.4 | 74.6 | 72.5 | 86.2 | 84.5 | 84.6 | 78.2 |
| BP | 85.6 | 82.2 | 74.8 | 73.2 | 87.6 | 85.4 | 83.4 | 81.5 |
| KNN | 90.6 | 86.5 | 78.6 | 76.8 | 91.4 | 87.1 | 84.2 | 82.8 |
| LDA | 89.4 | 86.2 | 80.5 | 76.6 | 90.8 | 86.0 | 86.4 | 82.2 |
| SVM | 91.7 | 86.8 | 78.0 | 76.3 | 95.0 | 86.1 | 90.5 | 78.3 |
| PSO-SVM | 91.3 | 88.1 | 83.5 | 80.1 | 95.2 | 89.7 | 92.0 | 83.1 |
Figure 8: Classification accuracy of PSO-SVM compared with traditional methods on the 2008 Dataset 1.