Daniel Ramírez-Martínez, Mariel Alfaro-Ponce, Oleksiy Pogrebnyak, Mario Aldape-Pérez, Amadeo-José Argüelles-Cruz.
Abstract
Classification of electromyographic signals has a wide range of applications, from the clinical diagnosis of different muscular diseases to biomedical engineering, where their use as input for the control of prosthetic devices has become a hot topic of research. The challenge of classifying these signals lies in the accuracy of the proposed algorithm and the possibility of its implementation in hardware. This paper considers the problem of electromyography signal classification, solved with the proposed signal-processing and feature-extraction stages, with the focus lying on the signal model and time-domain characteristics for better classification accuracy. The proposal considers a simple preprocessing technique that produces signals suitable for feature extraction, and the Burg reflection coefficients to form learning and classification patterns. These coefficients yield a competitive classification rate compared with the time-domain features used. Feature extraction from electromyographic signals can also produce traits of little use to machine-learning models; applying feature-selection algorithms therefore provides higher classification performance with as few traits as possible. The algorithms achieved classification rates of up to 100% with low pattern dimensionality, combined with other kinds of uncorrelated attributes for hand-movement identification.
Keywords: classification algorithms; electromyography; feature selection; hand movement; health monitoring; machine learning; maximum entropy reflection coefficients
Year: 2019 PMID: 30682797 PMCID: PMC6387220 DOI: 10.3390/s19030475
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Time-domain and frequency-domain features used in sEMG data processing and classification tasks.
| No. | Feature | Abbreviation |
|---|---|---|
| 1 | Root mean squared value | RMS |
| 2 | Mean average value | MAV |
| 3 | Variance | VAR |
| 4 | Willison amplitude | WAMP |
| 5 | Wavelength | WL |
| 6 | Auto-regressive | AR |
| 7 | Difference absolute mean value | DAMV |
| 8 | Difference absolute standard deviation value | DASDV |
| 9 | Difference absolute variance | DVARV |
| 10 | Difference absolute standard deviation | DASDV |
| 11 | Second order moment | M2 |
| 12 | Integrated EMG | IEMG |
| 13 | Simple squared integration | SSI |
| 14 | Myopulse percentage rate | MYOP |
| 15 | Cepstral coefficients | CC |
| 16 | Log detector | LOG |
| 17 | Temporal moment | TK |
| 18 | V order | V |
| 19 | Zero crossings | ZC |
| 20 | Slope sign change | SSC |
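Several of the time-domain features in the table above reduce to one-line expressions over a windowed signal. A minimal NumPy sketch of a subset of them follows; the threshold values for WAMP and MYOP are illustrative assumptions, not the settings used in the paper:

```python
import numpy as np

def td_features(x, wamp_thr=0.01, myop_thr=0.016):
    """Compute a few of the listed time-domain features for one
    sEMG analysis window x (1-D array). Thresholds are assumed."""
    dx = np.diff(x)
    return {
        "RMS": np.sqrt(np.mean(x**2)),          # root mean squared value
        "MAV": np.mean(np.abs(x)),              # mean average value
        "VAR": np.var(x, ddof=1),               # variance
        "WL": np.sum(np.abs(dx)),               # wavelength
        "SSI": np.sum(x**2),                    # simple squared integration
        "WAMP": np.sum(np.abs(dx) > wamp_thr),  # Willison amplitude
        "MYOP": np.mean(np.abs(x) > myop_thr),  # myopulse percentage rate
        "ZC": np.sum(x[:-1] * x[1:] < 0),       # zero crossings (strict sign change)
        "SSC": np.sum(dx[:-1] * dx[1:] < 0),    # slope sign changes
    }
```

Each window of raw sEMG is thus mapped to a fixed-length feature vector that can feed any of the classifiers discussed later.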
Feature extraction techniques for hand movement classification applied to the EMG dataset from the University of California at Irvine (UCI) machine learning repository.
| Research Group | Algorithm | Accuracy |
|---|---|---|
| […] | Neural Network after Empirical Mode Decomposition (EMD) | 85% |
| […] | Adaptive Boosting after EMD | 55% |
| […] | Linear Discriminant Analysis after EMD | 65% |
| […] | Random Forest after EMD | 91% |
| […] | Random Forest + PCA after EMD | 94% |
| […] | Singular-Value Decomposition with SVM | 98.22% |
| […] | k-Nearest Neighbor | 94.77% |
| […] | Naive Bayes | 91.66% |
| […] | Radial Basis Function Network | 94% |
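For orientation, one of the simpler baselines in the table, k-nearest neighbor, can be written in a few lines of plain NumPy. This is an illustrative sketch, not the implementation used by any of the cited groups:

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=5):
    """Minimal k-nearest-neighbor classifier: majority vote
    among the k training samples closest in Euclidean distance."""
    preds = []
    for x in X_test:
        d = np.linalg.norm(X_train - x, axis=1)   # distances to all training points
        nearest = y_train[np.argsort(d)[:k]]      # labels of the k closest
        vals, counts = np.unique(nearest, return_counts=True)
        preds.append(vals[np.argmax(counts)])     # majority vote
    return np.array(preds)
```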
Figure 1. On top, the original signal; on the bottom, the clipped signal with a zero mean value.
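The preprocessing step of Figure 1 can be sketched as mean removal followed by amplitude limiting. The `clip_level` parameter below is a hypothetical choice; the paper's exact clipping rule is not reproduced here:

```python
import numpy as np

def preprocess(x, clip_level=0.95):
    """Sketch of the Figure 1 preprocessing (assumed form): subtract
    the DC offset so the signal has zero mean, then clip the amplitude
    at a fraction of the absolute maximum."""
    x = x - np.mean(x)                        # zero-mean signal
    limit = clip_level * np.max(np.abs(x))    # clipping threshold
    return np.clip(x, -limit, limit)          # amplitude clipping
```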
Figure 2. Dataset building block diagram.
Figure 3. Lattice filter prediction cascade diagram.
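The lattice prediction cascade of Figure 3 underlies Burg's method, which estimates the reflection coefficients of an autoregressive model from the forward and backward prediction errors. A compact sketch of the standard recursion, not necessarily the authors' exact implementation:

```python
import numpy as np

def burg_reflection(x, order):
    """Burg's method: estimate the reflection (partial correlation)
    coefficients k_1..k_order via the lattice forward/backward
    prediction-error recursion."""
    f = np.asarray(x, dtype=float).copy()   # forward prediction errors
    b = f.copy()                            # backward prediction errors
    k = np.zeros(order)
    for m in range(order):
        fp = f[m + 1:]                      # f_m[n],   n = m+1..N-1
        bp = b[m:-1]                        # b_m[n-1], n = m+1..N-1
        # Minimize the sum of forward and backward error powers:
        k[m] = -2.0 * np.dot(fp, bp) / (np.dot(fp, fp) + np.dot(bp, bp))
        f_new = fp + k[m] * bp              # lattice update, stage m+1
        b_new = bp + k[m] * fp
        f[m + 1:] = f_new
        b[m + 1:] = b_new
    return k
```

By construction the Burg estimates satisfy |k_m| ≤ 1, which guarantees a stable AR model and makes the coefficients well-behaved inputs for a classifier.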
Feature reduction process. MAV, mean average value; SSI, simple squared integration; WL, wavelength; WAMP, Willison amplitude; MYOP, myopulse percentage rate.
| l | r | Remaining Features |
|---|---|---|
| 29 | 1 | [MAV, SSI, VAR, RMS, WL, WAMP, SSC, ZC, MYOP, Arb] |
| 28 | 2 | [SSI, VAR, RMS, WL, WAMP, SSC, ZC, MYOP, Arb] |
| 27 | 3 | [VAR, RMS, WL, WAMP, SSC, ZC, MYOP, Arb] |
| ⋮ | ⋮ | ⋮ |
| 10 | 20 | [ZC, MYOP, Arb] |
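The reduction shown in the table, dropping a single feature at each step until few remain, follows the pattern of sequential backward elimination. A generic sketch, where `score` is an assumed callable returning classification accuracy for a candidate feature subset (not the paper's specific selection criterion):

```python
import numpy as np

def backward_eliminate(X, y, score, n_keep):
    """Sequential backward elimination: repeatedly drop the one
    feature whose removal hurts score(X_subset, y) the least,
    until n_keep features remain. Returns kept column indices."""
    kept = list(range(X.shape[1]))
    while len(kept) > n_keep:
        # Feature whose removal leaves the highest score:
        best = max(kept, key=lambda j: score(X[:, [c for c in kept if c != j]], y))
        kept.remove(best)
    return kept
```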
Classification results of the time domain (TD), Arb, and K datasets separately. P3, third order polynomial.
| N | Dataset | Bayes | IBk | MLP | Tree | SVM |
|---|---|---|---|---|---|---|
| 20 | TD | … | … | … | … | … |
| 20 | Arb | … | … | … | … | … |
| 20 | k | … | … | … | … | … |
Figure 4. Accuracy rate achieved by the different classifiers.
Figure 5. WAUC achieved by the different classifiers.
Figure 6. Sensitivity achieved by the different classifiers.
Figure 7. Specificity achieved by the different classifiers.
Classification performance of the combined datasets.
| N | Features | Bayes | IBk | MLP | Tree | SVM |
|---|---|---|---|---|---|---|
| 40 | … | … | … | … | … | … |
| 40 | … | … | … | … | … | … |
| 60 | … | … | … | … | … | … |
Classification performances using different feature vectors after feature selection: SE = [Arb(1,2,5,9,10), k(1,2,3,4,9), WL, SSC, ZC, MYOP, Arb(1,2,4,5,10), (1,5,7), WL, MYOP]; … = [Arb(1,2,7,8,10), K(1,2,10), ZCC, MYOP]; … = [Arb1, K1, ZCC, RMS]; … = […, …, ZCC, MAV]; … = […, …, MYOP, RMS]; … = […, …, MYOP, MAV].
| N | Features | Bayes | IBk | MLP | Tree | SVM |
|---|---|---|---|---|---|---|
| 26 | … | … | … | … | … | … |
| 26 | … | … | … | … | … | … |
| 20 | … | … | … | … | … | … |
| 8 | … | … | … | … | … | … |
| 8 | … | … | … | … | … | … |
| 8 | … | … | … | … | … | … |
| 8 | … | … | … | … | … | … |