Mohammad Iman Mokhlespour Esfahani, Maury A. Nussbaum.
Abstract
Physical activities can have important impacts on human health. For example, a physically active lifestyle, which is one of the most important goals for overall health promotion, can diminish the risk of a range of physical disorders and reduce health-related expenditures. Thus, a long-term goal is to detect different physical activities, and an important initial step toward this goal is the ability to classify such activities. A recent and promising technology to discriminate among diverse physical activities is the smart textile system (STS), which is becoming increasingly accepted as a low-cost activity monitoring tool for health promotion. Accordingly, our primary aim was to assess the feasibility and accuracy of using a novel STS to classify physical activities. Eleven participants completed a lab-based experiment to evaluate the accuracy of an STS that featured a smart undershirt (SUS) and commercially available smart socks (SSs) in discriminating several basic postures (sitting, standing, and lying down), as well as diverse activities requiring participants to walk and run at different speeds. We trained three classification methods (K-nearest neighbor, linear discriminant analysis, and artificial neural network) using data from each smart garment separately and in combination. Overall classification performance (global accuracy) was ~98%, which suggests that the STS was effective for discriminating diverse physical activities. We conclude that, overall, smart garments represent a promising area of research and a potential alternative for discriminating a range of physical activities, which can have positive implications for health promotion.
Keywords: classification; human health; physical activities; smart garment; smart shirt; smart socks; smart textile system; wearable sensor
Year: 2019 PMID: 31315261 PMCID: PMC6679301 DOI: 10.3390/s19143133
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
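The abstract describes training three classifiers (K-NN with k = 10, LDA, and an ANN) on garment sensor data. A minimal sketch of that workflow, assuming a scikit-learn-style pipeline and synthetic stand-in features (the actual sensor features and preprocessing are not specified in this record):

```python
# Sketch only: synthetic Gaussian clusters stand in for garment sensor features.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Three hypothetical activity classes, each a cluster in a 6-D feature space.
X = np.vstack([rng.normal(loc=c, scale=0.3, size=(100, 6)) for c in (0.0, 2.0, 4.0)])
y = np.repeat([0, 1, 2], 100)
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

models = {
    "K-NN": KNeighborsClassifier(n_neighbors=10),  # k = 10, as in the paper
    "LDA": LinearDiscriminantAnalysis(),
    "ANN": MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    # "Global accuracy" = fraction of held-out windows classified correctly.
    print(f"{name}: {model.score(X_te, y_te):.2f}")
```

The hidden-layer size and train/test split here are illustrative choices, not the authors' configuration.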
Summary of participant characteristics (SD = standard deviation).
| Measure | Mean (SD) | Range |
|---|---|---|
| Age (years) | 21.3 (2.5) | 18–26 |
| Body mass (kg) | 76.2 (8.2) | 64.4–86 |
| Stature (cm) | 174.5 (7.4) | 163–186 |
| BMI (kg/m²) | 25.0 (2.6) | 22.4–29.4 |
Walking and running speeds (km/h) used in the experiment for each participant (P1–P11; SD = standard deviation).
| Activity | Speed | P1 | P2 | P3 | P4 | P5 | P6 | P7 | P8 | P9 | P10 | P11 | Mean (SD) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Walking | Slow | 2.1 | 1.9 | 2.0 | 2.1 | 2.1 | 2.1 | 2.0 | 2.2 | 1.9 | 2.2 | 2.3 | 2.08 (0.12) |
| Walking | Comfortable | 2.6 | 2.4 | 2.5 | 2.6 | 2.6 | 2.6 | 2.5 | 2.8 | 2.4 | 2.7 | 2.9 | 2.60 (0.15) |
| Walking | Fast | 3.1 | 2.9 | 3.0 | 3.1 | 3.1 | 3.1 | 3.0 | 3.4 | 2.9 | 3.2 | 3.5 | 3.12 (0.19) |
| Running | Comfortable | 5.6 | 4.6 | 5.7 | 5.5 | 5.5 | 5.8 | 6.0 | 4.2 | 5.4 | 6.3 | 5.1 | 5.43 (0.60) |
| Running | Fast | 6.7 | 5.5 | 6.8 | 6.6 | 6.6 | 7.0 | 7.2 | 5.0 | 6.5 | 7.6 | 6.1 | 6.51 (0.74) |
Figure 1. Illustrations of a participant wearing two smart garments (smart shirt and smart socks) while: (a) A5: sitting; (b) A3: standing; (c) A2: lying down; (d) A6: walking; and (e) A9: running.
Global accuracy (percentage) for each classification method at the individual and group levels for the smart socks (SSs), smart undershirt (SUS), and smart textile system (STS).
| Model | Input | P1 | P2 | P3 | P4 | P5 | P6 | P7 | P8 | P9 | P10 | P11 | Group Level |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| K-NN | SSs | 97 | 97 | 98 | 99 | 98 | 99 | 96 | 98 | 97 | 96 | 96 | 97 |
| K-NN | SUS | 95 | 97 | 97 | 98 | 98 | 97 | 96 | 95 | 98 | 96 | 91 | 96 |
| K-NN | STS | 97 | 99 | 99 | 99 | 99 | 99 | 98 | 98 | 99 | 98 | 96 | 98 |
| LDA | SSs | 69 | 89 | 90 | 91 | 83 | 92 | 72 | 81 | 90 | 75 | 83 | 15 |
| LDA | SUS | 87 | 91 | 89 | 92 | 96 | 94 | 88 | 84 | 96 | 90 | 82 | 42 |
| LDA | STS | 94 | 96 | 97 | 97 | 99 | 97 | 92 | 96 | 98 | 96 | 93 | 47 |
| ANN | SSs | 95 | 94 | 97 | 98 | 98 | 98 | 93 | 93 | 95 | 90 | 93 | 90 |
| ANN | SUS | 95 | 99 | 98 | 99 | 98 | 98 | 97 | 95 | 98 | 88 | 90 | 94 |
| ANN | STS | 97 | 99 | 99 | 99 | 99 | 99 | 98 | 99 | 99 | 98 | 98 | 98 |
K-NN: k-nearest neighbor (k = 10); LDA: linear discriminant analysis; ANN: artificial neural network. Columns P1–P11 report individual-level accuracy for each participant.
F-scores obtained for each activity using different classification models at the group level.
| Model | Input | A1 | A2 | A3 | A4 | A5 | A6 | A7 | A8 | A9 | A10 | A11 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| K-NN | SSs | 0.96 | 0.99 | 0.99 | 0.97 | 0.99 | 0.94 | 0.93 | 0.95 | 0.98 | 0.99 | 0.97 |
| K-NN | SUS | 0.92 | 0.99 | 0.99 | 0.90 | 0.99 | 0.93 | 0.93 | 0.95 | 0.96 | 0.96 | 0.97 |
| K-NN | STS | 0.96 | 0.99 | 0.99 | 0.95 | 0.99 | 0.95 | 0.95 | 0.97 | 0.98 | 0.98 | 0.98 |
| LDA | SSs | 0.00 | 0.20 | 0.21 | 0.00 | 0.13 | 0.17 | 0.09 | 0.08 | 0.19 | 0.13 | 0.11 |
| LDA | SUS | 0.26 | 0.78 | 0.52 | 0.02 | 0.52 | 0.23 | 0.26 | 0.38 | 0.39 | 0.50 | 0.23 |
| LDA | STS | 0.79 | 0.60 | 0.11 | 0.58 | 0.38 | 0.31 | 0.43 | 0.45 | 0.49 | 0.30 | 0.35 |
| ANN | SSs | 0.78 | 0.97 | 0.92 | 0.84 | 0.97 | 0.82 | 0.83 | 0.84 | 0.96 | 0.96 | 0.88 |
| ANN | SUS | 0.88 | 0.99 | 0.99 | 0.81 | 0.99 | 0.90 | 0.90 | 0.92 | 0.94 | 0.94 | 0.95 |
| ANN | STS | 0.97 | 0.99 | 0.99 | 0.97 | 0.99 | 0.96 | 0.96 | 0.97 | 0.99 | 0.99 | 0.98 |
A1–A11 are activities.
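The per-activity F-scores above combine precision and sensitivity (recall) as F = 2PR/(P + R), computed one-vs-rest per activity. A short sketch of that calculation from a confusion matrix (the 3×3 matrix below is hypothetical, for illustration only):

```python
import numpy as np

def f_scores(cm):
    """Per-class F1 from a square confusion matrix (rows = true, cols = predicted)."""
    cm = np.asarray(cm, dtype=float)
    tp = np.diag(cm)
    precision = tp / cm.sum(axis=0)  # TP / (TP + FP)
    recall = tp / cm.sum(axis=1)     # TP / (TP + FN), i.e. sensitivity
    return 2 * precision * recall / (precision + recall)

# Hypothetical 3-activity confusion matrix
cm = [[48, 2, 0],
      [3, 45, 2],
      [0, 1, 49]]
print(np.round(f_scores(cm), 3))
```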
Classification performance for each activity (A1–A11) using K-NN models at the group level.
| Activity | Sensitivity (SSs) | Sensitivity (SUS) | Sensitivity (STS) | Specificity (SSs) | Specificity (SUS) | Specificity (STS) | Precision (SSs) | Precision (SUS) | Precision (STS) | Accuracy (SSs) | Accuracy (SUS) | Accuracy (STS) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| A1 | 0.96 | 0.87 | 0.93 | 0.99 | 0.99 | 0.99 | 0.95 | 0.97 | 0.99 | 0.99 | 0.99 | 0.99 |
| A2 | 0.99 | 0.99 | 0.99 | 0.99 | 0.99 | 0.99 | 0.99 | 0.99 | 0.99 | 0.99 | 0.99 | 0.99 |
| A3 | 0.99 | 0.99 | 0.99 | 0.99 | 0.99 | 0.99 | 0.99 | 0.99 | 0.99 | 0.99 | 0.99 | 0.99 |
| A4 | 0.98 | 0.86 | 0.95 | 0.99 | 0.99 | 0.99 | 0.96 | 0.94 | 0.96 | 0.99 | 0.99 | 0.99 |
| A5 | 0.99 | 0.99 | 0.99 | 0.99 | 0.99 | 0.99 | 0.99 | 0.98 | 0.99 | 0.99 | 0.99 | 0.99 |
| A6 | 0.93 | 0.92 | 0.94 | 0.99 | 0.99 | 0.99 | 0.96 | 0.93 | 0.95 | 0.98 | 0.98 | 0.99 |
| A7 | 0.94 | 0.95 | 0.96 | 0.99 | 0.98 | 0.99 | 0.92 | 0.91 | 0.94 | 0.98 | 0.98 | 0.99 |
| A8 | 0.94 | 0.95 | 0.97 | 0.99 | 0.99 | 0.99 | 0.95 | 0.94 | 0.96 | 0.99 | 0.99 | 0.99 |
| A9 | 0.99 | 0.96 | 0.98 | 0.99 | 0.99 | 0.99 | 0.98 | 0.97 | 0.98 | 0.99 | 0.99 | 0.99 |
| A10 | 0.99 | 0.96 | 0.99 | 0.99 | 0.99 | 0.99 | 0.99 | 0.95 | 0.98 | 0.99 | 0.99 | 0.99 |
| A11 | 0.96 | 0.98 | 0.99 | 0.99 | 0.99 | 0.99 | 0.98 | 0.96 | 0.98 | 0.99 | 0.99 | 0.99 |
| Mean | 0.97 | 0.95 | 0.97 | 0.99 | 0.99 | 0.99 | 0.97 | 0.96 | 0.98 | 0.99 | 0.99 | 0.99 |
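The four metrics above are all derived one-vs-rest from the confusion matrix: for each activity, true positives (TP), false positives (FP), false negatives (FN), and true negatives (TN) are counted by treating that activity as the positive class. A sketch of that derivation (the 2×2 example matrix is hypothetical):

```python
import numpy as np

def ovr_metrics(cm):
    """One-vs-rest sensitivity, specificity, precision, and accuracy per class
    from a square confusion matrix (rows = true, cols = predicted)."""
    cm = np.asarray(cm, dtype=float)
    total = cm.sum()
    tp = np.diag(cm)
    fp = cm.sum(axis=0) - tp  # predicted as this class but actually another
    fn = cm.sum(axis=1) - tp  # this class misclassified as another
    tn = total - tp - fp - fn
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "precision": tp / (tp + fp),
        "accuracy": (tp + tn) / total,
    }

m = ovr_metrics([[8, 2],
                 [1, 9]])
for k, v in m.items():
    print(k, np.round(v, 3))
```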
Figure 2. Confusion matrix when using the K-NN method at the group level (i.e., all participants), using input from both SSs and SUS (i.e., the complete STS). Input and output classes correspond to the 11 simulated physical activities. Cells on the main diagonal (green) and off-diagonal (red) indicate the numbers of correctly and incorrectly classified observations of each activity, respectively. Cells in the right-hand column provide the percentages of precision (green font) and false discovery rate (red font) for each activity. Cells in the lowest row provide the percentages of sensitivity (green font) and false negative rate (red font) for each activity. The cell at the bottom-right corner (gray) provides global accuracy.
Figure 3. Results of applying the Bayes accuracy method (left) to identify effective subsets of sensors from the total of 11 sensors in the SUS (sensors are labeled A–K in the figure on the right).
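Figure 3 concerns selecting a smaller, effective subset of the shirt's sensors. The paper's Bayes accuracy method is not detailed in this record; as a rough illustration of the same idea, the sketch below performs greedy forward selection by cross-validated K-NN accuracy on simulated data, where sensors "A"–"C" are informative and "D"–"F" are pure noise (all names and data here are hypothetical, not the paper's sensors):

```python
# Greedy forward sensor selection: a simplified stand-in for subset evaluation.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(1)
n = 300
y = rng.integers(0, 3, size=n)          # three simulated activity classes
X = rng.normal(size=(n, 6))             # six simulated sensor channels
X[:, :3] += y[:, None] * 1.5            # channels 0-2 ("A"-"C") carry class signal
sensors = list("ABCDEF")

selected, remaining = [], list(range(6))
while remaining:
    # Score each candidate sensor added to the current subset.
    scores = {j: cross_val_score(KNeighborsClassifier(10),
                                 X[:, selected + [j]], y, cv=5).mean()
              for j in remaining}
    best = max(scores, key=scores.get)
    base = (cross_val_score(KNeighborsClassifier(10),
                            X[:, selected], y, cv=5).mean()
            if selected else 0.0)
    if scores[best] <= base:            # stop when accuracy no longer improves
        break
    selected.append(best)
    remaining.remove(best)

print("selected sensors:", [sensors[j] for j in selected])
```

Informative channels should be picked before the noise channels, mirroring how Figure 3 narrows 11 sensors down to an effective subset.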