Leticia Avellar, Carlos Stefano Filho, Gabriel Delgado, Anselmo Frizera, Eduardo Rocon, Arnaldo Leal-Junior.
Abstract
Smart textiles are novel solutions for remote healthcare monitoring that involve clothing with integrated non-invasive sensors. Polymer optical fiber (POF) sensors have attractive features for smart textile technology and, combined with Artificial Intelligence (AI) algorithms, increase the potential for intelligent decision-making. This paper presents the development of a fully portable photonic smart garment with 30 multiplexed POF sensors combined with AI algorithms to evaluate the system's ability to classify the activities of multiple subjects. Six daily activities are evaluated: standing, sitting, squatting, up-and-down arms, walking, and running. A k-nearest neighbors (kNN) classifier is employed, and results from 10 trials with all volunteers presented an accuracy of 94.00 (0.14)%. To find an optimal number of sensors, principal component analysis is applied to the data of one volunteer; results showed an accuracy of 98.14 (0.31)% using 10 sensors, only 1.82% lower than with 30 sensors. Cadence and breathing rate were estimated and compared with data from an inertial measurement unit (IMU) located on the back of the garment; the highest error was 2.22%. Shoulder flexion/extension was also evaluated. The proposed approach demonstrated feasibility for activity recognition and extraction of movement-related parameters, leading to a fully optimized system, including the number of sensors and wireless communication, for Healthcare 4.0.
Year: 2022 PMID: 35260746 PMCID: PMC8904460 DOI: 10.1038/s41598-022-08048-9
Source DB: PubMed Journal: Sci Rep ISSN: 2045-2322 Impact factor: 4.379
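The activity classification described in the abstract maps a vector of POF sensor readings to one of six activities with a k-nearest neighbors classifier. A minimal pure-Python sketch of the kNN step is shown below; the 3-sensor feature vectors, the two-activity training set, and k = 3 are illustrative assumptions, not the paper's actual data or hyperparameters.

```python
import math
from collections import Counter

def knn_classify(train_X, train_y, x, k=3):
    """Classify sample x by majority vote among its k nearest training
    samples (Euclidean distance). k = 3 is an assumption; the paper does
    not state the k used."""
    dists = sorted((math.dist(xi, x), yi) for xi, yi in zip(train_X, train_y))
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Hypothetical 3-sensor optical-power features for two of the activities.
train_X = [(0.9, 0.1, 0.2), (0.8, 0.2, 0.1),   # "standing"
           (0.1, 0.9, 0.8), (0.2, 0.8, 0.9)]   # "running"
train_y = ["standing", "standing", "running", "running"]

print(knn_classify(train_X, train_y, (0.85, 0.15, 0.15)))  # standing
```

In the paper's setting the feature vector would have 30 entries (one per multiplexed sensor) and six class labels rather than two.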
Figure 1: Photonic Smart Garment overview and the sensor response when a force is applied on top of sensor 10.
Figure 2: Response of sensors 27-30 when a predefined loading is applied to each sensor.
Figure 3: Clustering of six classes (activities) using the response of 3 sensors.
Classification results for each volunteer.
| Volunteer | 1 | 2 | 3 | 4 |
|---|---|---|---|---|
| Accuracy (%) | 99.96 (0.04) | 92.04 (0.45) | 91.34 (0.36) | 94.86 (0.25) |
| | 100 (0.00) | 98.31 (0.36) | 99.50 (0.18) | 99.91 (0.03) |
| | 100 (0.00) | 99.56 (0.19) | 100 (0.00) | 100 (0.00) |
| | 100 (0.00) | 89.69 (0.77) | 94.25 (0.62) | 89.77 (1.17) |
| | 100 (0.00) | 89.93 (1.40) | 99.98 (0.04) | 92.98 (0.75) |
| | 99.87 (0.00) | 93.94 (0.69) | 82.62 (1.98) | 94.50 (0.66) |
| | 99.87 (0.00) | 78.69 (1.73) | 64.91 (1.61) | 91.46 (1.22) |
| | 100 (0.00) | 96.02 (0.71) | 94.92 (0.60) | 98.34 (0.46) |
| | 100 (0.00) | 96.56 (0.66) | 100 (0.00) | 100 (0.00) |
| | 100 (0.00) | 92.08 (1.01) | 98.97 (0.20) | 91.56 (1.04) |
| | 100 (0.00) | 91.67 (0.63) | 100 (0.00) | 93.48 (0.84) |
| | 99.87 (0.00) | 82.50 (1.21) | 69.48 (1.15) | 91.77 (1.03) |
| | 99.87 (0.00) | 92.44 (0.72) | 79.53 (1.91) | 93.19 (0.44) |
Figure 4: Confusion matrix for 10 trials of the kNN classification using the dataset of all volunteers.
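The confusion matrix of Figure 4 tabulates predicted versus true activities. A minimal sketch of how such a matrix and the overall accuracy are computed from label lists, using toy predictions (not the paper's results):

```python
def confusion_matrix(y_true, y_pred, labels):
    """Rows: true class, columns: predicted class."""
    idx = {lab: i for i, lab in enumerate(labels)}
    m = [[0] * len(labels) for _ in labels]
    for t, p in zip(y_true, y_pred):
        m[idx[t]][idx[p]] += 1
    return m

activities = ["standing", "sitting", "squatting",
              "up-and-down arms", "walking", "running"]

# Toy predictions: one walking sample mistaken for running.
y_true = ["standing", "walking", "walking", "running"]
y_pred = ["standing", "walking", "running", "running"]

cm = confusion_matrix(y_true, y_pred, activities)
accuracy = sum(cm[i][i] for i in range(len(activities))) / len(y_true)
print(accuracy)  # 0.75
```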
Figure 5: Sensor activation for each activity. (a) Optical power variation between the standing and sitting activities of sensors 1, 30, 21 and 10. (b) Response of sensors 26 and 21 during the squatting activity and response of sensors 5 and 26 during the up-and-down arms activity. (c) Sensor activation during the walking and running activities.
Figure 6: Mechanical analysis of different sensors during body movement in the dynamic activities. (a) Squatting. (b) Up-and-down arms. (c) Walking. (d) Running.
Comparison of the classification results for volunteer 1 with the complete structure (30 sensors) and the optimized structure (10 sensors).
| Volunteer | Complete structure | Optimized structure |
|---|---|---|
| Accuracy (%) | 99.96 (0.04) | 99.05 (0.10) |
| | 100 (0.00) | 99.46 (0.21) |
| | 100 (0.00) | 100 (0.00) |
| | 100 (0.00) | 97.92 (0.51) |
| | 100 (0.00) | 98.43 (0.30) |
| | 99.87 (0.00) | 99.38 (0.19) |
| | 99.87 (0.00) | 99.08 (0.24) |
| | 100 (0.00) | 99.64 (0.22) |
| | 100 (0.00) | 100 (0.00) |
| | 100 (0.00) | 97.85 (0.26) |
| | 100 (0.00) | 98.32 (0.44) |
| | 99.87 (0.00) | 99.09 (0.24) |
| | 99.87 (0.00) | 99.37 (0.20) |
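The abstract states that principal component analysis was used to reduce the garment from 30 to 10 sensors with little accuracy loss. One plausible selection rule (an assumption for illustration; the paper's exact criterion may differ) is to rank sensor channels by the magnitude of their loadings on the leading principal component:

```python
import numpy as np

def rank_sensors_by_pca(X, n_components=1):
    """Rank sensor channels (columns of X, samples x sensors) by the
    magnitude of their loadings on the top principal component(s).
    The selection criterion is illustrative, not the paper's method."""
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)     # eigenvalues ascending
    top = eigvecs[:, -n_components:]           # leading component(s)
    scores = np.abs(top).sum(axis=1)
    return np.argsort(scores)[::-1]            # most informative first

rng = np.random.default_rng(0)
# Synthetic data: sensors 0 and 1 carry a shared activity signal,
# the remaining 4 channels are mostly noise.
t = rng.standard_normal(200)
X = rng.standard_normal((200, 6)) * 0.1
X[:, 0] += t
X[:, 1] += t

order = rank_sensors_by_pca(X)
print(sorted(order[:2].tolist()))  # [0, 1]
```

In the paper's setting, keeping the top 10 of 30 channels ranked this way would correspond to the "optimized structure" of the table above.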
Figure 7: Confusion matrices for volunteer 1 during activities. (a) Complete structure. (b) Optimized structure.
Figure 8: Results of the walking and running tests of volunteer 1 for cadence estimation: temporal response and FFT of the IMU data (yaw) and of the response of sensor 8.
Figure 9: FFT of the IMU data (pitch) and of the response of sensor 17 during the standing activity of volunteer 1 for breathing rate estimation.
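Figures 8 and 9 estimate cadence and breathing rate from the dominant FFT peak of a sensor's temporal response. A minimal sketch of this frequency-domain step, using a synthetic 1.2 Hz oscillation and an assumed 30 Hz sampling rate (both illustrative, not the paper's acquisition parameters):

```python
import numpy as np

def dominant_rate_per_min(signal, fs):
    """Dominant frequency of `signal` (sampled at fs Hz), converted to
    cycles per minute, taken from the FFT magnitude peak (DC excluded)."""
    spectrum = np.abs(np.fft.rfft(signal - np.mean(signal)))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    peak = np.argmax(spectrum[1:]) + 1   # skip the DC bin
    return freqs[peak] * 60.0

fs = 30.0                           # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)        # 10 s analysis window
walk = np.sin(2 * np.pi * 1.2 * t)  # 1.2 Hz step-like oscillation
print(dominant_rate_per_min(walk, fs))  # 72.0 steps/min
```

Applied to a gait-modulated sensor this yields cadence in steps/min; applied to a chest sensor during standing it yields breathing rate in cycles/min.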
Estimated parameters from IMU (reference) and Photonic Smart Garment sensors: errors between the measurements obtained from the two systems.
| Volunteer | IMU cadence, walking (steps/min) | IMU cadence, running (steps/min) | IMU BR, standing (cycles/min) | Garment cadence, walking (steps/min) | Garment cadence, running (steps/min) | Garment BR, standing (cycles/min) | Cadence error, walking (%) | Cadence error, running (%) | BR error, standing (%) |
|---|---|---|---|---|---|---|---|---|---|
| 1 | 71.96 | 143.94 | 13.19 | 71.96 | 143.94 | 13.20 | 0 | 0 | 0.08 |
| 2 | 79.14 | 155.92 | 13.79 | 77.38 | 156.54 | 13.9 | 2.22 | 0.40 | 0.80 |
| 3 | 68.36 | 125.94 | 13.79 | 68.60 | 126.64 | 14.04 | 0.35 | 0.56 | 1.81 |
| 4 | 73.16 | 146.34 | 14.39 | 72.12 | 145.98 | 14.6 | 1.42 | 0.27 | 1.46 |
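The error columns are the relative deviation of the garment estimate from the IMU reference. A one-line sketch reproducing the table's highest error (volunteer 2, walking cadence):

```python
def percent_error(reference, measured):
    """Relative error of `measured` with respect to `reference`, in %."""
    return abs(reference - measured) / reference * 100.0

# Volunteer 2, walking cadence (steps/min): IMU 79.14 vs. garment 77.38.
print(round(percent_error(79.14, 77.38), 2))  # 2.22
```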
Figure 10: Results of the up-and-down arms test of volunteer 1: temporal response of sensors 5 (right arm) and 26 (left arm) and identification of shoulder flexion and extension by outlier detection on the derivative of the temporal response.
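Figure 10 identifies flexion/extension events as outliers in the derivative of the sensor response. A minimal sketch using a median-absolute-deviation rule on the first difference (the paper's exact outlier criterion is not given here, and the signal below is hypothetical):

```python
def detect_events(signal, n_mads=5.0):
    """Flag indices where the first difference of `signal` deviates from
    its median by more than n_mads median absolute deviations."""
    diffs = [b - a for a, b in zip(signal, signal[1:])]
    med = sorted(diffs)[len(diffs) // 2]
    mad = sorted(abs(d - med) for d in diffs)[len(diffs) // 2]
    return [i + 1 for i, d in enumerate(diffs)
            if abs(d - med) > n_mads * mad]

# Slowly varying optical power with an abrupt jump at index 5,
# standing in for a hypothetical shoulder flexion event.
sig = [1.00, 1.02, 1.01, 1.03, 1.02, 2.00, 2.02, 2.01, 2.03, 2.02]
print(detect_events(sig))  # [5]
```

A robust (median-based) threshold is used because a single large derivative sample inflates the ordinary standard deviation and can mask the very event being detected.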
Figure 11: Sensor fabrication process. (a) Removal of part of the fiber material, creating a lateral section. (b) LED coupling to the fiber lateral section in a 3D-printed part. (c) Sensor encapsulation using a clear urethane rubber mixture. (d) Encapsulated sensor. (e) Sensor incorporated into the garment.