Chia-Yeh Hsieh1, Hsiang-Yun Huang1, Kai-Chun Liu2, Chien-Pin Liu1, Chia-Tai Chan1, Steen Jun-Ping Hsu3.
Abstract
Fall-related information can help clinical professionals make diagnoses and plan fall-prevention strategies. This information includes characteristics of the different fall phases, such as falling time and landing responses. To provide information on these phases, this pilot study proposes an automatic multiphase identification algorithm for phase-aware fall recording systems. Seven young adults are recruited to perform the fall experiment. One inertial sensor is worn on the waist to collect body-movement data, and a total of 525 trials are collected. The proposed multiphase identification algorithm combines machine learning techniques with a fragment modification algorithm to identify the pre-fall, free-fall, impact, resting and recovery phases of a fall process. Five machine learning techniques, including support vector machine (SVM), k-nearest neighbor (kNN), naïve Bayesian (NB), decision tree (DT) and adaptive boosting (AdaBoost), are applied to identify the five phases. The fragment modification algorithm uses rules to detect fragments whose classification results differ from those of their neighboring segments. The proposed multiphase identification algorithm using the kNN technique achieves the best performance, with 82.17% sensitivity, 85.74% precision, 73.51% Jaccard coefficient, and 90.28% accuracy. These results show that the proposed algorithm has the potential to provide automatic, fine-grained fall information for clinical measurement and assessment.
Keywords: fall recording system; multiphase identification; wearable inertial sensor
Year: 2021 PMID: 34068804 PMCID: PMC8126206 DOI: 10.3390/s21093302
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Figure 1. Diagram of an acceleration-based signal of a fall process.
Figure 2. Functional diagram of the proposed multiphase identification algorithm.
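As a rough illustration of the pipeline in Figure 2, the sketch below extracts simple sliding-window statistics from tri-axial acceleration and classifies each window with a minimal kNN. The feature set, window size, and k shown here are hypothetical stand-ins, not the authors' exact configuration.

```python
# Hypothetical sketch of the described pipeline: sliding-window features
# from a waist-worn inertial sensor, then a per-window kNN classifier.
# Feature choices and parameters (window size, k) are illustrative only.
import numpy as np
from collections import Counter

PHASES = ["initial-static", "pre-fall", "free-fall",
          "impact", "resting", "recovery", "ending-static"]

def window_features(acc, win=16):
    """Split an (n, 3) acceleration stream into non-overlapping windows
    and compute simple statistics: mean/std/min/max per axis plus the
    same statistics of the signal magnitude (16 features per window)."""
    feats = []
    for start in range(0, len(acc) - win + 1, win):
        w = acc[start:start + win]
        mag = np.linalg.norm(w, axis=1)
        feats.append(np.concatenate([
            w.mean(axis=0), w.std(axis=0), w.min(axis=0), w.max(axis=0),
            [mag.mean(), mag.std(), mag.min(), mag.max()],
        ]))
    return np.asarray(feats)

def knn_predict(train_X, train_y, X, k=3):
    """Minimal kNN: majority label among the k nearest training windows."""
    preds = []
    for x in X:
        d = np.linalg.norm(train_X - x, axis=1)
        nearest = np.argsort(d)[:k]
        preds.append(Counter(train_y[i] for i in nearest).most_common(1)[0][0])
    return preds
```

In the paper's setting, each predicted window label would then pass through the fragment modification step before evaluation.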
Types and directions of falls.
| No. | Type | Direction | Trial |
|---|---|---|---|
| 1 | Fall while standing | Forward, backward, right lateral, and left lateral | 84 |
| 2 | Fall while standing up | Forward, backward, right lateral, and left lateral | 84 |
| 3 | Fall while sitting down | Forward, backward, right lateral, and left lateral | 84 |
| 4 | Fall while stooping down | Forward, backward, right lateral, and left lateral | 84 |
| 5 | Fall while walking | Forward, backward, right lateral, and left lateral | 84 |
| 6 | Fall while jumping | Forward, backward, right lateral, and left lateral | 84 |
| 7 | Fall while walking backward | Backward | 21 |
Elapsed time of each fall type and direction (notation: mean ± standard deviation).
| Type | Forward (s) | Backward (s) | Right Lateral (s) | Left Lateral (s) |
|---|---|---|---|---|
| Fall while standing | 14.89 ± 2.03 | 14.91 ± 2.53 | 15.58 ± 2.32 | 15.35 ± 1.95 |
| Fall while standing up | 16.66 ± 3.01 | 17.00 ± 2.58 | 17.07 ± 1.83 | 17.33 ± 1.67 |
| Fall while sitting down | 16.44 ± 1.88 | 15.91 ± 1.85 | 15.90 ± 1.85 | 15.69 ± 1.72 |
| Fall while stooping down | 17.68 ± 1.35 | 20.56 ± 2.73 | 18.61 ± 2.13 | 19.07 ± 1.82 |
| Fall while walking | 15.52 ± 1.97 | 16.46 ± 1.93 | 15.65 ± 1.73 | 15.81 ± 1.55 |
| Fall while jumping | 19.04 ± 2.23 | 19.64 ± 2.62 | 19.11 ± 2.41 | 19.02 ± 2.36 |
| Fall while walking backward | -- | 19.17 ± 1.80 | -- | -- |
Figure 3. Diagram of the experimental environment setting.
Figure 4. Sensor orientation, wearing position of the sensor, and the protectors worn by the subject in the experiment. (a) Sensor orientation; (b) the sensor worn on the waist (lower back); (c,d) front and back views, respectively, of the subject wearing protectors.
The feature set for the multiphase classifier.
| Feature Set | Feature Description |
|---|---|
| f1 ~ f8 | -- |
| f9 ~ f16 | -- |
| f17 ~ f24 | -- |
| f25 ~ f32 | -- |
| f33 ~ f40 | -- |
| f41 ~ f48 | -- |
| f49 ~ f52 | -- |
| f53 ~ f64 | -- |
Figure 5. Diagram of the proposed fragment modification algorithm. An example of modifying one (situation 1), two (situation 2), or three (situation 3) segments that differ from the previous and following segments. These misclassified segments are relabeled to match the previous or following segments.
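The fragment-modification idea in Figure 5 can be sketched as a run-length pass over the window labels. The run-length threshold of three segments and the requirement that both neighbors carry the same label are assumptions made for this illustration.

```python
# Minimal sketch of fragment modification: a short run of 1-3 window
# labels whose neighbors on both sides agree is treated as misclassified
# and relabeled to match those neighbors. The max-run threshold and the
# neighbors-must-agree rule are illustrative assumptions.
def modify_fragments(labels, max_run=3):
    # Collapse the label sequence into [label, start, length] runs.
    runs = []
    for i, lab in enumerate(labels):
        if runs and runs[-1][0] == lab:
            runs[-1][2] += 1
        else:
            runs.append([lab, i, 1])
    out = list(labels)
    for j in range(1, len(runs) - 1):
        lab, start, length = runs[j]
        prev_lab, next_lab = runs[j - 1][0], runs[j + 1][0]
        # A short fragment sandwiched between two runs of the same label
        # is relabeled to agree with its neighbors.
        if length <= max_run and prev_lab == next_lab:
            for i in range(start, start + length):
                out[i] = prev_lab
    return out
```

For example, a lone "impact" window inside a run of "resting" windows would be relabeled "resting", while a run longer than three windows is left untouched.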
Figure 6. An example of the multiphase identification process. The fragment-modified results are compared against the ground truth in terms of TN, TP, FP and FN.
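The window-level comparison in Figure 6 yields per-phase TP/FP/FN/TN counts. Assuming the standard definitions (with the Jaccard coefficient as TP / (TP + FP + FN), the usual overlap measure for label segments), the four reported measures can be computed as:

```python
# The four evaluation measures reported in the tables, computed from
# window-level counts. Standard definitions are assumed.
def sensitivity(tp, fp, fn, tn):
    return tp / (tp + fn)

def precision(tp, fp, fn, tn):
    return tp / (tp + fp)

def jaccard(tp, fp, fn, tn):
    return tp / (tp + fp + fn)

def accuracy(tp, fp, fn, tn):
    return (tp + tn) / (tp + fp + fn + tn)
```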
The performance results of the multiphase identification algorithm using machine learning techniques versus window sizes (unit:%).
| Machine Learning Technique | Evaluation Measure | 8 Samples | 16 Samples | 24 Samples | 32 Samples | 40 Samples | Overall |
|---|---|---|---|---|---|---|---|
| AdaBoost | Sensitivity | 73.25 | 75.07 | 77.31 | 78.06 | 77.87 | 76.31 |
| | Precision | 83.82 | 85.81 | 86.55 | 85.31 | 83.04 | 84.91 |
| | Jaccard coefficient | 65.23 | 67.35 | 69.73 | 70.26 | 69.83 | 68.48 |
| | Accuracy | 87.85 | 88.57 | 89.30 | 89.54 | 89.77 | 89.00 |
| SVM | Sensitivity | 72.47 | 74.44 | 75.70 | 77.62 | 78.22 | 75.69 |
| | Precision | 78.40 | 76.61 | 77.92 | 79.07 | 79.30 | 78.26 |
| | Jaccard coefficient | 61.81 | 61.44 | 63.05 | 64.91 | 65.37 | 63.32 |
| | Accuracy | 84.45 | 83.83 | 84.80 | 85.86 | 86.22 | 85.03 |
| kNN | Sensitivity | 81.17 | 82.46 | -- | 82.65 | 81.49 | 82.17 |
| | Precision | 86.02 | -- | 86.53 | 85.56 | 83.91 | 85.74 |
| | Jaccard coefficient | 72.33 | 73.83 | -- | 74.05 | 72.86 | 73.51 |
| | Accuracy | 89.32 | 90.07 | 90.56 | -- | 90.69 | 90.28 |
| DT | Sensitivity | 81.84 | 82.42 | 82.84 | 81.61 | 80.92 | 81.93 |
| | Precision | 83.57 | 83.33 | 83.51 | 82.19 | 81.28 | 82.78 |
| | Jaccard coefficient | 72.46 | 72.51 | 73.03 | 71.64 | 70.93 | 72.12 |
| | Accuracy | 89.75 | 89.56 | 89.86 | 89.36 | 89.26 | 89.56 |
| NB | Sensitivity | 65.80 | 67.44 | 68.07 | 66.92 | 64.76 | 66.60 |
| | Precision | 72.77 | 73.01 | 73.41 | 73.26 | 71.31 | 72.75 |
| | Jaccard coefficient | 53.55 | 54.93 | 55.64 | 55.56 | 54.36 | 54.81 |
| | Accuracy | 80.76 | 81.24 | 81.50 | 81.72 | 81.67 | 81.38 |
| Overall | Sensitivity | 74.91 | 76.37 | -- | 77.37 | 76.65 | -- |
| | Precision | 80.92 | 81.09 | -- | 81.08 | 79.77 | -- |
| | Jaccard coefficient | 65.08 | 66.01 | 67.19 | -- | 66.67 | -- |
| | Accuracy | 86.42 | 86.65 | 87.20 | 87.45 | -- | -- |

(kNN overall values are taken from the abstract; cells marked "--" were not recoverable from the source.)
Figure 7. The average performance using different machine learning techniques.
Figure 8. The average performance using different window sizes.
The performance results of each phase using the kNN technique with different window sizes (unit:%).
Using a kNN technique with a window size of 8 samples (0.0625 s):

| | Initial-static | Pre-fall | Free-fall | Impact | Resting | Recovery | Ending-static | Overall |
|---|---|---|---|---|---|---|---|---|
| Sensitivity | 99.70 | 62.96 | 51.01 | 79.64 | 96.33 | 80.12 | 98.46 | 81.17 |
| Precision | 90.26 | 93.18 | 67.03 | 87.00 | 84.12 | 88.19 | 92.39 | 86.02 |
| Jaccard coefficient | 90.03 | 60.17 | 40.20 | 71.07 | 81.50 | 72.31 | 91.05 | 72.33 |
| Accuracy | -- | -- | -- | -- | -- | -- | -- | 89.32 |

Using a kNN technique with a window size of 16 samples (0.125 s):

| | Initial-static | Pre-fall | Free-fall | Impact | Resting | Recovery | Ending-static | Overall |
|---|---|---|---|---|---|---|---|---|
| Sensitivity | 99.70 | 64.34 | 54.42 | 81.21 | 96.74 | 82.38 | 98.45 | 82.46 |
| Precision | 91.38 | 94.75 | 67.63 | 86.49 | 86.07 | 87.68 | 92.88 | -- |
| Jaccard coefficient | 91.14 | 62.11 | 42.73 | 71.94 | 83.62 | 73.79 | 91.51 | 73.83 |
| Accuracy | -- | -- | -- | -- | -- | -- | -- | 90.07 |

Using a kNN technique with a window size of 24 samples (0.1875 s):

| | Initial-static | Pre-fall | Free-fall | Impact | Resting | Recovery | Ending-static | Overall |
|---|---|---|---|---|---|---|---|---|
| Sensitivity | 99.54 | 66.02 | 54.89 | 82.07 | 96.98 | 83.80 | 98.33 | -- |
| Precision | 92.50 | 95.73 | 64.50 | 85.12 | 87.31 | 87.52 | 93.03 | 86.53 |
| Jaccard coefficient | 92.12 | 64.10 | 42.08 | 71.68 | 84.97 | 74.81 | 91.55 | -- |
| Accuracy | -- | -- | -- | -- | -- | -- | -- | 90.56 |

Using a kNN technique with a window size of 32 samples (0.25 s):

| | Initial-static | Pre-fall | Free-fall | Impact | Resting | Recovery | Ending-static | Overall |
|---|---|---|---|---|---|---|---|---|
| Sensitivity | 99.29 | 67.71 | 49.88 | 81.87 | 96.86 | 84.70 | 98.24 | 82.65 |
| Precision | 93.44 | 96.18 | 57.93 | 82.93 | 88.07 | 87.49 | 92.89 | 85.56 |
| Jaccard coefficient | 92.83 | 65.91 | 37.11 | 70.01 | 85.60 | 75.52 | 91.35 | 74.05 |
| Accuracy | -- | -- | -- | -- | -- | -- | -- | -- |

Using a kNN technique with a window size of 40 samples (0.3125 s):

| | Initial-static | Pre-fall | Free-fall | Impact | Resting | Recovery | Ending-static | Overall |
|---|---|---|---|---|---|---|---|---|
| Sensitivity | 99.05 | 68.82 | 41.97 | 80.79 | 96.34 | 85.22 | 98.28 | 81.49 |
| Precision | 94.24 | 96.01 | 48.38 | 80.30 | 88.51 | 87.25 | 92.66 | 83.91 |
| Jaccard coefficient | 93.40 | 66.91 | 29.82 | 67.38 | 85.61 | 75.75 | 91.15 | 72.86 |
| Accuracy | -- | -- | -- | -- | -- | -- | -- | 90.69 |

(Cells marked "--" were not recoverable from the source.)
Figure 9. The average performance of the proposed algorithm using the kNN technique with each window size.
A summary of previous studies on sensor types, techniques, and provided fall-related information.
| Article (Year) [Reference] | Sensor Type | Technique (Method) | Provided Fall-Related Information |
|---|---|---|---|
| Becker et al. (2012) [ | Inertial sensor | Manual labeling | Starting and ending points of fall phases |
| Robinovitch et al. (2013) [ | Camera | Manual labeling | Causes of falling; Activities before the fall event. |
| Hsieh et al. (2018) [ | Inertial sensor | Machine learning (SVM) | Fall directions (97.34% accuracy) |
| Hussain et al. (2019) [ | Inertial sensor | Machine learning (kNN, SVM and random forest) | Fall types (96.82% accuracy using random forest classifier) |
| Clemente et al. (2019) [ | Seismic sensor | Machine learning (SVM) | Fall positions (localization error is smaller than 0.28 m) |
| This study | Inertial sensor | Machine learning (SVM, kNN, NB, DT and AdaBoost) | Starting and ending points of fall phases; Duration of fall phases. |