| Literature DB >> 33202738 |
Luca Palmerini1,2, Jochen Klenk3,4,5, Clemens Becker3, Lorenzo Chiari1,2.
Abstract
Falling is a significant health problem. Fall detection, which can trigger an alert for medical attention, has therefore been gaining attention. Still, most existing studies test performance on falls simulated in a laboratory environment. We analyzed the acceleration signals recorded by an inertial sensor on the lower back during 143 real-world falls (the most extensive collection to date) from the FARSEEING repository. These data were obtained from continuous real-world monitoring of subjects with a moderate-to-high risk of falling. We designed and tested fall detection algorithms using features inspired by a multiphase fall model and a machine learning approach. The results suggest that algorithms can learn effectively from features extracted from a multiphase fall model, consistently outperforming more conventional features. The most promising method (support vector machines with features from the multiphase fall model) obtained a sensitivity higher than 80%, a false alarm rate of 0.56 per hour, and an F-measure of 64.6%. The reported results and methodologies advance knowledge on real-world fall detection and suggest useful metrics for characterizing fall detection systems for real-world use.
Keywords: accelerometer; fall detection; machine learning; smartphone; wearable
Year: 2020 PMID: 33202738 PMCID: PMC7697900 DOI: 10.3390/s20226479
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Figure 1. The windowing procedure applied to a representative acceleration norm signal. The dashed line marks the 1.4 g threshold. Three consecutive windows are shown and analyzed. The first window is not selected, since no peak is detected in the peak-search interval. Windows #2 and #3, by contrast, are possible candidate fall windows (CFWs) and are passed to the next steps for feature extraction and classification.
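The windowing step in Figure 1 can be sketched as follows. Only the 1.4 g threshold comes from the caption; the window length, peak-search interval, and use of non-overlapping windows are assumptions made for illustration.

```python
import math

def candidate_fall_windows(acc, fs, threshold_g=1.4, win_s=3.0, search_s=1.0):
    """Slide windows over a triaxial acceleration signal (in g) and keep those
    whose norm exceeds the threshold inside the peak-search interval.

    acc: list of (ax, ay, az) samples; fs: sampling rate in Hz.
    A window with no peak in its first `search_s` seconds is discarded;
    the others are returned as candidate fall windows (CFWs).
    """
    norm = [math.sqrt(x * x + y * y + z * z) for x, y, z in acc]
    win, search = int(win_s * fs), int(search_s * fs)
    candidates = []
    for start in range(0, len(norm) - win + 1, win):
        segment = norm[start:start + win]
        if max(segment[:search]) > threshold_g:
            # Peak found in the search interval: keep as a CFW.
            candidates.append((start, segment))
    return candidates
```

Each returned CFW would then be passed on for feature extraction and classification, as in the figure.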
Extracted features.
| Feature | Description | Phase |
|---|---|---|
| Lower peak value (LPV) | Minimum value of Normacc before the identified peak. | Pre-peak |
| Upper peak value (UPV) | Value of the identified peak of Normacc. It is, by construction, the value of the signal 1 s after the start of the window. | Peak sample |
| Wavelet-based coefficient | A mother wavelet was defined as the average of the fall signals in the training set. The similarity of the signal to this mother wavelet was then computed following the procedure described in [ | Impact (pre-peak + post-peak) |
| Periodicity after impact | This feature is based on the autocorrelation of Normacc. The rationale is that in the interval right after a fall, there cannot be a periodic movement, such as walking or running. It is computed in a segment of 2 s, starting 0.5 s after the peak sample. | Post-peak and post-impact |
| Standard deviation after impact | Standard deviation (SD) of Normacc during the post-impact phase. The rationale is that in the interval following the impact, the subject might produce limited movements compared to normal ADLs. | Post-impact |
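A minimal sketch of the feature computations listed above (the wavelet-based coefficient is omitted). Segment boundaries not stated in the table, such as the exact post-impact segment used for the standard deviation, are assumptions.

```python
def extract_multiphase_features(norm, fs, peak_idx):
    """Compute multiphase features on a candidate fall window.

    norm: acceleration norm samples in g; fs: sampling rate in Hz;
    peak_idx: index of the identified peak within the window.
    """
    lpv = min(norm[:peak_idx])   # lower peak value (pre-peak minimum)
    upv = norm[peak_idx]         # upper peak value (peak sample)
    # SD after impact: here taken over all samples after the peak (assumption).
    tail = norm[peak_idx + 1:]
    mean = sum(tail) / len(tail)
    sd = (sum((v - mean) ** 2 for v in tail) / len(tail)) ** 0.5
    # Periodicity after impact: maximum normalized autocorrelation (lag >= 1)
    # over a 2 s segment starting 0.5 s after the peak, as in the table.
    seg = norm[peak_idx + int(0.5 * fs): peak_idx + int(2.5 * fs)]
    m = sum(seg) / len(seg)
    c = [v - m for v in seg]
    denom = sum(v * v for v in c) or 1.0
    periodicity = max(
        sum(c[i] * c[i + lag] for i in range(len(c) - lag)) / denom
        for lag in range(1, len(c) // 2)
    )
    return {"LPV": lpv, "UPV": upv, "SD_post": sd, "periodicity": periodicity}
```

A walking-like post-peak segment yields a high periodicity value, while a motionless post-impact phase yields a low one, which is the rationale given in the table.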
Figure 2. The stratification procedure for the five-fold cross-validation. The suffix “ext” for the folds F reflects the fact that this cross-validation for performance evaluation was, in the case of the support vector machine (SVM) classifier, external with respect to the internal cross-validation used for model parameter selection (see Figure 3).
Figure 3. The training and testing procedure for classifiers.
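The stratified fold assignment behind Figure 2 can be sketched as follows: falls and non-fall windows are distributed as evenly as possible across the five folds. This is a generic stratification sketch, not the authors' exact splitting code.

```python
import random
from collections import defaultdict

def stratified_kfold(labels, k=5, seed=0):
    """Assign each sample to one of k folds so that every class
    (fall / non-fall) is spread evenly across folds.
    Returns a fold index (0..k-1) per sample."""
    rng = random.Random(seed)
    by_class = defaultdict(list)
    for i, y in enumerate(labels):
        by_class[y].append(i)
    fold_of = [0] * len(labels)
    for idx_list in by_class.values():
        rng.shuffle(idx_list)               # randomize within each class
        for pos, i in enumerate(idx_list):
            fold_of[i] = pos % k            # deal out round-robin
    return fold_of
```

For the SVM, each such external fold would be held out for evaluation while an internal cross-validation on the remaining folds selects the model parameters.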
Classification results.
| Classifier | Features | AUC | Sensitivity (Recall) [%] | Specificity [%] | False Alarm Rate [FA/h] | Positive Predictive Value (Precision) [%] | F-Measure [%] |
|---|---|---|---|---|---|---|---|
| Naïve Bayes | MultiPhase | 0.996 | 88.1 | 99.1 | 1.09 | 39 | 54.1 |
| Logistic Regression | MultiPhase | 0.996 | 83.2 | 99.3 | 0.76 | 46.6 | 59.8 |
| KNN | MultiPhase | 0.958 | 83.9 | 99.2 | 0.92 | 42.1 | 56.1 |
| Support Vector Machines | MultiPhase | 0.993 | 81.1 | 99.5 | 0.56 | 53.7 | 64.6 |
| Random Forests | MultiPhase | 0.989 | 83.2 | 98.9 | 1.32 | 33.3 | 47.6 |
| Naïve Bayes | Conventional | 0.977 | 95.1 | 95.5 | 5.23 | 12.6 | 22.3 |
| Logistic Regression | Conventional | 0.987 | 84.6 | 98.5 | 1.7 | 28.3 | 42.5 |
| KNN | Conventional | 0.959 | 85.3 | 98.8 | 1.42 | 32.4 | 46.9 |
| Support Vector Machines | Conventional | 0.986 | 83.9 | 98.6 | 1.61 | 29.3 | 43.4 |
| Random Forests | Conventional | 0.985 | 88.8 | 98.6 | 1.57 | 31 | 45.9 |
| Threshold-based | Kangas et al. | / | 30.1 | 99.3 | 0.82 | 22.9 | 26.3 |
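The F-measure column is the harmonic mean of precision and recall, which can be checked directly against the table rows:

```python
def f_measure(precision_pct, recall_pct):
    """F-measure (F1): harmonic mean of precision and recall, in percent."""
    return 2 * precision_pct * recall_pct / (precision_pct + recall_pct)
```

For the SVM with multiphase features, `f_measure(53.7, 81.1)` gives 64.6%, matching the table; the same holds for the other rows.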
Additional combinations of sensitivity and false alarm rates.
| SVM with Multiphase Features | | | | | |
|---|---|---|---|---|---|
| Sensitivity [%] | 53.8 | 64.3 | 81.1 | 93.7 | 95.1 |
| FA rate [FA/h] | 0.06 | 0.11 | 0.56 | 2.78 | 5.56 |
Five thresholds are chosen so that false alarm (FA) rates are, from the left: one-tenth and one-fifth of the one reported in Table 2, the same as the reported one, and five and ten times the reported one.
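Each column above is one operating point obtained by moving the decision threshold on the classifier score. A sketch of how such a point is computed; the `monitored_hours` parameter (total duration of the recordings) is an assumption for illustration.

```python
def operating_point(scores, labels, thr, monitored_hours):
    """Sensitivity [%] and false alarms per monitored hour at threshold thr.

    scores: classifier scores per candidate window; labels: 1 = fall, 0 = non-fall.
    """
    tp = sum(1 for s, y in zip(scores, labels) if y == 1 and s >= thr)
    fn = sum(1 for s, y in zip(scores, labels) if y == 1 and s < thr)
    fp = sum(1 for s, y in zip(scores, labels) if y == 0 and s >= thr)
    sensitivity = 100.0 * tp / (tp + fn) if tp + fn else 0.0
    fa_rate = fp / monitored_hours  # false alarms per hour of monitoring
    return sensitivity, fa_rate
```

Lowering the threshold raises sensitivity at the cost of more false alarms per hour, which is the trade-off the five columns illustrate.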
Computational time for a single window; values are mean (SD).
| Classifier | Features | Data Acquisition | Feature Extraction | Classification |
|---|---|---|---|---|
| Naïve Bayes | MultiPhase | 0.145 (0.07) | 0.104 (0.05) | 0.695 (0.15) |
| Logistic Regression | MultiPhase | 0.145 (0.07) | 0.104 (0.05) | 0.03 (0.01) |
| KNN | MultiPhase | 0.145 (0.07) | 0.104 (0.05) | 1.265 (0.23) |
| Support Vector Machines | MultiPhase | 0.145 (0.07) | 0.104 (0.05) | 0.452 (0.14) |
| Random Forests | MultiPhase | 0.145 (0.07) | 0.104 (0.05) | 17.63 (5.1) |
| Naïve Bayes | Conventional | 0.007 (0.01) | 0.049 (0.05) | 0.686 (0.44) |
| Logistic Regression | Conventional | 0.007 (0.01) | 0.049 (0.05) | 0.065 (0.05) |
| KNN | Conventional | 0.007 (0.01) | 0.049 (0.05) | 1.176 (0.28) |
| Support Vector Machines | Conventional | 0.007 (0.01) | 0.049 (0.05) | 0.368 (0.26) |
| Random Forests | Conventional | 0.007 (0.01) | 0.049 (0.05) | 16.166 (31.06) |
| Threshold-based | Kangas et al. | 0.008 (0.03) | 0.54 (0.15) | 0.002 (0.02) |