Gianluca Bonifazi, Enrico Corradini, Domenico Ursino, Luca Virgili, Emiliano Anceschi, Massimo Callisto De Donato.
Abstract
In the last few decades, we have witnessed an increasing focus on safety in the workplace. ICT has always played a leading role in this context. One ICT sector that is increasingly important in ensuring safety at work is the Internet of Things and, in particular, the new architectures referring to it, such as SIoT, MIoT and Sentient Multimedia Systems. All these architectures handle huge amounts of data to extract predictive and prescriptive information. For this purpose, they often make use of Machine Learning. In this paper, we propose a framework that uses both Sentient Multimedia Systems and Machine Learning to support safety in the workplace. After the general presentation of the framework, we describe its specialization to a particular case, i.e., fall detection. For this application scenario, we describe a Machine Learning based wearable device for fall detection that we designed, built and tested. Moreover, we illustrate a safety coordination platform for monitoring the work environment, activating alarms in case of falls, and sending appropriate advice to help workers involved in falls.
Keywords: Decision trees; Fall detection; Industry 4.0; Internet of things; Machine learning; Safety at work; Sentient multimedia systems
Year: 2021 PMID: 34025207 PMCID: PMC8122213 DOI: 10.1007/s11042-021-10984-z
Source DB: PubMed Journal: Multimed Tools Appl ISSN: 1380-7501 Impact factor: 2.757
Fig. 1 The overall architecture of the proposed framework
Fig. 2 An overview of Personal Devices available for a worker
Structure and some example tuples of the merged dataset (one column per accelerometer and gyroscope axis)
| 14.529 | 67.413 | − 12.506 | 18.271 | − 955.762 | − 9.447 |
| 14.383 | 65.208 | − 12.375 | 14.776 | − 951.406 | − 4.152 |
| 14.310 | 65.671 | − 15.453 | 13.564 | − 950.841 | − 7.296 |
| 15.674 | 68.120 | − 13.910 | 19.656 | − 948.253 | − 4.601 |
| 14.814 | 68.475 | − 15.168 | 19.234 | − 949.437 | − 6.797 |
Feature definition (here $v_1, \dots, v_n$ denote the readings of a single sensor axis within a sample window)
| Feature | Definition |
|---|---|
| Maximum Value | $\max_{1 \le i \le n} v_i$ |
| Minimum Value | $\min_{1 \le i \le n} v_i$ |
| Mean Value | $\bar{v} = \frac{1}{n}\sum_{i=1}^{n} v_i$ |
| Variance | $\frac{1}{n}\sum_{i=1}^{n} \left(v_i - \bar{v}\right)^2$ |
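The four window statistics above can be sketched as a small feature extractor. The channel names below mirror the feature names used in the relevance table (`x_acc`, `y_gyro`, etc.), but their ordering within a sample is an assumption, as is the use of the population variance:

```python
import statistics

# Assumed channel order inside one sample: three accelerometer axes,
# then three gyroscope axes (the ordering is illustrative).
CHANNELS = ["x_acc", "y_acc", "z_acc", "x_gyro", "y_gyro", "z_gyro"]

def window_features(window):
    """Compute Max, Min, Mean and Variance for every channel of a window.

    `window` is a list of samples; each sample is a list of six readings.
    Returns a dict keyed like the relevance table, e.g. "y_acc_MEAN".
    """
    features = {}
    for i, name in enumerate(CHANNELS):
        values = [sample[i] for sample in window]
        features[f"{name}_MAX"] = max(values)
        features[f"{name}_MIN"] = min(values)
        features[f"{name}_MEAN"] = statistics.fmean(values)
        features[f"{name}_VARIANCE"] = statistics.pvariance(values)
    return features
```

With the MLC configuration reported later (a 37-sample window), each window of 37 six-channel readings would yield 24 features, matching the 24 rows of the relevance table.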
Fig. 3 Correlation matrix between the features
Feature relevance in identifying the correct class of activities
| Feature | Relevance |
|---|---|
| y_acc_MEAN | 0.2435 |
| y_acc_MAX | 0.1877 |
| x_acc_MIN | 0.1004 |
| y_acc_MIN | 0.0545 |
| x_gyro_MEAN | 0.0504 |
| z_gyro_MEAN | 0.0357 |
| z_gyro_MIN | 0.0336 |
| y_gyro_VARIANCE | 0.0326 |
| y_acc_VARIANCE | 0.0298 |
| z_acc_VARIANCE | 0.0293 |
| x_acc_MAX | 0.0283 |
| x_gyro_VARIANCE | 0.0269 |
| z_acc_MIN | 0.0255 |
| z_gyro_VARIANCE | 0.0221 |
| y_gyro_MIN | 0.0175 |
| z_acc_MAX | 0.0138 |
| x_acc_MEAN | 0.0127 |
| z_gyro_MAX | 0.0103 |
| z_acc_MEAN | 0.0095 |
| x_acc_VARIANCE | 0.0095 |
| y_gyro_MAX | 0.0090 |
| x_gyro_MIN | 0.0081 |
| x_gyro_MAX | 0.0052 |
| y_gyro_MEAN | 0.0041 |
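Relevance scores like the ones above are commonly derived from the impurity decrease a feature produces in a decision tree. As a minimal sketch (not the paper's exact procedure), the following computes the best Gini-impurity decrease obtainable by thresholding a single feature, which serves as a crude relevance proxy:

```python
def gini(labels):
    """Gini impurity of a list of binary labels (0 = Not Fall, 1 = Fall)."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2.0 * p * (1.0 - p)

def best_split_gain(values, labels):
    """Largest decrease in Gini impurity over all threshold splits on a
    single feature; a simple proxy for that feature's relevance."""
    n = len(values)
    parent = gini(labels)
    best = 0.0
    for threshold in sorted(set(values)):
        left = [l for v, l in zip(values, labels) if v <= threshold]
        right = [l for v, l in zip(values, labels) if v > threshold]
        child = (len(left) * gini(left) + len(right) * gini(right)) / n
        best = max(best, parent - child)
    return best
```

A feature that cleanly separates the two classes (as `y_acc_MEAN` does in Fig. 4) yields a gain close to the parent impurity, while an uninformative feature yields a gain near zero.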
Fig. 4 Activities labeled as Not Fall and Fall against the mean and the maximum accelerations on the Y axis
Accuracy, Sensitivity, Specificity values achieved by several classification algorithms when applied to our dataset (at left) and Worst Case Time Complexity of Training and Prediction (at right)
| Classifier | Accuracy | Sensitivity | Specificity | Worst Case Training Complexity | Worst Case Prediction Complexity |
|---|---|---|---|---|---|
| Decision Tree - C4.5 | 0.9487 | 0.9391 | 0.9566 | | |
| Decision Tree - CART | 0.9128 | 0.8910 | 0.9223 | | |
| Multilayer Perceptron | 0.9270 | 0.8829 | 0.9363 | | |
| k-Nearest Neighbors (k = 3) | 0.8790 | 0.8747 | 0.9263 | | |
| Logistic Regression | 0.7707 | 0.8599 | 0.7057 | | |
| Linear Discriminant Analysis | 0.7557 | 0.4956 | 0.9663 | | |
| Gaussian Naive Bayes | 0.7175 | 0.4947 | 0.8989 | | |
| Support Vector Machine | 0.7141 | 0.4103 | 0.9486 | | |
Fig. 5 SensorTile.box (STEVAL-MKSBOX1V1)
Fig. 6 Workflow of the Machine Learning Core of LSM6DSOX
Adopted configuration of the MLC component
| Parameter | Setting |
|---|---|
| Input data | Three axis accelerometer and gyroscope |
| MLC output frequency | 12.5 Hz |
| Accelerometer sampling frequency | 12.5 Hz |
| Gyroscope sampling frequency | 12.5 Hz |
| Full scale accelerometer | ± 8 g |
| Full scale gyroscope | ± 2000 dps |
| Sample window | 37 samples |
| Filtering | Second-order IIR filter with cutoff frequency of 4 Hz |
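The configuration above specifies a second-order IIR low-pass stage at 4 Hz on a 12.5 Hz stream; the table does not name the design, so the Butterworth biquad below (coefficients via the bilinear transform) is only one plausible realization:

```python
import math

def butter2_lowpass(fc, fs):
    """Second-order Butterworth low-pass coefficients (bilinear transform).

    Returns (b, a) with b = (b0, b1, b2) feed-forward and
    a = (a1, a2) feedback coefficients (a0 normalized to 1).
    """
    k = math.tan(math.pi * fc / fs)
    norm = 1.0 / (1.0 + math.sqrt(2.0) * k + k * k)
    b0 = k * k * norm
    b = (b0, 2.0 * b0, b0)
    a = (2.0 * (k * k - 1.0) * norm,
         (1.0 - math.sqrt(2.0) * k + k * k) * norm)
    return b, a

def iir_filter(samples, b, a):
    """Apply a direct-form I biquad to a sequence of samples."""
    x1 = x2 = y1 = y2 = 0.0
    out = []
    for x in samples:
        y = b[0] * x + b[1] * x1 + b[2] * x2 - a[0] * y1 - a[1] * y2
        x2, x1 = x1, x
        y2, y1 = y1, y
        out.append(y)
    return out
```

A 4 Hz cutoff sits below the 6.25 Hz Nyquist limit of the 12.5 Hz sampling rate, so the design is realizable; by construction the filter has unit DC gain, so slow orientation changes pass through unattenuated while sample-to-sample noise is smoothed.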
Fig. 7 Example of a workplace scenario (on the top) and description of how the verification of a fall and the transmission of the alarms occur (on the bottom)
A taxonomy for Not Fall Activities (on the left) and Fall Activities (on the right)
| Not fall activity | Fall activity |
|---|---|
| Walk slow (< 6 | Walk and fall forward after tripping |
| Walk fast (≥ 6 | Walk and fall sideways (right) after tripping |
| Run slow (< 8 | Walk and fall sideways (left) after tripping |
| Run fast (≥ 8 | Fake fainting and fall on the right while standing |
| Sit slowly in a chair | Fake fainting and fall forward while standing |
| Sit slowly on the ground | Fake fainting and fall on the left while standing |
| Sit abruptly in a chair | Run and fall forward after stumbling |
| Jump to reach an object located at the top | |
| Go up and down the stairs slowly (< 6 | |
| Go up and down the stairs quickly (≥ 6 | |
| Walk and stumble without falling down | |
| Jump forward from an elevated position | |
| Jump forward from the floor |
Confusion matrix for the output provided by our device
| (Real) Fall | (Real) Not Fall | |
|---|---|---|
| (Predicted) Fall | 1170 (TP) | 55 (FP) |
| (Predicted) Not Fall | 35 (FN) | 540 (TN) |
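The confusion matrix above determines the device's overall metrics directly; the short computation below uses the counts reported in the table:

```python
def classification_metrics(tp, fp, fn, tn):
    """Accuracy, sensitivity (recall on falls) and specificity
    from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    sensitivity = tp / (tp + fn)   # true positive rate
    specificity = tn / (tn + fp)   # true negative rate
    return accuracy, sensitivity, specificity

# Counts taken from the confusion matrix above.
acc, sens, spec = classification_metrics(tp=1170, fp=55, fn=35, tn=540)
# acc = 1710 / 1800 = 0.95
```

With these counts the device reaches 95% accuracy, a sensitivity of about 0.971 and a specificity of about 0.908.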
Fig. 8 Modules of the safety coordination platform