| Literature DB >> 35654623 |
Abdul Wasay Sardar, Farman Ullah, Jamshid Bacha, Jebran Khan, Furqan Ali, Sungchang Lee.
Abstract
The development of smartphone technologies has made computation abundant and pervasive. An activity recognition system based on mobile sensors enables continuous monitoring of human behavior and supports assisted living. This paper proposes a mobile sensor-based Epidemic Watch System (EWS) that leverages AI models to recognize a new set of activities for effective social-distance monitoring, infection-probability estimation, and COVID-19 spread prevention. The research focuses on recognizing user activities and behavior with respect to risk during the COVID-19 pandemic. The proposed EWS consists of a smartphone application for collecting sensor data on COVID-19-related activities, extracting features, classifying the activities, and issuing alerts for spread prevention. We collect a novel dataset of COVID-19-associated activities such as hand washing, hand sanitizing, nose-eye touching, and handshaking using the proposed EWS smartphone application. We evaluate several classifiers, including random forests, decision trees, support vector machines, and Long Short-Term Memory, on the collected dataset and attain a highest overall classification accuracy of 97.33%. We provide contact tracing of a COVID-19-infected person using GPS sensor data. The EWS activity monitoring, identification, and classification system assesses the infection risk posed to another person by a COVID-19-infected person. It detects everyday joint activities between an infected person and a healthy person, such as sitting, standing, or walking together, to help minimize the spread of pandemic diseases.
Keywords: Accelerometer; Activity classification; Activity recognition; COVID-19; Contact tracing; GPS; Gyroscope; Smartphone sensors
Year: 2022 PMID: 35654623 PMCID: PMC9137241 DOI: 10.1016/j.compbiomed.2022.105662
Source DB: PubMed Journal: Comput Biol Med ISSN: 0010-4825 Impact factor: 6.698
Fig. 1 Overview of activities needing care, identification, and recognition for COVID-19 pandemic spread minimization.
Literature review for activity recognition, monitoring, and classification.
| Research paper proposal | Activities | Classification algorithms | Sensors |
|---|---|---|---|
| Activity recognition and evaluation using mobile phone sensors | Walking, standing, and running | Naive Bayes, K-means clustering | Accelerometer |
| Smartphone-based human activity verification and identification using ML and DL | Walking, standing, running, sitting, upstairs, downstairs, inactive, laying | SVM, DT, KNN, SMO, DBN, ANN, NB, CNN, RNN | Accelerometer, Gyroscope |
| Human Activity Identification Using Smartphones | Walking, upstairs, downstairs, sitting, standing, and lying | DT, SVM, k-nearest neighbors (KNN), boosting, bagging, stacking | Accelerometer, Gyroscope |
| Human physical activity identification using smartphone sensors | Walking, running, sitting, standing, upstairs, and downstairs | Decision trees, logistic regression, and multilayer neural networks | Accelerometer, Gyroscope, Gravity sensor |
| In relation to Physical Activity Identification Using Smartphone Sensors | Walking, standing, and running | Naive Bayes, Support Vector Machines, Neural Networks, Logistic Regression, K-Nearest Neighbor, DT | Accelerometer, Gyroscope |
| Activity Recognition Using Smartphone Accelerometers | Walking, jogging, upstairs, downstairs, sitting, standing | Decision trees, k-Nearest Neighbor, Naive Bayes, and Bayes Net classifiers | Accelerometer |
| Human Activity Analysis and Recognition from Smartphones Using Machine Learning Techniques | Walking, upstairs, downstairs, sitting, standing, laying | Decision Tree (DT), Support Vector Machine (SVM), Random Forest (RF), and Artificial Neural Network (ANN) | Accelerometer, Gyroscope |
| Human Activity Recognition Using Smartphone | Walking, limping, jogging, upstairs, and downstairs | Quadratic, KNN, SVM, ANN | Accelerometer |
| A New Collection ELM for Human Activity Recognition Using Smartphone Sensors | Walking, fast walking, upstairs, downstairs, and running | Gaussian random projection, ANN, SVM, ELM, RF, and deep long short-term memory | Accelerometer, Gyroscope |
Fig. 2 The proposed architecture of the mobile sensor-based platform for human physical activity recognition for COVID-19 pandemic spread minimization.
Fig. 3 Placement of the smartphone (arm position) for activity data collection and recognition.
Dataset (KAU-COVID19-AR-Dataset): activities and number of samples.
| Activities | No. of samples |
|---|---|
| Walking | 15866 |
| Handwashing | 6422 |
| Standing | 5665 |
| Sitting | 2869 |
| Hand Sanitizing | 2105 |
| Nose–Eyes Touching | 1770 |
| Hand Shake | 1404 |
| Drink water | 1397 |
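The paper extracts features from raw accelerometer and gyroscope streams before classification. The sketch below shows one common way to do this: slice each sensor axis into overlapping fixed-size windows and compute simple time-domain statistics per window. The window size (128 samples), 50% overlap, and the choice of features (mean, standard deviation, min, max) are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def extract_windows(signal, window_size=128, step=64):
    """Slice a 1-D sensor stream into overlapping windows.

    window_size=128 with step=64 (50% overlap) is an illustrative
    choice, not a parameter reported by the paper.
    """
    windows = []
    for start in range(0, len(signal) - window_size + 1, step):
        windows.append(signal[start:start + window_size])
    return np.array(windows)

def window_features(windows):
    """Simple time-domain features per window: mean, std, min, max."""
    return np.column_stack([
        windows.mean(axis=1),
        windows.std(axis=1),
        windows.min(axis=1),
        windows.max(axis=1),
    ])

# Example: a synthetic accelerometer axis with 1000 samples
rng = np.random.default_rng(0)
acc_x = rng.normal(size=1000)
w = extract_windows(acc_x)       # shape (14, 128)
feats = window_features(w)       # shape (14, 4)
```

The resulting feature matrix (one row per window) is what a classifier such as a random forest or SVM would consume.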
Fig. 4 Accelerometer and gyroscope sensor data when two different persons perform the same activity (x-axis: no. of samples, y-axis: frequency).
Fig. 5 Comparison of accelerometer and gyroscope sensor responses when different persons perform a similar activity (x-axis: no. of samples, y-axis: frequency).
Fig. 6 Accelerometer sensor response for all activities (x-axis: no. of samples, y-axis: frequency).
Fig. 7 Gyroscope sensor data for all activities (x-axis: no. of samples, y-axis: frequency).
Fig. 8 Classification algorithms of the mobile sensor-based platform for human physical activity recognition for COVID-19 pandemic spread minimization.
Distance calculated from GPS sensor latitude/longitude data for Persons 1 and 2 sitting together two meters apart.
| Lat 1 | Long 1 | Lat 2 | Long 2 | Actual distance (m) | Calculated distance (m) |
|---|---|---|---|---|---|
| 37.6021337 | 126.8648657 | 37.6021786 | 126.8648785 | 2 | 5.11 |
| 37.6021337 | 126.8648657 | 37.6021786 | 126.8648785 | 2 | 5.11 |
| 37.6021337 | 126.8648657 | 37.6021754 | 126.864887 | 2 | 5.00 |
| 37.6021337 | 126.8648657 | 37.6021777 | 126.8648811 | 2 | 5.07 |
| 37.6021337 | 126.8648657 | 37.6021786 | 126.8648785 | 2 | 5.11 |
| 37.6021337 | 126.8648657 | 37.6021879 | 126.864881 | 2 | 6.17 |
| 37.6021337 | 126.8648657 | 37.6021904 | 126.864884 | 2 | 6.50 |
| 37.6021337 | 126.8648657 | 37.6021891 | 126.8648884 | 2 | 6.47 |
Distance calculated from GPS sensor latitude/longitude data for Persons 1 and 2 standing together two meters apart.
| Lat 1 | Long 1 | Lat 2 | Long 2 | Actual distance (m) | Calculated distance (m) |
|---|---|---|---|---|---|
| 37.6021831 | 126.864904 | 37.6022178 | 126.8649444 | 2 | 5.24 |
| 37.6022023 | 126.8649205 | 37.602222 | 126.8649763 | 2 | 5.38 |
| 37.6021998 | 126.8649209 | 37.602222 | 126.8649764 | 2 | 5.47 |
| 37.6021999 | 126.8649208 | 37.6022221 | 126.8649764 | 2 | 5.48 |
| 37.6021986 | 126.8649135 | 37.602222 | 126.8649763 | 2 | 6.11 |
| 37.6022004 | 126.8649057 | 37.6022218 | 126.8649761 | 2 | 6.64 |
| 37.6021985 | 126.8649056 | 37.6022219 | 126.8649761 | 2 | 6.73 |
| 37.6021983 | 126.8649056 | 37.6022219 | 126.8649761 | 2 | 6.74 |
Distance calculated from GPS sensor latitude/longitude data for Persons 1 and 2 walking together two meters apart.
| Lat 1 | Long 1 | Lat 2 | Long 2 | Actual distance (m) | Calculated distance (m) |
|---|---|---|---|---|---|
| 37.6027312 | 126.8650455 | 37.6027326 | 126.8650581 | 1 | 1.12 |
| 37.6025896 | 126.8649641 | 37.6025962 | 126.8649529 | 1 | 1.22 |
| 37.6025587 | 126.8649672 | 37.6025551 | 126.8649446 | 2 | 2.03 |
| 37.6025005 | 126.8649663 | 37.6025105 | 126.864943 | 2 | 2.33 |
| 37.6024674 | 126.8649676 | 37.6024692 | 126.8649401 | 2 | 2.43 |
| 37.6027296 | 126.865026 | 37.6027326 | 126.8650581 | 2 | 2.84 |
| 37.60261 | 126.8649718 | 37.6026387 | 126.8649694 | 2 | 3.19 |
| 37.602714 | 126.8650441 | 37.6026872 | 126.8650625 | 2 | 3.39 |
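The tables above convert pairs of GPS fixes into a distance in meters. The paper does not state which distance formula it uses; the haversine great-circle formula below is one plausible way to reproduce values of this magnitude (applied to the first row of the sitting table, it yields about 5.12 m versus the reported 5.11 m).

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes.

    Uses the haversine formula with a mean Earth radius of 6371 km.
    This is a sketch for reproducing the tabulated values, not the
    paper's confirmed implementation.
    """
    r = 6371000.0  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# First row of the "sitting together" table (reported distance: 5.11 m)
d = haversine_m(37.6021337, 126.8648657, 37.6021786, 126.8648785)
```

The gap between the 2 m ground truth and the 5-6 m computed values in the sitting/standing tables reflects consumer GPS positioning error rather than the formula itself.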
Fig. 9 Developed mobile application for sensor data collection and contact tracing.
Fig. 10 Long Short-Term Memory (LSTM) training loss, training accuracy, validation loss, and validation accuracy.
Fig. 11 Long Short-Term Memory (LSTM) confusion matrix.
Fig. 12 Random Forest confusion matrix (90% training, 10% testing).
Fig. 13 Decision Tree confusion matrix (90% training, 10% testing).
Fig. 14 K-Nearest Neighbors (KNN) confusion matrix (90% training, 10% testing).
Fig. 15 Support Vector Machine (SVM) confusion matrix (90% training, 10% testing).
Performance summary of each algorithm on the KAU-COVID19-AR-Dataset. Each metric is reported for three training/testing (TR/TE) splits.
| Sr. No | Model | Accuracy (%) | | | Mean absolute error | | | Coefficient of determination | | |
|---|---|---|---|---|---|---|---|---|---|---|
| | | Split 1 | Split 2 | Split 3 | Split 1 | Split 2 | Split 3 | Split 1 | Split 2 | Split 3 |
| 1 | Long Short-Term Memory (LSTM) | 91.25 | 91.33 | 92.90 | 0.031 | 0.030 | 0.028 | 0.680 | 0.678 | 0.699 |
| 2 | | | | | | | | | | |
| 3 | Decision tree | 89.33 | 90.66 | 92.66 | 0.207 | 0.160 | 0.204 | 0.860 | 0.913 | 0.826 |
| 4 | K-Nearest Neighbors (KNN) | 93.33 | 94.22 | 96.00 | 0.163 | 0.149 | 0.073 | 0.829 | 0.842 | 0.952 |
| 5 | Support Vector Machine (SVM) | 93.33 | 94.00 | 94.66 | 0.162 | 0.153 | 0.127 | 0.842 | 0.843 | 0.866 |
| 6 | Random Forest Regression | 86.00 | 84.00 | 82.66 | 0.378 | 0.400 | 0.387 | 0.660 | 0.659 | 0.739 |
| 7 | Decision Tree Regression | 88.88 | 87.00 | 88.00 | 0.267 | 0.307 | 0.287 | 0.711 | 0.727 | 0.760 |
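The table reports three metrics per model: accuracy, mean absolute error, and the coefficient of determination (R²). A minimal sketch of how these are computed from integer-encoded activity labels is shown below; the toy label arrays are illustrative and not drawn from the paper's data.

```python
import numpy as np

def accuracy(y_true, y_pred):
    """Fraction of predictions that exactly match the true labels."""
    return float(np.mean(y_true == y_pred))

def mean_absolute_error(y_true, y_pred):
    """Mean absolute difference between encoded labels."""
    return float(np.mean(np.abs(y_true - y_pred)))

def r2_score(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return float(1 - ss_res / ss_tot)

# Toy integer-encoded labels (0..7 for the eight activities)
y_true = np.array([0, 1, 2, 3, 4, 5, 6, 7, 0, 1])
y_pred = np.array([0, 1, 2, 3, 4, 5, 6, 7, 1, 1])
acc = accuracy(y_true, y_pred)             # 0.9
mae = mean_absolute_error(y_true, y_pred)  # 0.1
```

Note that MAE and R² on label-encoded classes depend on the arbitrary label ordering, which is presumably why the regression variants (rows 6-7) are listed separately.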
1. Set input units, LSTM units, output units, and the optimizer to define the LSTM network (L).
2. Normalize the dataset (Di) into values from 0 to 1.
3. Select a training window size (tw) and organize Di accordingly.
4. Train the network (L).
5. Run predictions using the trained network (L).
6. Calculate the loss function.
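The normalization and windowing steps of the procedure above can be sketched as follows. Min-max scaling to [0, 1] and a window of length tw are the parts the procedure specifies; the exact input/target layout (each window of tw rows paired with the following row) and the toy data are assumptions for illustration.

```python
import numpy as np

def min_max_normalize(data):
    """Scale each column of the dataset Di into [0, 1] (step 2)."""
    lo = data.min(axis=0)
    hi = data.max(axis=0)
    return (data - lo) / (hi - lo)

def make_training_windows(data, tw):
    """Organize the dataset into sequences of length tw (step 3).

    Each sample is a window of tw consecutive rows; the target is the
    row that follows it. This layout is an assumption, since the
    source only names the window size tw.
    """
    x, y = [], []
    for i in range(len(data) - tw):
        x.append(data[i:i + tw])
        y.append(data[i + tw])
    return np.array(x), np.array(y)

# Toy sensor matrix: 20 time steps, 3 channels
data = np.arange(60, dtype=float).reshape(20, 3)
norm = min_max_normalize(data)
x, y = make_training_windows(norm, tw=5)  # x: (15, 5, 3), y: (15, 3)
```

The windowed tensor x has the (samples, time steps, channels) shape that recurrent layers such as an LSTM expect as input.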