
Mobile sensors based platform of Human Physical Activities Recognition for COVID-19 spread minimization.

Abdul Wasay Sardar1, Farman Ullah2, Jamshid Bacha3, Jebran Khan4, Furqan Ali5, Sungchang Lee6.   

Abstract

The development of smartphone technologies has made computation abundant and pervasive. An activity recognition system using mobile sensors enables continuous monitoring of human behavior and assisted living. This paper proposes a mobile sensors-based Epidemic Watch System (EWS) that leverages AI models to recognize a new set of activities for effective social-distance monitoring, estimation of the probability of infection, and prevention of COVID-19 spread. The research focuses on the recognition of user activities and behavior with respect to the risks and preventive measures of the COVID-19 pandemic. The proposed EWS consists of a smartphone application for collecting sensor data on COVID-19-related activities, extracting features, classifying the activities, and providing alerts for spread prevention. We collect a novel dataset of COVID-19-associated activities, such as hand washing, hand sanitizing, nose-eyes touching, and handshaking, using the proposed EWS smartphone application. We evaluate several classifiers, such as random forests, decision trees, support vector machines, and Long Short-Term Memory, on the collected dataset and attain a highest overall classification accuracy of 97.33%. We provide contact tracing of COVID-19-infected persons using GPS sensor data. The EWS activity monitoring, identification, and classification system assesses another person's risk of infection from a COVID-19-infected person. It detects everyday activities between a COVID-19-infected person and a healthy person, such as sitting, standing, or walking together, to minimize the spread of pandemic diseases.
Copyright © 2022 Elsevier Ltd. All rights reserved.


Keywords:  Accelerometer; Activity classification; Activity recognition; COVID-19; Contact tracing; GPS; Gyroscope; Smartphone sensors


Year:  2022        PMID: 35654623      PMCID: PMC9137241          DOI: 10.1016/j.compbiomed.2022.105662

Source DB:  PubMed          Journal:  Comput Biol Med        ISSN: 0010-4825            Impact factor:   6.698


Introduction

Severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), the virus that causes coronavirus disease 2019 (COVID-19), emerged at the end of 2019. The World Health Organization (WHO) declared the disease a pandemic in March 2020 [1], [2]. According to updates released by global healthcare agencies and state governments, the pandemic has affected millions of people worldwide. With the rise of COVID-19, mobile health applications have become essential for contact tracing, information dissemination, and pandemic management in general [3]. The World Health Organization [4], [5] defined standards for preventing the spread of COVID-19: (i) keep a social distance of at least 1 m from others, (ii) wash and sanitize the hands, and (iii) isolate from, and avoid direct interaction with, the infectious excretions of a COVID-19 patient. A contact is an individual who has had face-to-face contact with a COVID-19 case within 6 ft, or an individual who has shared a confined environment (such as a classroom, meeting room, or hospital waiting area) with a COVID-19 case at a distance of 2 m [6]. To reduce the spread of COVID-19, we must identify the activities that increase it [7]. There are two kinds of activities: group activities and individual personal activities. Group activities prohibited during the COVID-19 pandemic include handshaking, two people sitting together (less than 6 ft apart), two people standing together (less than 6 ft apart), traveling in public transport, failing to maintain social distance in a waiting area, and two or more people walking together (less than 6 ft apart). Personal activities include touching one's nose, mouth, or eyes, hand sanitizing, hand washing, sitting, standing, walking, handshaking, and drinking water. In this paper, we propose the detection and recognition of both the activities that are not allowed and the activities that need to be done frequently to minimize the spread of COVID-19.
Human activity recognition is the process of identifying individual or group activities using mobile and wearable sensors [8]. Smartphones have been widely used for activity recognition because they are equipped with sensors useful for the task, such as motion sensors (accelerometer and gyroscope) and location sensors (GPS). Among smartphone sensors, the accelerometer has received the most attention in activity recognition research; in recent years, however, other sensors, such as the gyroscope and magnetometer, have been combined with the accelerometer to improve recognition performance [9], [10], [11]. Mobile sensors are attractive because they are economical, consume little power, are highly capable, and are largely independent of the environment. Moreover, nearly everyone carries a mobile device nowadays. This massive incorporation of mobile sensors into daily life has driven the growing interest in mobile sensor-based activity recognition, with several studies devoted to examining the suitability of mobile sensors for recognizing human activities [12]. Recognition of human activities from mobile device sensors is usually formulated as a multivariate time-series classification problem. Automated feature extraction is possible within a deep-learning framework through the construction of a deep model with several layers [13]. Human activity recognition involves several stages: time-series data pre-processing and segmentation, feature extraction, and classification with a suitable algorithm. The main contributions and highlights of this paper are an overview of the activities that require care, identification, and recognition for minimizing the spread of the COVID-19 pandemic, and activity recognition, monitoring, and classification that are helpful in minimizing this spread.
Most of the previous literature focuses on the activities of healthy or elderly persons, such as walking, sitting, standing, and going upstairs and downstairs. This paper focuses on activities that help minimize the spread of COVID-19, including novel activities such as hand washing, handshaking, hand sanitizing, and nose-eyes touching. Fig. 1 gives an overview of the activities recognized in this paper to minimize the spread. The main focus of this article is the recognition of COVID-19-related activities. After activity classification, however, we use GPS sensor data to detect whether a person is interacting with another person less than 2 m away. Contact tracing is an essential factor in minimizing the spread of COVID-19 [14]. Consider two people sitting, standing, or walking together less than two meters apart, where one of them is later found to be infected with COVID-19. Using GPS sensor data, we first calculate the distance between the two people and then determine the activity between them. In this way, we estimate the possibility of one person infecting the other, e.g., from sitting, standing, or walking together closer than 2 m apart, handshaking, or touching the nose or eyes.
Fig. 1

Overview of the activities that require care, identification, and recognition for minimizing the spread of the COVID-19 pandemic.

The main challenges in the proposed platform are: (i) reviewing the literature on activity recognition, monitoring, and classification; (ii) understanding and compiling data on various physical activities for minimizing the spread of the epidemic disease (COVID-19) using smartphone sensors; (iii) interpreting the data acquired from smartphone sensors in the form of feature vectors and classifying physical activities using these features; and (iv) implementing a real-time system to evaluate and recognize users' physical activities, e.g., hand washing, hand sanitizing, and touching one's nose or eyes. To address these challenges, the main contributions of the paper are: (i) We develop a mobile application using the Flutter development platform; it provides user registration/login, data acquisition for the accelerometer, gyroscope, speed, and GPS sensors, and alert messages for contact tracing. (ii) We collect a novel dataset named "KAU-COVID19-AR-Dataset" for physical activities such as hand washing, hand sanitizing, and nose-eyes touching, using the developed mobile application. (iii) Features such as the mean, median, variance, and standard deviation are extracted, and a comprehensive performance evaluation of supervised learning-based classifiers is carried out on the feature data to select the one with the best identification rate. (iv) We implement a novel real-time smartphone-based framework that assembles the data from smartphone sensors and sends it to a server; the activity is identified from the received data and reported back to the user. (v) We develop GPS-based contact tracing on top of the COVID-19 activity recognition. The rest of the paper is organized as follows: Section 2 briefly describes the related work and background on the sensors, the activities recognized, and the algorithms.
The proposed methodology for the mobile application development and the recognition of COVID-19-related physical activities is explained in Section 3. Section 4 introduces contact tracing on top of activity recognition. Section 5 presents and discusses the results, and finally we conclude our findings in Section 6.

Related work

Activity monitoring and recognition with wearable devices is a subject of interest for many researchers. Smartphones are now ubiquitous, and their built-in sensors (accelerometer, gyroscope, magnetometer, speed, GPS, and Bluetooth) are helpful for activity recognition. In the literature, the accelerometer and gyroscope are the most widely used sensors for activity recognition [15], [16], [17], [20]. These sensors not only enable the detection, recognition, and classification of normal activities, including sitting, standing, lying, walking, and going upstairs and downstairs, but also support fitness tracking, health monitoring, fall detection, behavior-based context awareness, and self-managing systems. In one system, wearable sensors (accelerometer, gyroscope, and heartbeat) transfer their data to a mobile device over Bluetooth, where a smartphone app combines them with the phone's own accelerometer, gyroscope, and magnetometer readings; this system detects activities including sitting, standing, getting up, lying down, bending, putting a hand back, and stretching a hand [24]. Another study applied eight different machine learning techniques to smartphone sensor data for activity recognition and classification: KNN, SVM, Random Forest, AdaBoost, Decision Tree, Logistic Regression, Naive Bayes, and Neural Network [25]. To date, however, researchers have not addressed COVID-19 prevention and monitoring through human activity recognition, monitoring, and classification using smartphone sensor data. Related work on human activity recognition, monitoring, and classification from smartphone sensor data is summarized in Table 1.
Table 1

Literature review for the activity recognition, monitoring and classification.

Ref No. | Research paper proposal | Activities | Classification algorithms | Sensors
[15] | Activity recognition and evaluation using mobile phone sensors | Walking, standing and running | Naive Bayes algorithm, K-means clustering | Accelerometer
[16] | Smartphone based human activity verifying and identification using ML and DL | Walking, standing, running, sitting, upstairs, downstairs, inactive, laying | SVM, DT, KNN, SMO, DBN, ANN, NB, CNN, RNN | Accelerometer, Gyroscope
[17] | Human Activity Identification Using Smartphones | Walking, upstairs, downstairs, sitting, standing and lying | DT, SVM, k-nearest neighbors (KNN), boosting, bagging, stacking | Accelerometer, Gyroscope
[18] | Human physical activity identification using smartphone sensors | Walking, running, sitting, standing, upstairs, and downstairs | Decision trees, logistic regression, and multilayer neural networks | Accelerometer, Gyroscope and Gravity sensor
[19] | In relation to Physical Activity Identification Using Smartphone Sensors | Walking, standing and running | Naive Bayes, Support vector machines, Neural Networks, Logistic Regression, K Nearest Neighbor, DT | Accelerometer, Gyroscope
[20] | Activity Recognition using smart Phone Accelerometers | Walking, jogging, upstairs, downstairs, sitting, standing | Decision trees, k-Nearest Neighbor, Naïve Bayes, and Bayes Net classifiers | Accelerometer
[21] | Human Activity Analysis and Recognition from Smartphones using Machine Learning Techniques | Walking, upstairs, downstairs, sitting, standing, laying | Decision Tree (DT), Support Vector Machine (SVM), Random Forest (RF), and Artificial Neural Network (ANN) | Accelerometer, Gyroscope
[22] | Human Activity Recognition using Smartphone | Walking, limping, jogging, upstairs, and downstairs | Quadratic, KNN, SVM, ANN | Accelerometer
[23] | A New Collection ELM for Human Activity Recognition Using Smartphone Sensors | Walking, fast walking, upstairs, downstairs, and running | Gaussian random projection, ANN, SVM, ELM, RF, and deep long short-term memory | Accelerometer, Gyroscope
From the literature review, it is clear that for human activity recognition using smartphone sensors, researchers have used only the accelerometer, gyroscope, and magnetometer sensors for a specific set of activities: walking, standing, sitting, lying, and going upstairs and downstairs.

The proposed platform for mobile sensors based COVID-19 related activities recognition and contact tracing

Fig. 2 shows the proposed architecture for activity recognition, monitoring, and classification in everyday and pandemic situations for minimizing the spread of COVID-19 using artificial intelligence (AI). The architecture consists of a mobile app, developed with the Flutter mobile app development platform, that collects a dataset from multiple sensors. With the mobile application, we collect data from various sensors, including the GPS, accelerometer, gyroscope, and speed sensors. We first store all sensor data in the mobile app's local storage in CSV format and then send it to the server for recognition, monitoring, and classification. On the server side, we store the sensor data in table format and then apply AI techniques to this dataset.
Fig. 2

The proposed architecture of the mobile sensors-based platform for human physical activity recognition for minimizing the spread of the COVID-19 pandemic.

Client application development

We develop the mobile app with the Flutter development platform. It includes language selection, user registration, a login screen, one-time PIN code (OTP) email verification, activity recognition, and contact tracing. A user must register for contact tracing and activity recognition between two individuals. Registration collects the user's basic information, including username, full name, age, email address, gender, phone number, and password; after registration, the user verifies their email address with a one-time PIN code (OTP). After email verification, the user reaches the main screen, where permission is required for storing CSV files and acquiring GPS data; the user must allow the app to store their sensor data and access their location. The accelerometer, gyroscope, speed, and GPS sensors are interfaced in this app, and all of their data are stored locally on the mobile device in CSV format.

Dataset description

We collect the novel dataset (KAU-COVID19-AR-Dataset) by performing multiple activities: handwashing, hand sanitizing, handshaking, touching the nose and eyes, sitting, standing, walking, and drinking water. For dataset collection and higher accuracy, the smartphone's position is critical. We place the smartphone on the forearm, in a position similar to a wristwatch; Fig. 3 shows this placement. The placement resembles that of a smartwatch, which has achieved market penetration in the past [26]. It is important to note that the smartphone's position on the participants' bodies is fixed. For these experiments, we used smartphones from different manufacturers, oriented lengthwise along the forearm. We recorded the data for all activities at ten samples per second; this sampling rate is sufficient to recognize human physical activities [27], and due to recent advances, even lower rates can suffice for activity recognition [9], [10]. We acquire the accelerometer, gyroscope, and speed sensor data at ten samples per second and the GPS sensor data at one sample per second [28]. Table 2 lists the number of samples per activity. Fig. 4 plots the accelerometer and gyroscope data for two different persons performing the same activity; the patterns are almost similar. Fig. 5 plots the accelerometer and gyroscope data for two different activities of the same person; here the patterns differ. Fig. 6 shows the accelerometer data plot and Fig. 7 the gyroscope data plot for all activities.
Fig. 3

Placement of the smartphone (arm position) for activity data collection and recognition.

Table 2

Dataset (KAU-COVID19-AR-Dataset): Activities, No of samples.

Activities | No of samples
Walking | 15866
Handwashing | 6422
Standing | 5665
Sitting | 2869
Hand Sanitizing | 2105
Nose–Eyes Touching | 1770
Hand Shake | 1404
Drink water | 1397
Fig. 4

Accelerometer and gyroscope sensor data plots when two different persons perform the same activity (x-axis: No. of samples, y-axis: frequency).

Fig. 5

Accelerometer and gyroscope sensor response comparison when a similar activity is performed by different persons (x-axis: No. of samples, y-axis: frequency).

Fig. 6

Accelerometer sensor response for all activities (x-axis: No. of samples, y-axis: frequency).

Fig. 7

Gyroscope sensor data plot for all activities (x-axis: No. of samples, y-axis: frequency).


Transfer of mobile app data to the server

After storing the sensor data locally on the mobile device, we need to send it to the server for further processing, activity classification, and contact tracing. There are three options for uploading data to the server: real-time, periodic, and backup. In real-time uploading, data is uploaded to the server every second, and the local CSV file is then deleted; the drawback is that data is lost when the internet is unavailable, so we store the data locally and send it once the internet becomes available. In periodic uploading, data is uploaded every 30 min. In backup uploading, data is uploaded every 12 h, and the CSV file is deleted only after the data is uploaded successfully. The server stores the uploaded data in PostgreSQL, and all queries for data uploading and user registration/login are written in NodeJS.
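The buffer-then-upload behavior described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: `flush_sensor_log` and the `upload_batch` transport callable are hypothetical names.

```python
import csv
import os

def flush_sensor_log(csv_path, upload_batch):
    """Try to upload the locally buffered sensor rows; keep the file on failure.

    `upload_batch` is a hypothetical transport callable that sends a list of
    rows to the server and returns True on success (e.g. an HTTP POST).
    Returns the number of rows uploaded.
    """
    if not os.path.exists(csv_path):
        return 0  # nothing buffered yet
    with open(csv_path, newline="") as f:
        rows = list(csv.reader(f))
    if rows and upload_batch(rows):
        os.remove(csv_path)  # delete the local CSV only after a confirmed upload
        return len(rows)
    return 0  # internet unavailable: data stays buffered locally
```

Such a function could be scheduled every second (real-time), every 30 min (periodic), or every 12 h (backup), matching the three upload modes above.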

Feature extraction

Sensor and device data contain much hidden information as well as noise. Feature extraction can uncover helpful hidden information in the sample data and remove noise introduced during data gathering or by the sensors themselves. Choosing appropriate features reduces the time and memory required by the classification method and can consequently improve its performance; a classification algorithm with minimal classification time and memory is advantageous for activity recognition on smartphones. Feature extraction is an essential step in the development of any classifier [29]. An activity recognition system does not solve the classification task directly on raw acceleration data; classification is usually performed after representing the data informatively as feature vectors. The collected dataset consists of eight classes and three sensor streams. For feature extraction, we use 2.5-second windows: each class dataset is first segmented individually into smaller subsets and then windowed separately. We choose a window of 25 samples, corresponding to 2.5 s of accelerometer, gyroscope, and speed data, for feature extraction from each class, because it captures sufficient cycles of activities such as hand washing, hand sanitizing, and handshaking. We compute the mean, median, variance, and standard deviation of each activity individually over each window of 25 samples, i.e., every 2.5 s, since the data collection rate is ten samples per second.
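As a minimal sketch of this windowing step (the helper name `window_features` is ours, not the paper's), the four features can be computed over non-overlapping 25-sample windows of one sensor axis like this:

```python
from statistics import mean, median, pvariance, pstdev

WINDOW = 25  # 2.5 s of data at 10 samples per second

def window_features(samples, window=WINDOW):
    """Split a 1-D sensor stream into non-overlapping windows and compute
    the four features used here: mean, median, variance, std. deviation."""
    features = []
    for start in range(0, len(samples) - window + 1, window):
        w = samples[start:start + window]
        features.append((mean(w), median(w), pvariance(w), pstdev(w)))
    return features
```

In practice this would be applied per axis, per sensor, and per activity class, and the resulting feature vectors fed to the classifiers.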

Classification algorithms

For the classification of each activity, we apply multiple classification algorithms, including Long Short-Term Memory (LSTM), Random Forest, Decision Tree, K-Nearest Neighbors (KNN), and Support Vector Machine (SVM), as shown in Fig. 8. We also apply some regression models to the sensor dataset, including Random Forest Regression and Decision Tree Regression.
Fig. 8

The classification algorithms of the mobile sensors-based platform for human physical activity recognition for minimizing the spread of the COVID-19 pandemic.

Long Short-Term Memory (LSTM) models are a kind of recurrent neural network (RNN) capable of learning order dependence in sequence prediction problems [30]. LSTM is an RNN architecture designed to model short-term sequences and their long-range dependencies more accurately than a conventional RNN, and LSTM networks are used mainly for time-series analysis. However, an LSTM can be used as a classifier if the output layer is a dense layer followed by an activation function; we use softmax as the activation function at the output layer. Since the number of activities is eight, the number of neurons in the output layer is also eight. LSTM models are well suited for classification, recognition, and prediction; the procedure is shown in Algorithm 1. Random Forest is a machine learning algorithm built on tree structures that combines the proficiency of multiple decision trees in making decisions [31]. As the name suggests, it is a "forest" of trees: the Random Forest algorithm combines several (randomly generated) decision trees to produce the final output. A decision tree is a flowchart-like structure in which each internal node represents a "test" on an attribute, each branch represents the outcome of the test, and each leaf node holds a class label (the decision reached after processing all attributes) [32]. The K-Nearest Neighbors (KNN) model simply stores the dataset from the training stage; when it receives new data, it classifies that data into the class most similar to it [33]. For example, suppose we have accelerometer, gyroscope, and speed sensor data that resembles either the handwashing or the handshaking class, and we want to know which of the two activities it is. We can use the KNN algorithm for this classification, as it relies on a similarity measure: KNN locates the training features most similar to the new data among the handwashing and handshaking sensor data and, based on the most similar features, assigns it to either the handwashing or the handshaking category. The SVM algorithm constructs a line or a hyperplane that divides the data into classes [34]. For detecting and classifying the human activities that help minimize the spread of COVID-19, the SVM divides the sensor data into eight classes, i.e., a line or a hyperplane separates the whole dataset into eight different categories [35].
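The similarity-based KNN classification described above can be illustrated with a minimal pure-Python sketch (the function name and the toy feature vectors are ours, not the paper's):

```python
from math import dist  # Euclidean distance between two points (Python 3.8+)

def knn_predict(train_X, train_y, x, k=3):
    """Classify feature vector x by majority vote among its k nearest
    training vectors, using Euclidean distance as the similarity measure."""
    neighbors = sorted(zip(train_X, train_y), key=lambda p: dist(p[0], x))[:k]
    labels = [label for _, label in neighbors]
    return max(set(labels), key=labels.count)

# Hypothetical 2-D feature vectors (e.g. mean and variance of a window):
X = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
y = ["handwashing"] * 3 + ["handshaking"] * 3
```

A new window whose features fall near the first cluster would be assigned to "handwashing", and one near the second cluster to "handshaking".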

Contact tracing using GPS sensor data for minimizing the COVID-19 spread

For contact tracing, we used the GPS sensor [36] dataset collected with the developed mobile app at a one-second sample rate. We store the GPS sensor data locally on the mobile device and then send it to the server. On the server, the coordinates of one device are compared with the GPS data of other devices within a radius of 10 m, and for devices within that radius, the activity state between the two users is checked. With the accelerometer, gyroscope, and speed sensors, we can recognize, monitor, and classify everyday activities between two people, e.g., two people sitting together, standing together, or walking together. After determining the activity between the two users, the distance between them is calculated from the GPS sensor data [37]. In an indoor environment, the GPS position is not precise [38], which affects the GPS data collected when two people are sitting together very close to each other. We calculate the distance using the haversine formula [39]; for more accurate distance calculation from GPS data, deep learning techniques would be needed. Table 3 shows the actual distance and the distance calculated by the haversine formula when two people are sitting together, Table 4 when they are standing together, and Table 5 when they are walking together. Eq. (1) shows the haversine formula, d = 2r·arcsin(sqrt(sin²(Δφ/2) + cos(φ1)·cos(φ2)·sin²(Δλ/2))), in which φ, λ, and r denote the latitude, longitude, and radius of the Earth, respectively.
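A standard implementation of the haversine distance in Eq. (1) looks like the following sketch; the mean Earth radius of 6371 km is our assumption, as the paper does not state the value it uses:

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6371000  # assumed mean Earth radius r in Eq. (1), meters

def haversine(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes (degrees)."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)       # Δφ
    dlam = radians(lon2 - lon1)       # Δλ
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))
```

Applied to GPS fixes logged about two meters apart indoors, this can return distances of several meters, consistent with the indoor-imprecision caveat above.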
Table 3

Calculating distance using GPS sensor latitude/longitude data for Persons 1 and 2 sitting together at a two-meter distance.

Lat 1 | Long 1 | Lat 2 | Long 2 | Actual distance (m) | Calculated distance (m)
37.6021337 | 126.8648657 | 37.6021786 | 126.8648785 | 2 | 5.11
37.6021337 | 126.8648657 | 37.6021786 | 126.8648785 | 2 | 5.11
37.6021337 | 126.8648657 | 37.6021754 | 126.864887 | 2 | 5.00
37.6021337 | 126.8648657 | 37.6021777 | 126.8648811 | 2 | 5.07
37.6021337 | 126.8648657 | 37.6021786 | 126.8648785 | 2 | 5.11
37.6021337 | 126.8648657 | 37.6021879 | 126.864881 | 2 | 6.17
37.6021337 | 126.8648657 | 37.6021904 | 126.864884 | 2 | 6.50
37.6021337 | 126.8648657 | 37.6021891 | 126.8648884 | 2 | 6.47
Table 4

Calculating distance using GPS sensor latitude/longitude data for Persons 1 and 2 standing together at a two-meter distance.

Lat 1 | Long 1 | Lat 2 | Long 2 | Actual distance (m) | Calculated distance (m)
37.6021831 | 126.864904 | 37.6022178 | 126.8649444 | 2 | 5.24
37.6022023 | 126.8649205 | 37.602222 | 126.8649763 | 2 | 5.38
37.6021998 | 126.8649209 | 37.602222 | 126.8649764 | 2 | 5.47
37.6021999 | 126.8649208 | 37.6022221 | 126.8649764 | 2 | 5.48
37.6021986 | 126.8649135 | 37.602222 | 126.8649763 | 2 | 6.11
37.6022004 | 126.8649057 | 37.6022218 | 126.8649761 | 2 | 6.64
37.6021985 | 126.8649056 | 37.6022219 | 126.8649761 | 2 | 6.73
37.6021983 | 126.8649056 | 37.6022219 | 126.8649761 | 2 | 6.74
Table 5

Calculating distance using GPS sensor latitude/longitude data for Persons 1 and 2 walking together two meters apart.

Lat 1 | Long 1 | Lat 2 | Long 2 | Actual distance (m) | Calculated distance (m)
37.6027312 | 126.8650455 | 37.6027326 | 126.8650581 | 1 | 1.12
37.6025896 | 126.8649641 | 37.6025962 | 126.8649529 | 1 | 1.22
37.6025587 | 126.8649672 | 37.6025551 | 126.8649446 | 2 | 2.03
37.6025005 | 126.8649663 | 37.6025105 | 126.864943 | 2 | 2.33
37.6024674 | 126.8649676 | 37.6024692 | 126.8649401 | 2 | 2.43
37.6027296 | 126.865026 | 37.6027326 | 126.8650581 | 2 | 2.84
37.60261 | 126.8649718 | 37.6026387 | 126.8649694 | 2 | 3.19
37.602714 | 126.8650441 | 37.6026872 | 126.8650625 | 2 | 3.39

Results and discussions

In this section, we briefly introduce the developed mobile app for COVID-19 activity recognition. Fig. 9 shows the mobile app developed with the Flutter app development platform. Its main functionalities include user registration, user login, the main screen, and background functionalities such as sensor data collection, storing data locally in a CSV file, sending data to the server, activity recognition, and contact tracing.
Fig. 9

Developed Mobile Application for the sensors data collection and contact tracing.

Fig. 10 shows the training accuracy, training loss, validation accuracy, and validation loss of the LSTM model, which reaches its maximum accuracy within 150 epochs. To measure each classification algorithm's accuracy, we compute its confusion matrix. The LSTM confusion matrix, shown in Fig. 11, gives true-prediction rates of 90% for handwashing, 100% for walking, 94% for handshaking, 84% for nose-eyes touching, 86% for hand sanitizing, 93% for sitting, 91% for standing, and 80% for drinking water. The LSTM confusion matrix also displays the precision and recall values obtained for each class.
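The per-class precision and recall displayed alongside the confusion matrices can be recovered from the matrix counts alone; a minimal sketch (the helper name is ours, not the paper's):

```python
def precision_recall(confusion):
    """Per-class precision and recall from a square confusion matrix,
    where confusion[i][j] counts samples of true class i predicted as j."""
    n = len(confusion)
    precision, recall = [], []
    for c in range(n):
        tp = confusion[c][c]                                # true positives
        predicted_c = sum(confusion[r][c] for r in range(n))  # column sum
        actual_c = sum(confusion[c])                          # row sum
        precision.append(tp / predicted_c if predicted_c else 0.0)
        recall.append(tp / actual_c if actual_c else 0.0)
    return precision, recall
```

For the eight-activity task, `confusion` would be the 8x8 matrix of window counts produced on the held-out test split.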
Fig. 10

Long Short-Term Memory (LSTM) training loss, training accuracy, validation loss and validation accuracy.

Fig. 11

Long Short-Term Memory (LSTM) Confusion Matrix.

We set the parameters of the Random Forest classifier: n-estimators (the number of trees in the forest), max-depth (the maximum depth of the trees), and criterion (the function that measures the quality of a split). For the Random Forest classifier, we use n-estimators = 100, max-depth = 100, and criterion = 'entropy' (entropy measures the information gain). Fig. 12 shows the Random Forest confusion matrix with each class's accuracy, precision, and recall; it shows the highest accuracy for each activity.
Fig. 12

Random Forest Confusion Matrix (90% training, 10% testing).

For the Decision Tree classifier, we first set the parameters max-depth and criterion: we use max-depth = 200 and criterion = 'entropy' (entropy measures the information gain). Fig. 13 shows the Decision Tree confusion matrix with each class's accuracy, precision, and recall.
Fig. 13

Decision Tree Confusion Matrix (90% training, 10% testing).

For the K-Nearest Neighbors (KNN) classifier, we first set n-neighbors (the number of neighbors to use). Fig. 14 shows the KNN confusion matrix with each class's accuracy, precision, and recall.
Fig. 14

K-Nearest Neighbors (KNN) Confusion Matrix (90% training, 10% testing).

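A distance-vote KNN classifier of the kind configured above can be sketched as follows (a simplified illustration with hypothetical names, not the authors' implementation, which would typically rely on a library):

```python
from math import dist  # Euclidean distance (Python 3.8+)

def knn_predict(train_X, train_y, x, n_neighbors=5):
    """Majority vote among the n_neighbors training samples closest to x."""
    nearest = sorted(zip(train_X, train_y), key=lambda p: dist(p[0], x))
    votes = [label for _, label in nearest[:n_neighbors]]
    return max(set(votes), key=votes.count)

# Toy feature vectors: two well-separated activity clusters.
X = [(0.0, 0.0), (0.1, 0.2), (0.2, 0.1), (5.0, 5.0), (5.1, 4.9), (4.9, 5.2)]
y = ['sitting', 'sitting', 'sitting', 'walking', 'walking', 'walking']
```

The only tunable parameter here is n-neighbors, matching the description above.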
For the Support Vector Machine (SVM) classifier, we first specify the kernel type used by the SVM algorithm; we set the kernel to 'linear'. The SVM confusion matrix, with each class's accuracy, precision, and recall, is shown in Fig. 15.
Fig. 15

Support Vector Machine (SVM) Confusion Matrix (90% training, 10% testing).

We tested each model by splitting the dataset into training and testing sets using different combinations: 90% training and 10% testing, 80% training and 20% testing, and 70% training and 30% testing. Each classification algorithm was applied to each combination separately, and we obtained comparable accuracy (greater than 90%) with all combinations of training and testing data. The results for all split combinations are shown in Table 6, including accuracy, mean absolute error (MAE), coefficient of determination, mean square error, and root mean square error. The accuracies of the classification algorithms are satisfactory: 92.90% for LSTM, 97.33% for the Random Forest classifier, 90.88% for the Decision Tree, 96.00% for K-Nearest Neighbors (KNN), and 94.66% for the Support Vector Machine.
Table 6

Performance summary of each algorithm for the KAU-COVID19-AR-Dataset (TR stands for Training & TE stands for Testing rate).

Sr. No  Model                            Accuracy (%)           Mean absolute error    Coefficient of determination
        (TR/TE split)                    70/30  80/20  90/10    70/30  80/20  90/10    70/30  80/20  90/10
1       Long Short-Term Memory (LSTM)    91.25  91.33  92.90    0.031  0.030  0.028    0.680  0.678  0.699
2       Random forest classifier         91.55  95.00  97.33    0.229  0.123  0.080    0.735  0.905  0.927
3       Decision tree                    89.33  90.66  92.66    0.207  0.160  0.204    0.860  0.913  0.826
4       K-Nearest Neighbors (KNN)        93.33  94.22  96.00    0.163  0.149  0.073    0.829  0.842  0.952
5       Support Vector Machine (SVM)     93.33  94.00  94.66    0.162  0.153  0.127    0.842  0.843  0.866
6       Random Forest Regression         86.00  84.00  82.66    0.378  0.400  0.387    0.660  0.659  0.739
7       Decision Tree Regression         88.88  87.00  88.00    0.267  0.307  0.287    0.711  0.727  0.760
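The metrics reported in Table 6 have standard definitions, which the following sketch (ours, for illustration only) computes from true and predicted values:

```python
from math import sqrt

def accuracy(y_true, y_pred):
    """Fraction of predictions that exactly match the true labels."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def mae(y_true, y_pred):
    """Mean absolute error."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def mse(y_true, y_pred):
    """Mean square error."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    """Root mean square error."""
    return sqrt(mse(y_true, y_pred))

def r2(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot
```

Accuracy applies to the class predictions, while MAE, MSE, RMSE, and the coefficient of determination treat the encoded labels (or regression outputs) as numeric values.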
In this paper, we developed a system for recognizing COVID-19-related physical activities using mobile sensors. Evaluation of the results showed that most of the activities were recognized correctly, with the eight activities averaging an accuracy of 91.77%; the Random Forest classifier performed more accurately than the other algorithms, with an accuracy of 97.33%.

Conclusion and future work

In this paper, we proposed a mobile sensors-based platform for recognizing human physical activities recommended by WHO and helpful in minimizing COVID-19 spread. We developed a mobile app using the Flutter development platform for acquiring data from various mobile sensors for COVID-19-related physical activity recognition and contact tracing. The app's main functionalities include user registration, user login, and backup functionalities such as sensor data collection, storing data locally in a CSV file, sending data to a server, and contact tracing. We provide a novel dataset, KAU-COVID19-AR-Dataset, of accelerometer, gyroscope, GPS, and speed sensor readings collected while performing specific activities relevant to minimizing COVID-19 spread. We applied and compared various classifiers (LSTM, Random Forest, Decision Tree, KNN) and two regression algorithms to validate the dataset. All classifiers attain an accuracy greater than 90% for all data splits and cross-fold validation, and Random Forest achieved the highest accuracy of 97.33%. On top of data collection and activity recognition, the proposed system provides contact tracing, for example when two persons sit together. The first step is detecting specific activities with the help of the activity recognition and classification algorithms, and then checking the GPS location to determine whether the activity is an individual or a group activity. We detect the proximity of another person to a COVID-19 suspect from GPS sensor data, considering their positions. If the distance is less than two meters, we calculate how long the two persons stayed within that distance and check which activity or activities they performed, to estimate the possibility of infection.
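The two-meter proximity check described above can be approximated from GPS fixes with the haversine great-circle distance. The sketch below is our simplified illustration (hypothetical helper names; it assumes both tracks are time-aligned and sampled at a fixed period), not the system's actual implementation:

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def contact_seconds(track_a, track_b, threshold_m=2.0, sample_period_s=1.0):
    """Approximate time two time-aligned (lat, lon) tracks spend within
    threshold_m of each other, assuming one fix per sample_period_s."""
    close = sum(
        1 for (la1, lo1), (la2, lo2) in zip(track_a, track_b)
        if haversine_m(la1, lo1, la2, lo2) < threshold_m
    )
    return close * sample_period_s
```

In practice the accumulated contact time, combined with the recognized activity (e.g., sitting together versus walking past), would feed the infection-possibility estimate.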
Since GPS has low precision and accuracy and also suffers from availability issues in indoor environments, in future work we will focus on fusing other received-signal-strength-based proximity sensors, such as Bluetooth and WiFi, to improve the accuracy of contact tracing.

Declaration of Competing Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
Algorithm 1: Long Short-Term Memory (LSTM)

Input: number of classes (n-classes = 8); number of features (n-features = 7: accelerometer x-, y-, z-axis; gyroscope x-, y-, z-axis; speed); number of time steps (n-time-steps = 25, i.e., 2.5 s)
Output: Prediction of each class (human physical activities)

Procedure:
Set ip units, lstm units, op units and optimizer to define LSTM Network (L)
Building Model Architecture: Model: ”Sequential”
Normalize the dataset (Di) into values from 0 to 1
Select training window size (tw) and organize Di accordingly
for n epochs and batch size do
Train the Network (L)
end for
Run Predictions using L
Calculate the loss function
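The input shape in Algorithm 1 (25 time steps of 7 features) implies segmenting the raw sensor stream into fixed-length windows before training. The sketch below is our illustration of such segmentation; the non-overlapping stride and majority-vote labeling are assumptions, not details from the paper:

```python
def make_windows(rows, labels, n_steps=25, step=25):
    """Segment raw sensor rows into fixed-length LSTM input windows.

    rows:   list of per-sample feature vectors, e.g. 7 values
            (accelerometer x/y/z, gyroscope x/y/z, speed).
    labels: per-sample activity label, same length as rows.
    Returns (windows, window_labels); each window holds n_steps rows and
    is labeled with the majority label of the samples inside it.
    """
    windows, window_labels = [], []
    for start in range(0, len(rows) - n_steps + 1, step):
        chunk = rows[start:start + n_steps]
        lab = labels[start:start + n_steps]
        majority = max(set(lab), key=lab.count)
        windows.append(chunk)
        window_labels.append(majority)
    return windows, window_labels
```

Each returned window then has the shape (n-time-steps, n-features) expected by the LSTM's input layer.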