| Literature DB >> 26690450 |
Ferhat Attal, Samer Mohammed, Mariam Dedabrishvili, Faicel Chamroukhi, Latifa Oukhellou, Yacine Amirat.
Abstract
This paper presents a review of different classification techniques used to recognize human activities from wearable inertial sensor data. Three inertial sensor units, worn by healthy subjects at key points of the upper/lower body limbs (chest, right thigh and left ankle), were used in this study. Three main steps describe the activity recognition process: sensor placement, data pre-processing and data classification. Four supervised classification techniques, namely k-Nearest Neighbor (k-NN), Support Vector Machines (SVM), Gaussian Mixture Models (GMM) and Random Forest (RF), as well as three unsupervised classification techniques, namely k-Means, Gaussian Mixture Models (GMM) and Hidden Markov Models (HMM), are compared in terms of correct classification rate, F-measure, recall, precision and specificity. Raw data and extracted features are used separately as inputs to each classifier. Feature selection is performed using a wrapper approach based on the RF algorithm. The results show that the k-NN classifier provides the best performance among the supervised classification algorithms, whereas the HMM classifier gives the best results among the unsupervised ones. It should be noted that these results are limited to the context of this study: the classification of the main daily living human activities using three wearable accelerometers placed at the chest, right thigh and left ankle of the subject.
Keywords: accelerometers; activity recognition; data classifiers; physical activities; smart spaces; wearable sensors
Year: 2015 PMID: 26690450 PMCID: PMC4721778 DOI: 10.3390/s151229858
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
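All of the metrics reported in the comparison tables below (accuracy, F-measure, recall, precision, specificity) can be derived per class from a confusion matrix. A minimal sketch; the 3×3 matrix is a made-up toy example, not data from the study:

```python
# Per-class evaluation metrics from a confusion matrix, as used in the
# paper's comparison tables. cm[i][j] = number of samples of true class i
# predicted as class j. The matrix below is illustrative only.
def class_metrics(cm, k):
    """Precision, recall, F-measure and specificity for class index k."""
    n = len(cm)
    total = sum(sum(row) for row in cm)
    tp = cm[k][k]
    fn = sum(cm[k][j] for j in range(n) if j != k)   # missed samples of class k
    fp = sum(cm[i][k] for i in range(n) if i != k)   # samples wrongly labeled k
    tn = total - tp - fn - fp
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f_measure = (2 * precision * recall / (precision + recall)
                 if precision + recall else 0.0)
    specificity = tn / (tn + fp) if tn + fp else 0.0
    return precision, recall, f_measure, specificity

cm = [[50, 2, 3],
      [4, 45, 1],
      [1, 0, 44]]
p, r, f, s = class_metrics(cm, 0)
print(round(p, 3), round(r, 3), round(f, 3), round(s, 3))  # → 0.909 0.909 0.909 0.947
```

Averaging these per-class values over all classes yields the single numbers reported in the tables.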
Figure 1. Remote health monitoring architecture based on wearable sensors.
Figure 2. Graphical illustration of wearable sensor placement.
Review of studies on accelerometer placement for human activity recognition.
| Reference | Placement of Accelerometers | Detected Activities | Average Classification Accuracy (%) |
|---|---|---|---|
| Karantonis | Waist | Walking, Falling | 90.8 |
| Mathie, 2004 | Waist | Falling, Walking, Sitting, Standing, Lying | 98.9 |
| Yang | Wrist | Walking, Running, Scrubbing, Standing, Working at a PC, Vacuuming, Brushing Teeth, Sitting | 95 |
| Pirttikangas, 2006 | Thigh, Necklace, Wrists | Typing, Watching TV, Drinking, Stairs Ascent and Descent | 91.5 |
| Parkka, 2006 | Wrist, Chest | Lying, Sitting, Walking, Rowing and Cycling | 83.3 |
| Olguín, 2006 | Wrist, Chest, Hip | Sitting, Running, Walking, Standing, Lying, Crawling | 92.13 |
| Bonomi, 2009 | Lower Back | Lying, Sitting, Standing, Working on a Computer, Walking, Running, Cycling | 93 |
| Yeoh, 2008 | Thigh, Waist | Sitting, Lying, Standing and Walking Speed | 100 |
| Lyons, 2005 | Thigh, Trunk | Sitting, Standing, Lying, Moving | 92.25 |
| Salarian | Trunk, Shanks (IMU sensor) | 14 daily living activities | - |
| Gjoreski, 2011 | Thigh, Waist, Chest, Ankle | Lying, Sitting, Standing, All Fours, Transitional | 91 |
| Chamroukhi, 2013 | Chest, Thigh, Ankle | Stairs Ascent and Descent, Walking, Sitting, Standing Up, Sitting on the Ground | 90.3 |
| Bayat | Pocket, Hand | Slow Walking, Fast Walking, Running, Stairs-Up, Stairs-Down, Dancing | 91.15 |
| Moncada-Torres, 2014 | Chest, Thigh, Ankle | 16 daily living activities | 89.08 |
| Gupta | Waist | Walking, Jumping, Running, Sit-to-Stand/Stand-to-Sit, Stand-to-Kneel-to-Stand, Stationary | 98 |
| Garcia-Ceja | Wrist | Long-term activities (Shopping, Showering, Dinner, Working, Commuting, Brushing Teeth) | 98 |
| Gao | Chest, Waist, Thigh, Side | Standing, Sitting, Lying, Walking, Transitions | 96.4 |
| Massé | Trunk (IMU and barometric pressure sensor) | Sitting, Standing, Walking, Lying | 90.4 |
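The pre-processing step in such studies typically segments each accelerometer axis into fixed-length sliding windows and computes time-domain features per window. A dependency-free sketch, assuming a window of 8 samples with 50% overlap (hypothetical parameters, not necessarily the paper's):

```python
# Sliding-window feature extraction over one accelerometer axis.
# The (mean, std) pair is a crude stand-in for the richer time-domain
# feature set a study like this one would extract.
import math

def window_features(signal, win=8, step=4):
    """Return a (mean, std) tuple per window of length `win`, advancing
    by `step` samples (step = win/2 gives 50% overlap)."""
    feats = []
    for start in range(0, len(signal) - win + 1, step):
        w = signal[start:start + win]
        mean = sum(w) / win
        var = sum((x - mean) ** 2 for x in w) / win
        feats.append((mean, math.sqrt(var)))
    return feats

# Hypothetical vertical-axis samples: rest, then a burst of motion.
accel_z = [0.1, 0.2, 0.1, 0.3, 1.2, 1.1, 1.3, 1.2, 0.2, 0.1, 0.2, 0.3]
print(window_features(accel_z))
```

Each resulting feature vector (one per window) becomes a single classifier input, in place of the raw samples.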
Figure 3. Steps of human activity recognition.
Figure 4. MTx-Xbus inertial tracker and sensors placement.
List of the selected activities (A1…A12).
| Activity Reference | Description of Activity |
|---|---|
| A1 | Stair descent |
| A2 | Standing |
| A3 | Sitting down |
| A4 | Sitting |
| A5 | From sitting to sitting on the ground |
| A6 | Sitting on the ground |
| A7 | Lying down |
| A8 | Lying |
| A9 | From lying to sitting on the ground |
| A10 | Standing up |
| A11 | Walking |
| A12 | Stair ascent |
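Since k-NN was the strongest supervised classifier in this study, a toy version over 2-D window features illustrates the idea. The `knn_predict` helper, the training points and the labels below are illustrative, not the paper's code or data:

```python
# Toy k-NN classifier over 2-D feature vectors, e.g. the (mean, std)
# of an accelerometer window. Majority vote among the k nearest
# neighbours by Euclidean distance.
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """train: list of ((features...), label) pairs."""
    dists = sorted((math.dist(x, query), label) for x, label in train)
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Hypothetical labeled windows: low mean/std while sitting,
# high mean/std while walking.
train = [((0.10, 0.05), "A4 sitting"), ((0.15, 0.04), "A4 sitting"),
         ((1.00, 0.40), "A11 walking"), ((1.10, 0.45), "A11 walking"),
         ((1.05, 0.42), "A11 walking")]
print(knn_predict(train, (0.95, 0.38)))  # → A11 walking
```

k-NN needs no training phase beyond storing the labeled windows, which is one reason it is a common baseline in activity recognition.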
Figure 5. Representation of the number of samples in each class for each sequence.
Performance of the supervised algorithms using raw data.
| Classifier | Accuracy ± std (%) | F-measure (%) | Recall (%) | Precision (%) | Specificity (%) |
|---|---|---|---|---|---|
| k-NN | 96.53 ± 0.20 | 94.60 | 94.57 | 94.62 | 99.67 |
| RF | 94.89 ± 0.57 | 82.87 | 82.28 | 83.46 | 99.43 |
| SVM | 94.22 ± 0.28 | 90.66 | 90.98 | 90.33 | 99.56 |
| SLGMM | 84.54 ± 0.30 | 69.94 | 69.99 | 69.88 | 98.39 |
Performance of the unsupervised algorithms using raw data.
| Classifier | Accuracy ± std (%) | F-measure (%) | Recall (%) | Precision (%) | Specificity (%) |
|---|---|---|---|---|---|
| HMM | | | | | |
| K-means | 68.42 ± 5.05 | 49.89 | 48.67 | 48.55 | |
| GMM | 73.60 ± 2.32 | 57.68 | 57.54 | 58.82 | |
Global confusion matrix obtained with k-NN using raw data.
| Obtained | A1 | A2 | A3 | A4 | A5 | A6 | A7 | A8 | A9 | A10 | A11 | A12 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| A1 | 88.98 | 0.41 | 0.04 | 0 | 0.04 | 0 | 0 | 0 | 0 | 0.78 | 4.34 | 5.41 |
| A2 | 0.40 | 98.52 | 0.08 | 0 | 0 | 0 | 0 | 0 | 0 | 0.21 | 0.56 | 0.23 |
| A3 | 0.21 | 0.64 | 95.73 | 0.53 | 0.64 | 0 | 0 | 0 | 0 | 0.96 | 0.85 | 0.43 |
| A4 | 0 | 0 | 0.77 | 98.92 | 0.31 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| A5 | 0.08 | 0 | 0.55 | 0.16 | 97.98 | 0.47 | 0.08 | 0 | 0.16 | 0.55 | 0 | 0 |
| A6 | 0 | 0 | 0 | 0 | 0.22 | 99.41 | 0.03 | 0 | 0.25 | 0.08 | 0 | 0 |
| A7 | 0 | 0 | 0 | 0 | 0.22 | 0.15 | 95.71 | 1.53 | 2.33 | 0.07 | 0 | 0 |
| A8 | 0 | 0 | 0 | 0 | 0 | 0 | 1.58 | 97.62 | 0.80 | 0 | 0 | 0 |
| A9 | 0 | 0 | 0 | 0 | 0.25 | 0.34 | 3.96 | 0.67 | 94.44 | 0.34 | 0 | 0 |
| A10 | 1.58 | 0.46 | 0.19 | 0 | 0.65 | 0.28 | 0 | 0 | 0.19 | 94.07 | 0.93 | 1.67 |
| A11 | 4.07 | 0.41 | 0.03 | 0 | 0 | 0 | 0 | 0 | 0 | 0.55 | 92.57 | 2.37 |
| A12 | 5.05 | 0.43 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1.03 | 3.08 | 90.42 |
Global confusion matrix obtained with HMM using raw data.
| Obtained | A1 | A2 | A3 | A4 | A5 | A6 | A7 | A8 | A9 | A10 | A11 | A12 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| A1 | 55.33 | 1.70 | 1.08 | 0 | 0.62 | 0 | 0 | 0 | 0 | 3.19 | 23.52 | 14.57 |
| A2 | 2.83 | 86.22 | 0.47 | 0 | 0 | 0 | 0 | 0 | 0 | 1.50 | 6.97 | 2.01 |
| A3 | 0.12 | 0 | 39.86 | 32.82 | 12.53 | 0 | 0 | 0 | 0 | 10.62 | 0.24 | 3.82 |
| A4 | 0.10 | 0 | 9.58 | 87.21 | 3.11 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| A5 | 0.67 | 0 | 7.20 | 0.29 | 73.61 | 0.10 | 1.06 | 0 | 1.44 | 15.55 | 0 | 0.10 |
| A6 | 0 | 0 | 0 | 0 | 3.15 | 91.63 | 0.88 | 0 | 2.18 | 2.16 | 0 | 0 |
| A7 | 0 | 0 | 0 | 0 | 2.24 | 0.50 | 29.74 | 35.33 | 27.95 | 4.25 | 0 | 0 |
| A8 | 0 | 0 | 0 | 0 | 0 | 0 | 13.14 | 81.38 | 5.48 | 0 | 0 | |
| A9 | 0 | 0 | 0 | 0 | 2.13 | 0 | 37.03 | 16.70 | 33.75 | 10.39 | 0 | 0 |
| A10 | 0 | 0 | 0 | 0 | 9.20 | 0 | 0 | 0 | 1.15 | 89.66 | 0 | 0 |
| A11 | 19.59 | 1.38 | 2.53 | 0 | 0 | 0 | 0 | 0 | 0 | 2.38 | 56.95 | 17.17 |
| A12 | 16.65 | 0 | 3.72 | 0 | 2.44 | 0 | 0 | 0 | 0 | 5.75 | 11.10 | 60.34 |
Figure 6. Steps of the activity recognition process using feature extraction and selection.
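A wrapper approach to feature selection evaluates candidate feature subsets by the accuracy of a classifier trained on them. The paper wraps a Random Forest; to keep this sketch dependency-free, the scorer below is leave-one-out 1-NN accuracy, so it only illustrates the wrapper loop (greedy forward selection), not the paper's exact method:

```python
# Greedy forward feature selection with a wrapper scorer.
# Scorer: leave-one-out 1-NN accuracy restricted to the chosen columns
# (a stand-in for the paper's RF-based wrapper).
def loo_1nn_accuracy(X, y, cols):
    correct = 0
    for i, xi in enumerate(X):
        best, best_d = None, float("inf")
        for j, xj in enumerate(X):
            if i == j:
                continue
            d = sum((xi[c] - xj[c]) ** 2 for c in cols)
            if d < best_d:
                best_d, best = d, y[j]
        correct += best == y[i]
    return correct / len(X)

def forward_select(X, y, n_feats):
    """Repeatedly add the feature that most improves the wrapper score."""
    selected, remaining = [], list(range(len(X[0])))
    while remaining and len(selected) < n_feats:
        best_c = max(remaining,
                     key=lambda c: loo_1nn_accuracy(X, y, selected + [c]))
        selected.append(best_c)
        remaining.remove(best_c)
    return selected

# Toy data: feature 0 separates the classes; feature 1 is noise.
X = [(0.0, 5.0), (0.1, 1.0), (1.0, 4.9), (1.1, 1.2), (0.05, 3.0), (1.05, 3.1)]
y = ["sit", "sit", "walk", "walk", "sit", "walk"]
print(forward_select(X, y, 1))  # → [0]
```

The wrapper correctly keeps the discriminative feature and discards the noisy one; with an RF scorer the loop is identical, only the inner classifier changes.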
Performance of the supervised algorithms using extracted features.
| Classifier | Accuracy ± std (%) | F-measure (%) | Recall (%) | Precision (%) | Specificity (%) |
|---|---|---|---|---|---|
| k-NN | | | | | |
| RF | 98.95 ± 0.09 | 98.27 | 98.24 | 98.25 | 99.90 |
| SVM | 95.55 ± 0.30 | 93.02 | 93.15 | 92.90 | 99.92 |
| SLGMM | 85.05 ± 0.57 | 73.44 | 74.44 | 73.61 | 99.88 |
Performance of the unsupervised algorithms using extracted features.
| Classifier | Accuracy ± std (%) | F-measure (%) | Recall (%) | Precision (%) | Specificity (%) |
|---|---|---|---|---|---|
| HMM | | | | | |
| K-means | 72.95 ± 2.80 | 50.29 | 52.20 | 51.22 | 97.04 |
| GMM | 75.60 ± 1.25 | 65.00 | 66.29 | 64.30 | 97.12 |
Global confusion matrix obtained with k-NN using selected features.
| Obtained | A1 | A2 | A3 | A4 | A5 | A6 | A7 | A8 | A9 | A10 | A11 | A12 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| A1 | 99.00 | 0.32 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.08 | 0.48 | 0.12 |
| A2 | 0.06 | 99.75 | 0.04 | 0 | 0 | 0 | 0 | 0 | 0 | 0.03 | 0.07 | 0.04 |
| A3 | 0 | 0.43 | 99.15 | 0.43 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| A4 | 0 | 0 | 0.11 | 99.79 | 0.11 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| A5 | 0 | 0 | 0 | 0.23 | 99.38 | 0.23 | 0 | 0 | 0.08 | 0.08 | 0 | 0 |
| A6 | 0 | 0 | 0 | 0 | 0.07 | 99.78 | 0.07 | 0.03 | 0.05 | 0 | 0 | |
| A7 | 0 | 0 | 0 | 0 | 0 | 0.21 | 99.65 | 0.14 | 0 | 0 | 0 | 0 |
| A8 | 0 | 0 | 0 | 0 | 0 | 0.15 | 99.79 | 0.06 | 0 | 0 | | |
| A9 | 0 | 0 | 0 | 0 | 0.08 | 0.17 | 0.33 | 99.42 | 0 | 0 | 0 | |
| A10 | 0.35 | 0.18 | 0 | 0 | 0.09 | 0.09 | 0 | 0 | 0 | 99.20 | 0.09 | |
| A11 | 0.22 | 0.17 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 99.34 | 0.28 |
| A12 | 0.08 | 0.17 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.04 | 0.25 | 99.45 |
Global confusion matrix obtained with HMM using selected features.
| Obtained | A1 | A2 | A3 | A4 | A5 | A6 | A7 | A8 | A9 | A10 | A11 | A12 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| A1 | 57.74 | 0.06 | 0.43 | 0 | 0.31 | 0 | 0 | 0 | 0 | 4.07 | 20.17 | 17.21 |
| A2 | 1.36 | 94.66 | 0.31 | 0 | 0 | 0 | 0 | 0 | 0 | 0.89 | 1.98 | 0.80 |
| A3 | 3.82 | 0 | 55.30 | 5.69 | 15.42 | 0 | 0 | 0 | 0 | 1.64 | 4.91 | 13.24 |
| A4 | 0 | 0 | 2.85 | 96.31 | 0.83 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| A5 | 2.05 | 0 | 1.80 | 0.66 | 71.62 | 4.35 | 2.21 | 0 | 5.50 | 11.48 | 0 | 0.33 |
| A6 | 0 | 0 | 0 | 0 | 1.39 | 97.09 | 0.30 | 0 | 0.94 | 0.28 | 0 | 0 |
| A7 | 0 | 0 | 0 | 0 | 1.54 | 0 | 59.91 | 4.25 | 32.30 | 1.99 | 0 | 0 |
| A8 | 0 | 0 | 0 | 0 | 0 | 0 | 3.30 | 94.69 | 2.01 | 0 | 0 | |
| A9 | 0 | 0 | 0 | 0 | 4.02 | 1.75 | 32.68 | 0.10 | 50.41 | 11.03 | 0 | 0 |
| A10 | 13.56 | 0 | 1.51 | 0 | 6.44 | 0 | 1.92 | 0 | 2.19 | 60.68 | 7.12 | 6.58 |
| A11 | 19.87 | 4.45 | 1.50 | 0 | 0 | 0 | 0 | 0 | 0 | 3.45 | 57.02 | 13.73 |
| A12 | 16.37 | 0.17 | 0 | 0 | 0.34 | 0 | 0 | 0 | 0 | 1.90 | 17.26 | 63.97 |