Robert-Andrei Voicu, Ciprian Dobre, Lidia Bajenaru, Radu-Ioan Ciobanu.
Abstract
Because the number of elderly people is predicted to increase quickly in the upcoming years, "aging in place" (living at home regardless of age and other factors) is becoming an important topic in the area of ambient assisted living. In this paper, we therefore propose a human physical activity recognition system based on data collected from smartphone sensors. The proposed approach builds a classifier using three sensors available on a smartphone: the accelerometer, the gyroscope, and the gravity sensor. We chose to implement our solution on mobile phones because they are ubiquitous and do not require the subjects to carry additional sensors that might impede their activities. We target six activities: walking, running, sitting, standing, and ascending and descending stairs. We evaluate the solution against two datasets: an internal one collected by us and an external one. Results show good accuracy for recognizing all six activities, with especially good results obtained for walking, running, sitting, and standing. The system is fully implemented on a mobile device as an Android application.
Keywords: activity recognition; ambient assisted living; machine learning; smartphones
Year: 2019 PMID: 30678039 PMCID: PMC6386882 DOI: 10.3390/s19030458
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
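The abstract describes a classifier fed by windowed readings from three smartphone sensors. A minimal sketch of the feature-extraction step such a pipeline typically needs is shown below; the specific statistics (mean, standard deviation, min, max per axis) and the helper names are illustrative assumptions, not the paper's exact feature set.

```python
import numpy as np

def window_features(window: np.ndarray) -> np.ndarray:
    """Summarize one 10-s window of tri-axial readings from a single sensor.

    window: array of shape (n_samples, 3) -- x, y, z axes.
    Returns a flat vector of per-axis statistics, a common hand-crafted
    representation for activity recognition.
    """
    feats = [window.mean(axis=0), window.std(axis=0),
             window.min(axis=0), window.max(axis=0)]
    return np.concatenate(feats)

def sample_features(acc: np.ndarray, gyro: np.ndarray,
                    grav: np.ndarray) -> np.ndarray:
    """Concatenate the accelerometer, gyroscope, and gravity features
    into one vector handed to the classifier (an MLP in this paper)."""
    return np.concatenate([window_features(s) for s in (acc, gyro, grav)])

# One synthetic 10-s window per sensor, sized to the average reading
# counts reported for the internal dataset
rng = np.random.default_rng(0)
acc = rng.normal(size=(164, 3))   # ~164 accelerometer readings per 10 s
gyro = rng.normal(size=(124, 3))  # ~124 gyroscope readings per 10 s
grav = rng.normal(size=(124, 3))  # ~124 gravity readings per 10 s

x = sample_features(acc, gyro, grav)
print(x.shape)  # (36,) -- 3 sensors x 3 axes x 4 statistics
```

Fixed-size features are what make variable sampling rates manageable: each window collapses to the same vector length regardless of how many raw readings it contains.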
Sensors registered in each dataset.
| Dataset | Sensors |
|---|---|
| Internal dataset | Accelerometer, Gyroscope, Gravity |
| External dataset | Accelerometer, Gravity, Linear Acceleration, Magnetometer |
Figure 1. Main screen of the data collection application, before and after recording data.
Average number of sensor readings per 10-s interval.
| Accelerometer | Gyroscope | Gravity |
|---|---|---|
| 164 | 124 | 124 |
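These per-window counts translate directly into effective sampling rates, since each window spans 10 s (a quick check; the dictionary and variable names are just illustrative):

```python
# Average readings per 10-s window, taken from the table above
readings = {"accelerometer": 164, "gyroscope": 124, "gravity": 124}

WINDOW_S = 10  # window length in seconds

for sensor, count in readings.items():
    rate_hz = count / WINDOW_S
    print(f"{sensor}: ~{rate_hz:.1f} Hz")
# accelerometer: ~16.4 Hz
# gyroscope: ~12.4 Hz
# gravity: ~12.4 Hz
```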
Figure 2. Graphical data gathered from each sensor while walking.
Figure 3. Graphical data gathered from each sensor while running.
Figure 4. Graphical data gathered from each sensor while sitting.
Figure 5. Graphical data gathered from each sensor while standing.
Figure 6. Graphical data gathered from each sensor while climbing stairs.
Figure 7. Graphical data gathered from each sensor while going down the stairs.
Confusion matrix for the internal training set (rows: performed activity; columns: recognized activity).
| Performed Activity | Walking | Running | Sitting | Standing | Upstairs | Downstairs |
|---|---|---|---|---|---|---|
| Walking | 113 | 1 | 0 | 1 | 3 | 2 |
| Running | 2 | 97 | 0 | 1 | 3 | 3 |
| Sitting | 1 | 0 | 99 | 3 | 1 | 1 |
| Standing | 2 | 1 | 2 | 108 | 1 | 1 |
| Upstairs | 8 | 2 | 1 | 4 | 95 | 10 |
| Downstairs | 7 | 2 | 2 | 6 | 12 | 88 |
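Per-activity accuracy can be read off such a confusion matrix as the diagonal count divided by the row sum (i.e., the fraction of each performed activity's windows that were recognized correctly). A generic helper, using a small illustrative matrix rather than the paper's values:

```python
def per_class_accuracy(confusion):
    """Row-wise recall from a confusion matrix: for each performed
    activity, the fraction of its instances recognized correctly."""
    accs = []
    for i, row in enumerate(confusion):
        total = sum(row)
        accs.append(row[i] / total if total else 0.0)
    return accs

# Illustrative 3-class matrix (rows: performed, columns: recognized)
m = [[90, 5, 5],
     [10, 80, 10],
     [0, 25, 75]]
print([round(a, 2) for a in per_class_accuracy(m)])  # [0.9, 0.8, 0.75]
```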
Confusion matrix for the external training set (rows: performed activity; columns: recognized activity).
| Performed Activity | Walking | Running | Sitting | Standing | Upstairs | Downstairs |
|---|---|---|---|---|---|---|
| Walking | 118 | 4 | 0 | 2 | 6 | 3 |
| Running | 5 | 92 | 2 | 3 | 7 | 5 |
| Sitting | 2 | 2 | 96 | 6 | 2 | 1 |
| Standing | 5 | 1 | 4 | 106 | 4 | 2 |
| Upstairs | 12 | 4 | 1 | 5 | 102 | 13 |
| Downstairs | 9 | 3 | 2 | 6 | 18 | 86 |
Activity recognition accuracy over the internal dataset.
| Activity | Walking | Running | Sitting | Standing | Upstairs | Downstairs | Average |
|---|---|---|---|---|---|---|---|
| Accuracy | 93% | 91% | 94% | 93% | 75% | 71% | 86.1% |
Activity recognition accuracy over the external dataset.
| Activity | Walking | Running | Sitting | Standing | Upstairs | Downstairs | Average |
|---|---|---|---|---|---|---|---|
| Accuracy | 85% | 78% | 87% | 84% | 65% | 62% | 76.8% |
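The averages in the two accuracy tables are plain macro-averages over the six per-activity values, which is easy to verify (the internal average comes out as 86.2% from the rounded percentages, suggesting the table's 86.1% was computed before rounding the per-activity values):

```python
internal = [93, 91, 94, 93, 75, 71]  # per-activity accuracy, internal dataset
external = [85, 78, 87, 84, 65, 62]  # per-activity accuracy, external dataset

for name, accs in (("internal", internal), ("external", external)):
    avg = sum(accs) / len(accs)
    print(f"{name}: {avg:.1f}%")
# internal: 86.2%
# external: 76.8%
```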
Scenarios used for comparing activity recognition solutions. The internal implementation refers to the solution proposed here (using an MLP), while the external implementation refers to the solution proposed by Shoaib et al. [14], which used naive Bayes. The external solution used the dataset defined in that paper, whereas we first used our own dataset (Scenario 7) and then the external dataset (Scenario 8).
| Scenario | Description |
|---|---|
| 1 | External implementation, 5-s window, accelerometer at wrist |
| 2 | External implementation, 5-s window, accelerometer and gyroscope at wrist |
| 3 | External implementation, 5-s window, accelerometer and gyroscope at wrist and in pocket |
| 4 | External implementation, 2-s window, accelerometer and gyroscope at wrist |
| 5 | External implementation, 15-s window, accelerometer and gyroscope at wrist |
| 6 | External implementation, 30-s window, accelerometer and gyroscope at wrist |
| 7 | Internal implementation, 10-s window, smartphone sensors |
| 8 | Internal implementation and external dataset, 10-s window, smartphone sensors |
Accuracy comparison between our solution and an external solution [14].
| Scenario | Walking | Running | Sitting | Standing | Upstairs | Downstairs | Overall |
|---|---|---|---|---|---|---|---|
| 1 | 52% | 100% | 93% | 97% | 48% | 74% | 77% |
| 2 | 79% | 100% | 92% | 96% | 74% | 93% | 89% |
| 3 | 85% | 100% | 91% | 92% | 96% | 98% | 94% |
| 4 | 74% | 100% | 94% | 96% | 60% | 81% | 84% |
| 5 | 92% | 100% | 89% | 93% | 93% | 98% | 94% |
| 6 | 100% | 100% | 90% | 93% | 97% | 100% | 97% |
| 7 | 93% | 91% | 94% | 93% | 75% | 71% | 86% |
| 8 | 85% | 78% | 87% | 84% | 65% | 62% | 77% |