| Literature DB >> 29568309 |
Cheng Xu1,2, Jie He1,2, Xiaotong Zhang1,2, Haipiao Cai1,2, Shihong Duan1,2, Po-Hsuan Tseng3, Chong Li4.
Abstract
Motion related human activity recognition using wearable sensors can potentially enable various useful daily applications. So far, most studies view it as a stand-alone mathematical classification problem without considering the physical nature and temporal information of human motions. Consequently, they suffer from data dependencies and encounter the curse of dimension and the overfitting issue. Their models are hard to be intuitively understood. Given a specific motion set, if structured domain knowledge could be manually obtained, it could be used for better recognizing certain motions. In this study, we start from a deep analysis on natural physical properties and temporal recurrent transformation possibilities of human motions and then propose a useful Recurrent Transformation Prior Knowledge-based Decision Tree (RT-PKDT) model for recognition of specific human motions. RT-PKDT utilizes temporal information and hierarchical classification method, making the most of sensor streaming data and human knowledge to compensate the possible data inadequacy. The experiment results indicate that the proposed method performs superior to those adopted in related works, such as SVM, BP neural networks, and Bayesian Network, obtaining an accuracy of 96.68%.Entities:
Mesh:
Year: 2018 PMID: 29568309 PMCID: PMC5820668 DOI: 10.1155/2018/4160652
Source DB: PubMed Journal: Comput Intell Neurosci
Figure 1The conceptual motion model. Each motion can be viewed as a combination of five attributes: intensity, orientation, velocity, body-position, and duration.
Figure 2Boxplot of four features corresponded, respectively, to the attributes demonstrated in motion model. Typical features corresponded, respectively, to the attributes demonstrated in motion model and are calculated based on collected dataset. (a) is based on mean value of acceleration; (b) is based on the difference of pressure measurement value in a given time window; (c) is based on the variance of the acceleration; (d) is based on the pressure difference between Ankle and Shoulder.
Figure 3Prior Knowledge-based Decision Tree: A typical classification method according to commonly human sense.
Figure 4Conceptual transformation relationship between human motions listed in activity.
Possible Transitions between time t − 1 and time t. The first-order transition matrix is denoted as Trans1(t − 1, t).
| Trans1( | St[ | Ly[ | Eu[ | Ed[ | Up[ | Do[ | Wa[ | Ru[ | Tu[ |
|---|---|---|---|---|---|---|---|---|---|
| St[ | 1 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 1 |
| Ly[ | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
| Eu[ | 1 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 |
| Ed[ | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 |
| Up[ | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 |
| Do[ | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 |
| Wa[ | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 |
| Ru[ | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 |
| Tu[ | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
Figure 5Second-order transition schematic diagram. The possible transferability between motions with the constraints is demonstrated. (a) demonstrates the possible transitions among activities Standing, TurningSTL, and Lying. (b) demonstrates the possible transitions among activities Walking and Running. (c) demonstrates the possible transitions among activities Walking, Upstairs, and Downstairs. (d) demonstrates the possible transitions among activities ElevatorUp, Elevatordown, Standing, and Walking.
Figure 6RT-PKDT. At certain time t, the classification process is proceeded by PKDT method.
Algorithm 1Divergence-based Feature Selection Algorithm.
Figure 7Experimental Platform Settings. Each sensor unit is mounted onto body locations tagged by red circles. MCU and storage unit is located in place marked with blue box.
Classification accuracy (%).
| On collected data set | On public data set [ | |||||||
|---|---|---|---|---|---|---|---|---|
| SVM | BP | BayesianNet | RT-PKDT | SVM | BP | BayesianNet | RT-PKDT | |
| Standing | 97.71 | 96.90 | 94.67 |
| 92.76 | 95.55 | 89.35 |
|
| Lying |
| 99.88 | 98.56 |
|
| 99.88 | 98.55 | 99.63 |
| ElevatorUp | 92.37 | 99.15 |
| 94.49 | 90.33 | 93.55 |
| 96.21 |
| ClevatorDown | 88.44 | 83.56 |
| 93.78 | 90.98 | 91.12 | 92.88 |
|
| Upstairs | 94.12 | 18.82 | 81.18 |
| 93.32 | 89.56 | 84.88 |
|
| Downstairs | 83.1 | 69.01 | 83.10 |
| 89.55 | 78.43 | 88.21 |
|
| Walking |
| 90.14 | 96.14 | 91.87 |
| 92.98 | 91.33 | 93.22 |
| Running | 99.16 | 48.74 | 100 |
| 98.84 | 82.35 | 98.32 |
|
| Turning-St-Ly | 84.00 | 58.67 | 84.00 |
| 88.76 | 75.35 | 86.35 |
|
|
| ||||||||
| Average accuracy | 92.67 | 73.87 | 92.73 |
| 93.40 | 88.75 | 91.90 |
|
Comparisons with methods in other literatures.
| Method | Candidate motions | Sensors type | Sensors location | Accuracy |
|---|---|---|---|---|
| Decision tree [ | 25 actions, Stand-Sit, Sit-Lie, etc. | Accelerometer, gyroscope | 9, wrist, arm, ankle, etc. | 93.3% |
|
| 25 actions, Stand-Sit, Sit-Lie, etc. | Accelerometer, gyroscope | 8, waist, left-forearm, etc. | 92.2% |
| Neural Networks [ | 12 actions, Standing, Lying, etc. | Accelerometer | 5, left forearm, trunk, etc. | 89.2% |
| SVM [ | 8 actions, running, upstairs, etc. | Accelerometer, gyroscope, Magnetometer, barometer sensor | 1, hand | 88.6% |
| Bayesian Network [ | 7 actions, running, walking, etc. | Accelerometer, gyroscope, Magnetometer | 1, belt | 90% |
| proposed RT-PKDT | 8 actions, listed in | Accelerometer, gyroscope, barometer sensor | 5 body-positions, listed in |
|