Gabriela Cajamarca, Iyubanit Rodríguez, Valeria Herskovic, Mauricio Campos, Juan Carlos Riofrío.
Abstract
Monitoring the posture of older persons with portable sensors while they carry out daily activities can help generate indicators with which to evaluate their health and quality of life. Most current research into such sensors focuses primarily on functionality and accuracy, with minimal effort dedicated to understanding the experience of the older persons who interact with the devices. This study proposes a wearable device to identify the bodily postures of older persons, while also examining users' perceptions of it. For the purposes of this study, thirty independent and semi-independent older persons undertook eight different types of physical activity: walking, raising arms, lowering arms, leaning forward, sitting, sitting upright, transitioning from standing to sitting, and transitioning from sitting to standing. The data was classified offline, achieving an accuracy of 93.5%, and overall user perception of the device was positive. Participants rated both the usability of the device and their overall user experience highly.
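The record does not describe the classification pipeline itself. A minimal sketch of a common offline approach, assuming fixed-length windows over three-axis accelerometer samples reduced to per-axis mean and standard deviation before classification (the window size and features here are illustrative, not taken from the paper), might look like:

```python
import math

def window_features(samples, window=50):
    """Split a stream of (x, y, z) accelerometer samples into fixed-length
    windows and reduce each window to per-axis mean and standard deviation.
    The resulting feature vectors would then feed an offline classifier."""
    features = []
    for start in range(0, len(samples) - window + 1, window):
        chunk = samples[start:start + window]
        vec = []
        for axis in range(3):
            values = [s[axis] for s in chunk]
            mean = sum(values) / window
            var = sum((v - mean) ** 2 for v in values) / window
            vec.extend([mean, math.sqrt(var)])  # mean and std per axis
        features.append(vec)
    return features
```

For a constant signal, each window reduces to the axis means with zero deviation, e.g. `window_features([(1.0, 2.0, 3.0)] * 100)` yields two identical six-element vectors.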
Keywords: older users; user experience; wearable
Year: 2018 PMID: 30314352 PMCID: PMC6210183 DOI: 10.3390/s18103409
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Figure 1. StraightenUp design: (a) phase I; (b) phase II.
Figure 2. General architecture of the StraightenUp system.
Figure 3. All eight classified postures.
Figure 4. Distribution of x-axis data for each sensor during torso monitoring while participants carry out specific activities.
Confusion matrix for activities (rows: actual activity; columns: predicted activity).
| Wlk | Tr1 | Lng | Sit | Rsn | Lwr | StU | Tr2 | TP Rate % | Precision % | |
|---|---|---|---|---|---|---|---|---|---|---|
| Wlk | 3158 | 6 | 25 | 6 | 4 | 3 | 4 | 3 | 98.4% | 93.1% |
| Tr1 | 43 | 289 | 2 | 41 | 1 | 0 | 2 | 9 | 74.7% | 85.8% |
| Lng | 60 | 2 | 772 | 0 | 10 | 5 | 3 | 0 | 90.6% | 93.1% |
| Sit | 34 | 12 | 5 | 1213 | 2 | 2 | 1 | 4 | 95.3% | 94.0% |
| Rsn | 12 | 3 | 14 | 1 | 709 | 9 | 16 | 0 | 92.8% | 95.1% |
| Lwr | 18 | 1 | 4 | 5 | 7 | 443 | 4 | 1 | 91.7% | 96.4% |
| StU | 18 | 1 | 5 | 2 | 9 | 2 | 882 | 0 | 96.0% | 90.3% |
| Tr2 | 49 | 23 | 2 | 23 | 5 | 2 | 3 | 159 | 59.8% | 93.5% |
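The summary columns of the matrix follow directly from the counts themselves. A small pure-Python sketch, taking rows as actual activities and columns as predicted ones, recovers the per-class TP rate (recall) and the overall 93.5% accuracy reported in the abstract:

```python
# Confusion matrix from the table above; rows are actual activities,
# columns are predicted activities, in the order
# Wlk, Tr1, Lng, Sit, Rsn, Lwr, StU, Tr2.
MATRIX = [
    [3158,   6,  25,    6,   4,   3,   4,   3],
    [  43, 289,   2,   41,   1,   0,   2,   9],
    [  60,   2, 772,    0,  10,   5,   3,   0],
    [  34,  12,   5, 1213,   2,   2,   1,   4],
    [  12,   3,  14,    1, 709,   9,  16,   0],
    [  18,   1,   4,    5,   7, 443,   4,   1],
    [  18,   1,   5,    2,   9,   2, 882,   0],
    [  49,  23,   2,   23,   5,   2,   3, 159],
]

def metrics(matrix):
    """Return (per-class TP rate, per-class precision, overall accuracy)."""
    n = len(matrix)
    row_sums = [sum(row) for row in matrix]                       # actual counts
    col_sums = [sum(matrix[r][c] for r in range(n)) for c in range(n)]  # predicted counts
    tp_rate = [matrix[i][i] / row_sums[i] for i in range(n)]
    precision = [matrix[i][i] / col_sums[i] for i in range(n)]
    accuracy = sum(matrix[i][i] for i in range(n)) / sum(row_sums)
    return tp_rate, precision, accuracy
```

For example, `metrics(MATRIX)` gives a walking TP rate of 3158/3209 ≈ 98.4% and an overall accuracy of 7625/8153 ≈ 93.5%, matching the table and the abstract.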
Accuracy according to the number of sensors used (X means that the corresponding sensor was included in the classification).
| S1 | S2 | S3 | Correctly Classified Instances % |
|---|---|---|---|
| X | - | - | 73.7 |
| - | X | - | 76.6 |
| - | - | X | 76.8 |
| X | X | - | 90.1 |
| - | X | X | 90.1 |
| X | - | X | 90.4 |
| X | X | X | 93.5 |
Figure 5. Average values for pragmatic quality (PQ), hedonic quality identification (HQ-I), hedonic quality stimulation (HQ-S), and attractiveness (ATT).