Oresti Banos, Claudia Villalonga, Rafael Garcia, Alejandro Saez, Miguel Damas, Juan A Holgado-Terriza, Sungyong Lee, Hector Pomares, Ignacio Rojas.
Abstract
The delivery of healthcare services has changed dramatically in recent years. Mobile health, or mHealth, is a key engine at the forefront of this revolution. Although mobile health applications are being developed at a growing pace, there is a lack of tools specifically devised for their implementation. This work presents mHealthDroid, an open-source Android implementation of an mHealth framework designed to facilitate the rapid and easy development of mHealth and biomedical apps. The framework is particularly planned to leverage the potential of mobile devices such as smartphones and tablets, wearable sensors, and portable biomedical systems, which are increasingly used for the monitoring and delivery of personal healthcare and wellbeing. The framework implements several functionalities to support resource and communication abstraction, biomedical data acquisition, health knowledge extraction, persistent data storage, adaptive visualization, system management, and value-added services such as intelligent alerts, recommendations, and guidelines. An exemplary application is also presented to demonstrate the potential of mHealthDroid. This app is used to investigate the analysis of human behavior, considered one of the most prominent areas in mHealth. An accurate activity recognition model is developed and successfully validated under both offline and online conditions.
Year: 2015 PMID: 26329639 PMCID: PMC4547155 DOI: 10.1186/1475-925X-14-S2-S6
Source DB: PubMed Journal: Biomed Eng Online ISSN: 1475-925X Impact factor: 2.819
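The activity recognition model described in the abstract can be illustrated with a minimal, hypothetical sketch: segmenting a tri-axial acceleration stream into overlapping windows and extracting simple per-window statistics as classifier features. The window length, overlap, and feature set below are illustrative assumptions, not the parameters actually used in the paper.

```python
import numpy as np

def sliding_windows(signal, win, step):
    """Split a (samples x channels) signal into overlapping windows."""
    return np.stack([signal[i:i + win]
                     for i in range(0, len(signal) - win + 1, step)])

def window_features(windows):
    """Simple per-window statistics (mean and standard deviation per
    channel), a common starting point for activity recognition."""
    return np.hstack([windows.mean(axis=1), windows.std(axis=1)])

# Toy tri-axial acceleration stream: 500 samples (e.g. 10 s at 50 Hz).
rng = np.random.default_rng(0)
acc = rng.normal(size=(500, 3))

w = sliding_windows(acc, win=100, step=50)   # 2 s windows, 50% overlap
X = window_features(w)                       # one feature row per window
print(w.shape, X.shape)  # (9, 100, 3) (9, 6)
```

Each feature row would then be fed to a classifier trained on the labeled activities of Table 1; the windowing step is what allows the same model to run both offline and at runtime on the device.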
Figure 1. mHealth Framework.
Figure 2. Examples of representation modes supported by mHealthApp. (Left) Tri-axial acceleration signals are represented at runtime. (Right) Monthly average heart rate data is depicted at the top, while continuous 2-lead ECG signals are plotted at the bottom.
Figure 3. Examples of video tutorials and guidelines.
Figure 4. Snapshots from the activity recognition functionality of mHealthApp.
Table 1. Activity set.

| Activity Set | |
|---|---|
| L1: Standing still (1 min) | L7: Frontal elevation of arms (20×) |
| L2: Sitting and relaxing (1 min) | L8: Knees bending (crouching) (20×) |
| L3: Lying down (1 min) | L9: Cycling (1 min) |
| L4: Walking (1 min) | L10: Jogging (1 min) |
| L5: Climbing/descending stairs (1 min) | L11: Running (1 min) |
| L6: Waist bends forward (20×) | L12: Jump front & back (20×) |
Numbers in brackets indicate the number of repetitions (N×) or the duration of the exercise (min).
Figure 5Study setup and sensor deployment.
Figure 6Confusion matrix obtained from the offline evaluation of the activity recognition model. Activities are identified through the labels introduced in Table 1.
Recognition performance for each activity class for the offline evaluation.
| Activity | SE | SP | PPV | NPV | F-score |
|---|---|---|---|---|---|
| L1 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 |
| L2 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 |
| L3 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 |
| L4 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 |
| L5 | 0.99 | 1.00 | 0.99 | 1.00 | 0.99 |
| L6 | 0.97 | 1.00 | 0.97 | 1.00 | 0.97 |
| L7 | 1.00 | 1.00 | 0.99 | 1.00 | 0.99 |
| L8 | 0.95 | 1.00 | 0.97 | 1.00 | 0.96 |
| L9 | 1.00 | 1.00 | 0.99 | 1.00 | 1.00 |
| L10 | 0.96 | 0.99 | 0.94 | 1.00 | 0.95 |
| L11 | 0.94 | 1.00 | 0.96 | 0.99 | 0.95 |
| L12 | 0.99 | 1.00 | 0.99 | 1.00 | 0.99 |
The metrics correspond, respectively, to the sensitivity (SE), specificity (SP), positive predictive value (PPV), negative predictive value (NPV), and F-score.
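The per-class metrics reported above can be derived directly from a confusion matrix such as the one in Figure 6. The sketch below (the function name `per_class_metrics` is ours, not from the paper) shows the standard one-vs-rest definitions of SE, SP, PPV, NPV, and F-score:

```python
import numpy as np

def per_class_metrics(cm):
    """Compute per-class SE, SP, PPV, NPV and F-score from a confusion
    matrix `cm`, where cm[i, j] counts samples of true class i
    predicted as class j (one-vs-rest for each class)."""
    cm = np.asarray(cm, dtype=float)
    total = cm.sum()
    tp = np.diag(cm)                # correctly recognized per class
    fn = cm.sum(axis=1) - tp        # true class i, predicted otherwise
    fp = cm.sum(axis=0) - tp        # predicted i, true class otherwise
    tn = total - tp - fn - fp
    se = tp / (tp + fn)             # sensitivity (recall)
    sp = tn / (tn + fp)             # specificity
    ppv = tp / (tp + fp)            # positive predictive value (precision)
    npv = tn / (tn + fn)            # negative predictive value
    f1 = 2 * se * ppv / (se + ppv)  # F-score: harmonic mean of SE and PPV
    return se, sp, ppv, npv, f1

# Toy two-class example: 9/10 of class A and 8/10 of class B correct.
cm = [[9, 1],
      [2, 8]]
se, sp, ppv, npv, f1 = per_class_metrics(cm)
print(np.round(se, 2))  # [0.9 0.8]
```

Note that with 12 classes the negatives dominate each one-vs-rest split, which is why SP and NPV sit at or near 1.00 throughout the table even where SE and PPV drop.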
Figure 7Activities detected by the proposed recognizer during online evaluation of the system and for various subjects. The actual activities are represented by the ground-truth labels. Activities are identified through the labels introduced in Table 1.
Figure 8Confusion matrix obtained from the online evaluation of the activity recognition model. Activities are identified through the labels used in Table 1.
Recognition performance for each activity class for the online evaluation.
| Activity | SE | SP | PPV | NPV | F-score |
|---|---|---|---|---|---|
| L1 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 |
| L2 | 0.93 | 1.00 | 1.00 | 0.99 | 0.97 |
| L3 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 |
| L4 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 |
| L5 | 1.00 | 0.99 | 0.88 | 1.00 | 0.94 |
| L6 | 1.00 | 0.99 | 0.91 | 1.00 | 0.95 |
| L7 | 1.00 | 0.99 | 0.94 | 1.00 | 0.97 |
| L8 | 0.90 | 1.00 | 1.00 | 0.99 | 0.95 |
| L9 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 |
| L10 | 0.87 | 1.00 | 1.00 | 0.99 | 0.93 |
| L11 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 |
| L12 | 1.00 | 1.00 | 1.00 | 1.00 | 1.00 |
The metrics correspond, respectively, to the sensitivity (SE), specificity (SP), positive predictive value (PPV), negative predictive value (NPV), and F-score.