| Literature DB >> 32414064 |
Muhammad Asif Razzaq, Ian Cleland, Chris Nugent, Sungyoung Lee.
Abstract
The recognition of activities of daily living (ADL) in smart environments is a well-known and important research area, which presents the real-time state of humans in pervasive computing. The process of recognizing human activities generally involves deploying a set of obtrusive and unobtrusive sensors, pre-processing the raw data, and building classification models using machine learning (ML) algorithms. Integrating data from multiple sensors is a challenging task due to the dynamic nature of the data sources. This is further complicated by semantic and syntactic differences among these data sources. These differences become even more problematic if the generated data are imperfect, which directly impacts their usefulness in yielding an accurate classifier. In this study, we propose a semantic imputation framework to improve the quality of sensor data using ontology-based semantic similarity learning. This is achieved by identifying semantic correlations among sensor events through SPARQL queries and by performing time-series longitudinal imputation. Furthermore, we applied a deep learning (DL)-based artificial neural network (ANN) to public datasets to demonstrate the applicability and validity of the proposed approach. The results showed higher accuracy with the semantically imputed datasets using the ANN. We also present a detailed comparative analysis against the state-of-the-art from the literature. Our semantically imputed datasets improved classification accuracy, reaching 95.78% at best, thus demonstrating the effectiveness and robustness of the learned models.
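The abstract describes two core steps: identifying semantically correlated sensor events (via SPARQL queries over an ontology) and longitudinally imputing missing readings in the time series. The following is a minimal, self-contained Python sketch of that imputation idea only; the correlation map is hard-coded here as a stand-in for the paper's SPARQL step, and all sensor names are illustrative, not taken from the authors' implementation.

```python
# Illustrative semantic-correlation map. In the SemImput framework this
# mapping would be derived from SPARQL queries over the ontology
# (SemImputOnt); the entries below are hypothetical examples.
SEMANTIC_NEIGHBOURS = {
    "MotionSensorKitchen": ["SensorKitchenMovement"],
    "SensorKitchenMovement": ["MotionSensorKitchen"],
}

def semantic_impute(events):
    """Longitudinal imputation over a list of (timestamp, sensor, value)
    events: a missing value (None) is filled with the most recent reading
    of the same sensor, or, failing that, with the most recent reading of
    a semantically correlated sensor."""
    last_seen = {}  # sensor -> last observed (non-missing) value
    imputed = []
    for ts, sensor, value in events:
        if value is None:
            # 1) carry forward this sensor's own last value
            value = last_seen.get(sensor)
            # 2) otherwise borrow from a semantically correlated sensor
            if value is None:
                for neighbour in SEMANTIC_NEIGHBOURS.get(sensor, []):
                    if neighbour in last_seen:
                        value = last_seen[neighbour]
                        break
        if value is not None:
            last_seen[sensor] = value
        imputed.append((ts, sensor, value))
    return imputed
```

For example, if `SensorKitchenMovement` has no reading yet, its missing value is borrowed from the correlated `MotionSensorKitchen`; a later gap in `MotionSensorKitchen` itself is filled by simple forward-fill. Events that cannot be resolved either way remain `None`.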
Keywords: BLE; activity recognition; neural network; ontologies; proximity; segmentation; semantic imputation; unobtrusive sensing
Year: 2020 PMID: 32414064 PMCID: PMC7294435 DOI: 10.3390/s20102771
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Figure 1. Time-series analysis for the example activity Prepare breakfast in the UCamI dataset [19].
Figure 2. A detailed view of the SemImput framework.
A list of activities, locations, and dependent sensor objects identified from the UCamI dataset and utilized for the SemImputOnt constructs.
| Type | ID | Activity Name | Location | Activity-Dependent Sensor Objects |
|---|---|---|---|---|
| Static | Act01 | Take medication | Kitchen | Water bottle, MedicationBox |
| Dynamic | Act02 | Prepare breakfast | Kitchen, Dining room | Motion Sensor Bedroom, Sensor Kitchen Movement, |
| Dynamic | Act03 | Prepare lunch | Kitchen, Dining room | Motion Sensor Bedroom, Sensor Kitchen Movement, |
| Dynamic | Act04 | Prepare dinner | Kitchen, Dining room | Motion Sensor Bedroom, Sensor Kitchen Movement, |
| Dynamic | Act05 | Breakfast | Kitchen, Dining room | Motion Sensor Bedroom, Sensor Kitchen Movement, |
| Dynamic | Act06 | Lunch | Kitchen, Dining room | Motion Sensor Bedroom, Sensor Kitchen Movement, |
| Dynamic | Act07 | Dinner | Kitchen, Dining room | Motion Sensor Bedroom, Sensor Kitchen Movement, |
| Dynamic | Act08 | Eat a snack | Kitchen, Living room | Motion Sensor Bedroom, Sensor Kitchen Movement, |
| Static | Act09 | Watch TV | Living room | RemoteControl, Motion Sensor Sofa, Pressure Sofa, TV |
| Dynamic | Act10 | Enter the SmartLab | Entrance | Door |
| Static | Act11 | Play a video game | Living room | Motion Sensor Sofa, Motion Sensor Bedroom, Pressure Sofa, Remote XBOX |
| Static | Act12 | Relax on the sofa | Living room | Motion Sensor Sofa, Motion Sensor Bedroom, Pressure Sofa |
| Dynamic | Act13 | Leave the SmartLab | Entrance | Door |
| Dynamic | Act14 | Visit in the SmartLab | Entrance | Door |
| Dynamic | Act15 | Put waste in the bin | Kitchen, Entrance | Trash |
| Dynamic | Act16 | Wash hands | Bathroom | Motion Sensor Bathroom, Tap, Tank |
| Dynamic | Act17 | Brush teeth | Bathroom | Motion Sensor Bathroom, Tap, Tank |
| Static | Act18 | Use the toilet | Bathroom | Motion Sensor Bathroom, Top WC |
| Static | Act19 | Wash dishes | Kitchen | Dish, Dishwasher |
| Dynamic | Act20 | Put washing into the washing machine | Bedroom, Kitchen | Laundry Basket, Washing Machine, Closet |
| Static | Act21 | Work at the table | Workplace | |
| Dynamic | Act22 | Dressing | Bedroom | Wardrobe Clothes, Pyjama Drawer, Laundry Basket, Closet |
| Static | Act23 | Go to the bed | Bedroom | Motion Sensor Bedroom, Bed |
| Static | Act24 | Wake up | Bedroom | Motion Sensor Bedroom, Bed |
Figure 3. SemImputOnt: class hierarchy with a definition axiom for the activity Breakfast.
Figure 4. Classification performance of the SemImput framework: precision and recall.
Confusion matrix for per-class HAR using the non-imputed and imputed UCamI dataset.
Confusion matrix for per-class HAR using the non-imputed and imputed Opportunity dataset.
Confusion matrix for per-class HAR using the non-imputed and imputed UCI-ADL (OrdóñezA) dataset.
Confusion matrix for per-class HAR using the non-imputed and imputed UCI-ADL (OrdóñezB) dataset.
Recognition accuracy gain using the proposed SemImput framework (Unit: %).
| Method | Datasets | Number of Activities | Non-Imputed Accuracy | Imputed Accuracy | Standard Deviation |
|---|---|---|---|---|---|
| Proposed | | 17 | 86.57 | 91.71 | ±2.57 |
| | | 9 | 82.27 | 89.20 | ±3.47 |
| | | 10 | 84.0 | | ±3.17 |
| | | 24 | 71.03 | | ±10.80 |
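The reported gain in the table above is simply the imputed minus the non-imputed mean recognition accuracy. A quick arithmetic check for the two rows whose values survived extraction intact:

```python
# Accuracy gain = imputed - non-imputed mean recognition accuracy (%).
rows = [
    (86.57, 91.71),  # 17-activity dataset
    (82.27, 89.20),  # 9-activity dataset
]
gains = [round(imputed - non_imputed, 2) for non_imputed, imputed in rows]
print(gains)  # [5.14, 6.93]
```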
Comparison results of the proposed SemImput framework with state-of-the-art HAR Methods.
| State-of-the-Art Methods | Datasets | Number of Activities | Mean Recognition Accuracy (%) | SemImput Gain |
|---|---|---|---|---|
| Razzaq et al. [ | | 24 | 47.01 | |
| Salomón et al. [ | | 24 | 90.65 | |
| Li et al. [ | | 17 | | −0.50 |
| Salguero et al. [ | | 9 | | −6.58 |
| | | 10 | 86.51 | |