Seungmin Oh, Akm Ashiquzzaman, Dongsu Lee, Yeonggwang Kim, Jinsul Kim.
Abstract
In recent years, many studies have applied deep learning models to human activity recognition (HAR). However, progress has lagged because training deep learning models requires large amounts of labeled data, and in fields such as HAR, data are difficult to collect and manual labeling is costly and labor-intensive. Existing methods rely heavily on manual data collection and labeling by human administrators, which makes data gathering slow and prone to human labeling bias. To address these problems, we propose a semi-supervised active transfer learning method that reduces the labeling effort required for new data by reusing what has already been learned. The proposed method achieved 95.9% accuracy while requiring less labeling than random sampling or active transfer learning.
Keywords: active transfer learning; human activity recognition; labeling reduction; semi-supervised active transfer learning; semi-supervised learning
Year: 2021 PMID: 33919823 PMCID: PMC8070833 DOI: 10.3390/s21082760
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
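The abstract's core idea, combining active learning (query uncertain samples for human labeling) with semi-supervised learning (pseudo-label confident samples automatically), can be sketched as a simple confidence-based partition of an unlabeled pool. This is an illustrative sketch, not the authors' exact algorithm; the function name and thresholds are our own choices.

```python
import numpy as np

def split_pool(probs, query_threshold=0.6, pseudo_threshold=0.95):
    """Partition an unlabeled pool by model confidence.

    probs: (n_samples, n_classes) predicted class probabilities.
    Returns indices to query a human annotator (uncertain) plus
    indices and labels to pseudo-label automatically (confident).
    """
    confidence = probs.max(axis=1)
    query_idx = np.where(confidence < query_threshold)[0]     # ask a human
    pseudo_idx = np.where(confidence >= pseudo_threshold)[0]  # trust the model
    pseudo_labels = probs[pseudo_idx].argmax(axis=1)
    return query_idx, pseudo_idx, pseudo_labels

# Toy pool: three samples with varying confidence.
probs = np.array([[0.50, 0.50],   # uncertain -> human label
                  [0.97, 0.03],   # confident -> pseudo-label class 0
                  [0.70, 0.30]])  # middling  -> stays in the pool
q, p, y = split_pool(probs)
```

Samples that fall between the two thresholds stay in the pool for a later round, which is what shrinks the number of human labeling queries over iterations.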
Figure 1. Transfer learning flow for labeling reduction.
Figure 2. Active transfer learning architecture.
Figure 3. Semi-supervised learning architecture.
Parameters of the XGBoost tree-based decision model used for feature extraction.
| Parameter | Value |
|---|---|
| Booster | gbtree |
| Scale pos weight | 1 |
| Learning rate | 0.01 |
| Col-sample by tree | 0.4 |
| Subsample | 0.8 |
| N estimators | 200 |
| Max depth | 4 |
| Gamma | 10 |
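The table above maps directly onto the keyword arguments of xgboost's scikit-learn wrapper. A minimal sketch, assuming the standard `xgboost` package (the record does not state the exact API the authors used):

```python
# Parameters copied from the table; keys follow xgboost's naming.
xgb_params = {
    "booster": "gbtree",
    "scale_pos_weight": 1,
    "learning_rate": 0.01,
    "colsample_bytree": 0.4,
    "subsample": 0.8,
    "n_estimators": 200,
    "max_depth": 4,
    "gamma": 10,
}

# from xgboost import XGBClassifier
# model = XGBClassifier(**xgb_params)
# model.fit(X_train, y_train)  # then rank features by model.feature_importances_
```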
List of extracted key features. (Table contents not preserved in this record.)
Training/validation/testing/unlabeled separation statistics (DNN, HCI-HAR).
| Total Data | Training Data | Validation Data | Testing Data | Unlabeled Data |
|---|---|---|---|---|
| 10,299 | 500 | 1000 | 1000 | 7799 |
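The split in the table can be reproduced as a disjoint index partition, where everything left after carving out train/validation/test becomes the unlabeled pool. The function name and seed are illustrative; only the sizes come from the table.

```python
import numpy as np

def split_indices(n_total, n_train, n_val, n_test, seed=0):
    """Shuffle sample indices and carve out train/val/test splits;
    the remainder is treated as the unlabeled pool."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n_total)
    train = idx[:n_train]
    val = idx[n_train:n_train + n_val]
    test = idx[n_train + n_val:n_train + n_val + n_test]
    unlabeled = idx[n_train + n_val + n_test:]
    return train, val, test, unlabeled

tr, va, te, un = split_indices(10_299, 500, 1000, 1000)
```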
Figure 4. (a) DNN based basic model. (b) Transferred correct classifier model.
Fully connected layer based basic model.
| DNN Based Basic Model | | | Transferred Correct Classifier | | |
|---|---|---|---|---|---|
| Layers | Output Shape | Weight Freeze | Layers | Output Shape | Weight Freeze |
| FC Layer | 50, 256 | False | FC Layer | 50, 256 | True |
| ReLU | 50, 256 | False | ReLU | 50, 256 | True |
| FC Layer | 256, 128 | False | FC Layer | 256, 128 | True |
| ReLU | 256, 128 | False | ReLU | 256, 128 | True |
| Dropout | 0.2 | False | Dropout | 0.2 | False |
| FC Layer | 128, 128 | False | FC Layer | 128, 128 | False |
| ReLU | 128, 128 | False | ReLU | 128, 128 | False |
| Dropout | 0.2 | False | Dropout | 0.2 | False |
| FC Layer | 128, 6 | False | FC Layer | 128, 2 | False |
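Reading the FC layer shapes as (in_features, out_features), the two columns of the table differ only in the frozen early layers and the output width (6 activity classes vs. a 2-way correct classifier). A sketch assuming PyTorch, which the record does not name explicitly:

```python
import torch.nn as nn

def build_basic_dnn(in_dim=50, n_classes=6):
    """DNN based basic model from the table (left columns)."""
    return nn.Sequential(
        nn.Linear(in_dim, 256), nn.ReLU(),
        nn.Linear(256, 128), nn.ReLU(),
        nn.Dropout(0.2),
        nn.Linear(128, 128), nn.ReLU(),
        nn.Dropout(0.2),
        nn.Linear(128, n_classes),
    )

def transfer_correct_classifier(model, n_classes=2, n_frozen=4):
    """Transferred correct classifier (right columns): freeze the first
    two Linear+ReLU blocks (Weight Freeze = True) and replace the head
    with a 2-way output layer."""
    for layer in list(model)[:n_frozen]:
        for p in layer.parameters():
            p.requires_grad = False
    model[-1] = nn.Linear(128, n_classes)
    return model
```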
Figure 5. Correct classifier training of the semi-supervised active transfer learning architecture.
Figure 6. Training of the semi-supervised active transfer learning architecture.
Comparison of the number of queries and maximum accuracy (DNN, HCI-HAR).
| Random Sampling | | Active Transfer Learning | | Proposed Method | |
|---|---|---|---|---|---|
| Number of Queries | Accuracy | Number of Queries | Accuracy | Number of Queries | Accuracy |
| 1000 | 92.9% | 224 | 95.8% | 198 | 95.5% |
Training/validation/testing/unlabeled separation statistics (CNN, HCI-HAR).
| Total Data | Training Data | Validation Data | Testing Data | Unlabeled Data |
|---|---|---|---|---|
| 10,239 | 100 | 1000 | 1000 | 8139 |
Convolutional neural network based basic model.
| CNN Based Basic Model | | | Transferred Correct Classifier | | |
|---|---|---|---|---|---|
| Layers | Output Shape | Weight Freeze | Layers | Output Shape | Weight Freeze |
| 1D CNN | 5, 8, kernel_size = 5 | False | 1D CNN | 5, 8, kernel_size = 5 | True |
| ReLU | 5, 8, kernel_size = 5 | False | ReLU | 5, 8, kernel_size = 5 | True |
| 1D CNN | 8, 16, kernel_size = 5 | False | 1D CNN | 8, 16, kernel_size = 5 | True |
| ReLU | 8, 16, kernel_size = 5 | False | ReLU | 8, 16, kernel_size = 5 | True |
| 1D CNN | 16, 8, kernel_size = 5 | False | 1D CNN | 16, 8, kernel_size = 5 | True |
| ReLU | 16, 8, kernel_size = 5 | False | ReLU | 16, 8, kernel_size = 5 | True |
| Dropout | 0.5 | False | Dropout | 0.5 | False |
| MaxPooling1D | Kernel_size = 5 | False | MaxPooling1D | Kernel_size = 5 | False |
| FC Layer | 872, 100 | False | FC Layer | 872, 100 | False |
| ReLU | 872, 100 | False | ReLU | 872, 100 | False |
| FC Layer | 100, 6 | False | FC Layer | 100, 2 | False |
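The convolutional columns read as (in_channels, out_channels, kernel_size). A sketch assuming PyTorch and an input of shape (batch, 5 channels, 561 time steps): with three unpadded kernel-size-5 convolutions and MaxPool1d(5), that length flattens to exactly the 872 features the table's first FC layer expects. The 561 input length is our inference from those numbers, not stated in the record.

```python
import torch.nn as nn

class BasicCNN(nn.Module):
    """CNN based basic model from the table (left columns)."""
    def __init__(self, n_classes=6):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(5, 8, kernel_size=5), nn.ReLU(),
            nn.Conv1d(8, 16, kernel_size=5), nn.ReLU(),
            nn.Conv1d(16, 8, kernel_size=5), nn.ReLU(),
            nn.Dropout(0.5),
            nn.MaxPool1d(kernel_size=5),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),            # 8 channels x 109 steps = 872 features
            nn.Linear(872, 100), nn.ReLU(),
            nn.Linear(100, n_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))
```

The transferred correct classifier variant would freeze the three convolution blocks (Weight Freeze = True in the table) and swap the final layer for `nn.Linear(100, 2)`, mirroring the DNN case.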
Figure 7. (a) Accuracy graph of convolutional neural network (CNN) based models; (b) number of semi-supervised active transfer learning (SATL) queries for CNN based models.
Training/validation/testing/unlabeled separation statistics (DNN, mHealth).
| Total Data | Training Data | Validation Data | Testing Data | Unlabeled Data |
|---|---|---|---|---|
| 16,384 | 1000 | 2000 | 2000 | 11,384 |
Comparison of the number of queries and maximum accuracy (DNN, mHealth).
| Active Transfer Learning | | Proposed Method | |
|---|---|---|---|
| Number of Queries | Accuracy | Number of Queries | Accuracy |
| 766 | 94.9% | 693 | 95.9% |