Ziyang He1, Xiaoqing Zhang2, Yangjie Cao3,4, Zhi Liu5, Bo Zhang6, Xiaoyan Wang7.
Abstract
By running applications and services closer to the user, edge processing provides many advantages, such as short response times and reduced network traffic. Deep-learning-based algorithms achieve significantly better performance than traditional algorithms in many fields but demand more resources, such as higher computational power and more memory. Hence, designing deep learning algorithms that are more suitable for resource-constrained mobile devices is vital. In this paper, we build a lightweight neural network, termed LiteNet, which uses a deep learning algorithm to diagnose arrhythmias, as an example of how to design deep learning schemes for resource-constrained mobile devices. Compared to other deep learning models with equivalent accuracy, LiteNet has several advantages. It requires less memory, incurs lower computational cost, and is more feasible for deployment on resource-constrained mobile devices. It can be trained faster than other neural network algorithms and requires less communication across different processing units during distributed training. It uses filters of heterogeneous size in a convolutional layer, which contributes to the generation of various feature maps. The algorithm was tested using the MIT-BIH electrocardiogram (ECG) arrhythmia database; the results showed that LiteNet outperforms comparable schemes in diagnosing arrhythmias and in its feasibility for deployment on mobile devices.
Keywords: deep learning algorithms; electrocardiogram; lightweight neural network; resource-constrained mobile devices
Year: 2018 PMID: 29673171 PMCID: PMC5948502 DOI: 10.3390/s18041229
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Figure 1. (a) Inception module; (b) Fire module; (c) Depthwise separable convolution that factorizes a standard convolution into a depthwise convolution and a pointwise convolution.
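The factorization in Figure 1c can be illustrated with a minimal NumPy sketch (illustrative only, not the paper's implementation): a depthwise convolution filters each input channel independently, then a 1 × 1 pointwise convolution mixes channels, which cuts the parameter count relative to a standard convolution.

```python
import numpy as np

def depthwise_separable_conv1d(x, dw_kernels, pw_weights):
    """Factorize a standard 1-D convolution into a per-channel (depthwise)
    convolution followed by a 1x1 (pointwise) convolution.

    x          : (length, in_channels) input signal
    dw_kernels : (k, in_channels) one spatial kernel per input channel
    pw_weights : (in_channels, out_channels) 1x1 channel-mixing weights
    """
    length, c_in = x.shape
    k = dw_kernels.shape[0]
    out_len = length - k + 1
    # Depthwise stage: filter each channel on its own (no cross-channel mixing).
    dw = np.zeros((out_len, c_in))
    for ch in range(c_in):
        for i in range(out_len):
            dw[i, ch] = np.dot(x[i:i + k, ch], dw_kernels[:, ch])
    # Pointwise stage: a 1x1 convolution mixes channels at every position.
    return dw @ pw_weights

# Parameter-count comparison for a hypothetical k=3, 8-in, 16-out layer:
k, c_in, c_out = 3, 8, 16
standard = k * c_in * c_out            # 3 * 8 * 16 = 384 weights
separable = k * c_in + c_in * c_out    # 24 + 128   = 152 weights
```

The savings grow with the number of output channels, which is why this factorization suits memory-constrained deployment.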
Figure 2. System illustration of arrhythmia detection based on ECG using LiteNet.
Figure 3. One-dimensional convolution process.
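The one-dimensional convolution of Figure 3 reduces to a sliding dot product; a minimal sketch with an illustrative signal and kernel (not taken from the paper):

```python
import numpy as np

def conv1d_valid(signal, kernel):
    """Slide the kernel over the signal ('valid' padding, stride 1) and
    take a dot product at each position."""
    k = len(kernel)
    return np.array([np.dot(signal[i:i + k], kernel)
                     for i in range(len(signal) - k + 1)])

# A 1x3 difference filter over a short sample sequence:
y = conv1d_valid(np.array([1.0, 2.0, 3.0, 4.0, 5.0]),
                 np.array([1.0, 0.0, -1.0]))
# y == [-2., -2., -2.]
```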
Figure 4. Lite module.
Figure 5. LiteNet architecture: (a) basic LiteNet architecture; (b) extended LiteNet architecture.
Summary of the basic LiteNet model for this work.
| Layer | Sub-Layer | Kernel Size | Stride | No. of Filters |
|---|---|---|---|---|
| Standard Conv. | | 1 × 5 | 1 | 5 |
| Max-Pooling | | 1 × 2 | 2 | 5 |
| Lite Module | Squeeze Conv. | 1 × 1 | 1 | 3 |
| | Standard Conv. | 1 × 1 | 1 | 6 |
| | | 1 × 2 | 1 | 6 |
| | | 1 × 3 | 1 | 6 |
| | Depthwise Conv. | 1 × 2 | 1 | 6 |
| | | 1 × 3 | 1 | 6 |
| | Pointwise Conv. | 1 × 1 | 1 | 6 |
| | | 1 × 1 | 1 | 6 |
| Max-Pooling | | 1 × 2 | 2 | 18 |
| Dense | | | | 30 |
| Dense | | | | 20 |
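The layer shapes in the table can be sanity-checked by tracing the sequence length through the network. This sketch assumes a hypothetical input window of 360 samples (the exact input length is not stated in this excerpt) and 'same' padding for the stride-1 convolutions, so only the two stride-2 max-pooling layers shrink the length:

```python
def out_len(length, pool=2, stride=2):
    """Output length of a 1x2 max-pooling layer with the given stride."""
    return (length - pool) // stride + 1

length = 360               # assumed input window (hypothetical)
length = out_len(length)   # first 1x2 max-pooling, stride 2 -> 180
# The Lite module convolutions are stride 1 and leave the length unchanged.
length = out_len(length)   # second 1x2 max-pooling, stride 2 -> 90
flat = length * 18         # 18 feature maps flattened into the dense layers
```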
Figure 6. Ten-fold cross-validation.
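Ten-fold cross-validation, as in Figure 6, partitions the shuffled samples into ten disjoint folds, each serving once as the test set while the remaining nine train the model; a minimal NumPy sketch (the sample count and seed are illustrative):

```python
import numpy as np

def ten_fold_indices(n_samples, n_folds=10, seed=0):
    """Shuffle sample indices, split them into n_folds disjoint folds,
    and yield (train, test) index pairs, one per fold."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n_samples)
    folds = np.array_split(idx, n_folds)
    for i, test in enumerate(folds):
        train = np.concatenate([f for j, f in enumerate(folds) if j != i])
        yield train, test

# Every sample appears in exactly one test fold:
splits = list(ten_fold_indices(100))
```

Reported metrics under this scheme are averaged over the ten test folds, which reduces the variance of a single train/test split.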
Confusion matrix principle.
| | True Normal | True Arrhythmia |
|---|---|---|
| Predicted normal | True Positive (TP) | False Positive (FP) |
| Predicted arrhythmia | False Negative (FN) | True Negative (TN) |
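From these four cells, the accuracy and F1-measure reported in the tables below can be computed directly; a minimal sketch with "normal" as the positive class (the example counts in the test are hypothetical):

```python
def metrics_from_confusion(tp, fp, fn, tn):
    """Accuracy and F1-measure from confusion-matrix cells,
    treating 'normal' as the positive class."""
    acc = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)   # predicted-normal beats that are normal
    recall = tp / (tp + fn)      # normal beats recovered by the model
    f1 = 2 * precision * recall / (precision + recall)
    return acc, f1
```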
Classification evaluation and corresponding AUC values.
| Evaluation | Excellent | Good | Fair | Poor | Failure |
|---|---|---|---|---|---|
| AUC range | 0.9–1.0 | 0.8–0.9 | 0.7–0.8 | 0.6–0.7 | 0.5–0.6 |
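The qualitative grading in this table can be expressed as a small lookup; treating each cut-point as inclusive of its upper band is an assumption, since the table lists closed ranges:

```python
def grade_auc(auc):
    """Map an AUC value to the qualitative label from the table above."""
    if auc >= 0.9:
        return "Excellent"
    if auc >= 0.8:
        return "Good"
    if auc >= 0.7:
        return "Fair"
    if auc >= 0.6:
        return "Poor"
    return "Failure"
```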
Comparison of Adam and SGD using LiteNet for set A with the ten-fold cross-validation method.
| Optimizer | ACC (%) | AUC (%) | F1-Measure (%) |
|---|---|---|---|
| SGD | 95.66 | 96.67 | 98.65 |
| Adam | 97.87 | 97.78 | 99.33 |
Comparison of Adam and SGD using LiteNet for set B with the ten-fold cross-validation method.
| Optimizer | ACC (%) | AUC (%) | F1-Measure (%) |
|---|---|---|---|
| SGD | 96.67 | 97.55 | 98.34 |
| Adam | 98.80 | 99.30 | 99.66 |
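The gap between the two optimizers in these tables reflects Adam's adaptive per-parameter step size; single update steps for both can be sketched in NumPy (the learning rates and moment constants below are common defaults, not the paper's settings):

```python
import numpy as np

def sgd_step(w, grad, lr=0.01):
    """Plain SGD: step against the gradient at a fixed rate."""
    return w - lr * grad

def adam_step(w, grad, m, v, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update: bias-corrected first- and second-moment estimates
    give each parameter its own adaptive step size."""
    m = b1 * m + (1 - b1) * grad           # running mean of gradients
    v = b2 * v + (1 - b2) * grad ** 2      # running mean of squared gradients
    m_hat = m / (1 - b1 ** t)              # bias correction (t starts at 1)
    v_hat = v / (1 - b2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v
```

On a toy quadratic both converge, but Adam's bias-corrected moments make its early effective step roughly lr times the gradient's sign, independent of the gradient's scale.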
Comparison of four CNN-based networks for set A with the ten-fold cross-validation method.
| Network | PC | ACC (%) | AUC (%) | F1-Measure (%) |
|---|---|---|---|---|
| AlexNet | 1100 | 97.89 | 98.48 | 99.35 |
| GoogleNet | 1276 | 98.34 | 98.79 | 99.58 |
| SqueezeNet | 528 | 97.53 | 97.28 | 99.09 |
| MobileNets | 550 | 97.45 | 97.34 | 99.12 |
| LiteNet | 454 | 97.87 | 97.78 | 99.33 |
Comparison of four CNN-based networks for set B with the ten-fold cross-validation method.
| Network | PC | ACC (%) | AUC (%) | F1-Measure (%) |
|---|---|---|---|---|
| AlexNet | 1100 | 98.83 | 99.05 | 99.68 |
| GoogleNet | 1276 | 99.01 | 99.24 | 99.53 |
| SqueezeNet | 528 | 98.29 | 98.92 | 99.41 |
| MobileNets | 550 | 98.20 | 98.82 | 99.28 |
| LiteNet | 454 | 98.80 | 99.30 | 99.66 |
Figure 7. Training time of five models.
Figure 8. Testing time of five models.
Figure 9. The trade-off between accuracy and testing time for five models. (a,b) represent dataset A and dataset B, respectively.
Figure 10. Confusion matrices for LiteNet on set A (a) and set B (b).