| Literature DB >> 35624595 |
Posen Lee, Tai-Been Chen, Chin-Hsuan Liu, Chi-Yuan Wang, Guan-Hua Huang, Nan-Han Lu.
Abstract
Many neurological and musculoskeletal disorders are associated with problems related to postural movement. Noninvasive tracking devices are used to record, analyze, measure, and detect the postural control of the body, which may indicate health problems in real time. A total of 35 young adults without any health problems were recruited for this study to participate in a walking experiment. An iso-block postural identity method was used to quantitatively analyze posture control and walking behavior. Participants who exhibited straight walking and skewed walking were assigned to the control and experimental groups, respectively. Fusion deep learning was applied to generate dynamic joint node plots by using OpenPose-based methods, and skewness was qualitatively analyzed using convolutional neural networks. The maximum specificity and sensitivity achieved using a combination of ResNet101 and the naïve Bayes classifier were 0.84 and 0.87, respectively. The proposed approach successfully combines cell phone camera recordings, cloud storage, and fusion deep learning for posture estimation and classification.
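The classification pipeline the abstract describes (CNN-derived features fed to a naïve Bayes classifier) can be sketched as follows. This is a minimal illustration, not the authors' code: the feature vectors here are synthetic stand-ins for the 2048-dimensional ResNet101 activations the paper extracts per dynamic joint node plot, and the class shift of 0.4 is an arbitrary choice to make the two groups separable.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)

# Synthetic stand-ins for per-image CNN feature vectors (the paper uses
# 2048-dim ResNet101 features; 64 dims keep this sketch fast).
n_per_class, n_features = 105, 64
straight = rng.normal(0.0, 1.0, (n_per_class, n_features))   # control group
skew = rng.normal(0.4, 1.0, (n_per_class, n_features))       # experimental group

X = np.vstack([straight, skew])
y = np.array([0] * n_per_class + [1] * n_per_class)          # 0 = straight, 1 = skew

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

# Gaussian naïve Bayes is the standard NB variant for continuous features,
# matching the "CNN features + NB classifier" pairing reported in the paper.
clf = GaussianNB().fit(X_tr, y_tr)
print(f"validation accuracy: {clf.score(X_te, y_te):.2f}")
```

With real data, the only change is replacing the synthetic arrays with feature activations read from a chosen CNN layer (see the "Layer of Features" column in the network table below).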
Keywords: OpenPose; fusion deep learning; iso-block postural identity
Year: 2022 PMID: 35624595 PMCID: PMC9139042 DOI: 10.3390/bios12050295
Source DB: PubMed Journal: Biosensors (Basel) ISSN: 2079-6374
Figure 1. Flow of research.
Figure 2. Experimental setup (the cell phone was placed 1 m above the floor and 2 m from the participant).
Table 1. Number of participants and the mean and standard deviation (STD) of walking velocity (m/s) and time (s) for each group.
| Group | N | Mean Velocity (m/s) | STD Velocity (m/s) | Mean Time (s) | STD Time (s) |
|---|---|---|---|---|---|
| Skew | 102 | 0.68 | 0.08 | 7.48 | 0.84 |
| Straight | 108 | 0.69 | 0.08 | 7.39 | 0.91 |
Figure 3. Dynamic joint node plot (DJNP) (right) obtained by merging the heat maps of joint nodes from t1 to t5 by using the OpenPose algorithm.
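The DJNP merging step can be illustrated with a short sketch. The arrays below are random stand-ins for per-frame OpenPose joint-node heat maps, and the pixel-wise maximum is one simple way to merge frames into a single plot; the paper may combine them differently.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in joint-node heat maps for five time points t1..t5,
# shaped (time, height, width).
heatmaps = rng.random((5, 64, 64))

# Merge across time with a pixel-wise maximum: each pixel keeps the
# strongest joint-node response seen at any of the five time points.
djnp = heatmaps.max(axis=0)

print(djnp.shape)  # one merged dynamic joint node plot
```

The merged image can then be resized to the input size each CNN expects (see the table below) before feature extraction.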
Table 2. Information on the adopted convolutional neural networks (CNNs).
| CNN | Image Size | Layers | Parametric Size (MB) | Layer of Features |
|---|---|---|---|---|
| AlexNet | 227 × 227 | 25 | 227 | 17th (4096 × 9216) |
| DenseNet201 | 224 × 224 | 709 | 77 | 706th (1000 × 1920) |
| GoogleNet | 224 × 224 | 144 | 27 | 142nd (1000 × 1024) |
| MobileNetV2 | 224 × 224 | 154 | 13 | 152nd (1000 × 1280) |
| ResNet101 | 224 × 224 | 347 | 167 | 345th (1000 × 2048) |
| ResNet50 | 224 × 224 | 177 | 96 | 175th (1000 × 2048) |
| VGG16 | 224 × 224 | 41 | 27 | 33rd (4096 × 25,088) |
| VGG19 | 224 × 224 | 47 | 535 | 39th (4096 × 25,088) |
Figure 4. Scatter plot of the specificity and sensitivity of the 1920 models on the validation dataset.
Figure 5. Radar plot of the six performance indices for the 96 models, sorted in ascending order of kappa value (abbreviations are explained in Appendix A). Sen denotes sensitivity, and Spe denotes specificity.
Table 3. Models with kappa values of 0.59 or greater.
| CNN | Classifier | Batch Size | Model | Kappa | Accuracy | Sen | Spe | PPV | NPV |
|---|---|---|---|---|---|---|---|---|---|
| ResNet101 | NB | 5 | M53 | 0.71 | 0.86 | 0.87 | 0.84 | 0.84 | 0.87 |
| AlexNet | NB | 11 | M7 | 0.65 | 0.83 | 0.81 | 0.84 | 0.83 | 0.82 |
| ResNet101 | NB | 14 | M56 | 0.65 | 0.83 | 0.81 | 0.84 | 0.83 | 0.82 |
| AlexNet | NB | 5 | M5 | 0.62 | 0.81 | 0.77 | 0.84 | 0.83 | 0.79 |
| VGG16 | NB | 14 | M80 | 0.62 | 0.81 | 0.77 | 0.84 | 0.83 | 0.79 |
| DenseNet201 | SVM | 11 | M23 | 0.62 | 0.81 | 0.68 | 0.94 | 0.91 | 0.75 |
| ResNet101 | NB | 8 | M54 | 0.59 | 0.79 | 0.90 | 0.69 | 0.74 | 0.88 |
| VGG19 | NB | 11 | M91 | 0.59 | 0.79 | 0.84 | 0.75 | 0.77 | 0.83 |
| AlexNet | NB | 14 | M8 | 0.59 | 0.79 | 0.81 | 0.78 | 0.78 | 0.81 |
| DenseNet201 | SVM | 5 | M21 | 0.59 | 0.79 | 0.74 | 0.84 | 0.82 | 0.77 |
| DenseNet201 | SVM | 14 | M24 | 0.59 | 0.79 | 0.77 | 0.81 | 0.80 | 0.79 |
| VGG16 | NB | 8 | M78 | 0.59 | 0.79 | 0.77 | 0.81 | 0.80 | 0.79 |
| AlexNet | NB | 8 | M6 | 0.59 | 0.79 | 0.71 | 0.88 | 0.85 | 0.76 |
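The six indices in the table above follow the standard definitions from a 2×2 confusion matrix. The sketch below computes them; the counts (tp = 27, fn = 4, fp = 5, tn = 26) are an illustrative choice, not the paper's data, although they roughly reproduce the top ResNet101 + NB row.

```python
def performance_indices(tp, fn, fp, tn):
    """Standard binary-classification indices from confusion-matrix counts."""
    n = tp + fn + fp + tn
    acc = (tp + tn) / n
    sen = tp / (tp + fn)   # sensitivity: skew cases correctly flagged
    spe = tn / (tn + fp)   # specificity: straight cases correctly cleared
    ppv = tp / (tp + fp)   # positive predictive value
    npv = tn / (tn + fn)   # negative predictive value
    # Cohen's kappa: observed agreement corrected for chance agreement.
    p_e = ((tp + fn) * (tp + fp) + (fp + tn) * (fn + tn)) / n**2
    kappa = (acc - p_e) / (1 - p_e)
    return {"kappa": kappa, "accuracy": acc, "sen": sen,
            "spe": spe, "ppv": ppv, "npv": npv}

m = performance_indices(tp=27, fn=4, fp=5, tn=26)
print({k: round(v, 2) for k, v in m.items()})
```

Because kappa discounts chance agreement, it is a stricter summary than accuracy, which is presumably why the table is thresholded on kappa rather than on accuracy.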
Figure 6. Iso-block postural identity (IPI) generated for a series of times, and fusion of the IPI with a DJNP (right).
Figure 7. Graphical representation of skewness or displacement in a walking video at three time points. Panels (A,D), (B,E), and (C,F), respectively, show postural skew to the left, postural balance, and postural skew to the right as participants walk toward the camera.
Table A1. The 96 investigated combinations of CNN, classifier, and batch size, with their model abbreviations.
| CNN | Classifier | Batch Size | Model | CNN | Classifier | Batch Size | Model | CNN | Classifier | Batch Size | Model |
|---|---|---|---|---|---|---|---|---|---|---|---|
| AlexNet | LR | 5 | M1 | GoogleNet | SVM | 5 | M33 | ResNet50 | NB | 5 | M65 |
| AlexNet | LR | 8 | M2 | GoogleNet | SVM | 8 | M34 | ResNet50 | NB | 8 | M66 |
| AlexNet | LR | 11 | M3 | GoogleNet | SVM | 11 | M35 | ResNet50 | NB | 11 | M67 |
| AlexNet | LR | 14 | M4 | GoogleNet | SVM | 14 | M36 | ResNet50 | NB | 14 | M68 |
| AlexNet | NB | 5 | M5 | MobileNetV2 | LR | 5 | M37 | ResNet50 | SVM | 5 | M69 |
| AlexNet | NB | 8 | M6 | MobileNetV2 | LR | 8 | M38 | ResNet50 | SVM | 8 | M70 |
| AlexNet | NB | 11 | M7 | MobileNetV2 | LR | 11 | M39 | ResNet50 | SVM | 11 | M71 |
| AlexNet | NB | 14 | M8 | MobileNetV2 | LR | 14 | M40 | ResNet50 | SVM | 14 | M72 |
| AlexNet | SVM | 5 | M9 | MobileNetV2 | NB | 5 | M41 | VGG16 | LR | 5 | M73 |
| AlexNet | SVM | 8 | M10 | MobileNetV2 | NB | 8 | M42 | VGG16 | LR | 8 | M74 |
| AlexNet | SVM | 11 | M11 | MobileNetV2 | NB | 11 | M43 | VGG16 | LR | 11 | M75 |
| AlexNet | SVM | 14 | M12 | MobileNetV2 | NB | 14 | M44 | VGG16 | LR | 14 | M76 |
| DenseNet201 | LR | 5 | M13 | MobileNetV2 | SVM | 5 | M45 | VGG16 | NB | 5 | M77 |
| DenseNet201 | LR | 8 | M14 | MobileNetV2 | SVM | 8 | M46 | VGG16 | NB | 8 | M78 |
| DenseNet201 | LR | 11 | M15 | MobileNetV2 | SVM | 11 | M47 | VGG16 | NB | 11 | M79 |
| DenseNet201 | LR | 14 | M16 | MobileNetV2 | SVM | 14 | M48 | VGG16 | NB | 14 | M80 |
| DenseNet201 | NB | 5 | M17 | ResNet101 | LR | 5 | M49 | VGG16 | SVM | 5 | M81 |
| DenseNet201 | NB | 8 | M18 | ResNet101 | LR | 8 | M50 | VGG16 | SVM | 8 | M82 |
| DenseNet201 | NB | 11 | M19 | ResNet101 | LR | 11 | M51 | VGG16 | SVM | 11 | M83 |
| DenseNet201 | NB | 14 | M20 | ResNet101 | LR | 14 | M52 | VGG16 | SVM | 14 | M84 |
| DenseNet201 | SVM | 5 | M21 | ResNet101 | NB | 5 | M53 | VGG19 | LR | 5 | M85 |
| DenseNet201 | SVM | 8 | M22 | ResNet101 | NB | 8 | M54 | VGG19 | LR | 8 | M86 |
| DenseNet201 | SVM | 11 | M23 | ResNet101 | NB | 11 | M55 | VGG19 | LR | 11 | M87 |
| DenseNet201 | SVM | 14 | M24 | ResNet101 | NB | 14 | M56 | VGG19 | LR | 14 | M88 |
| GoogleNet | LR | 5 | M25 | ResNet101 | SVM | 5 | M57 | VGG19 | NB | 5 | M89 |
| GoogleNet | LR | 8 | M26 | ResNet101 | SVM | 8 | M58 | VGG19 | NB | 8 | M90 |
| GoogleNet | LR | 11 | M27 | ResNet101 | SVM | 11 | M59 | VGG19 | NB | 11 | M91 |
| GoogleNet | LR | 14 | M28 | ResNet101 | SVM | 14 | M60 | VGG19 | NB | 14 | M92 |
| GoogleNet | NB | 5 | M29 | ResNet50 | LR | 5 | M61 | VGG19 | SVM | 5 | M93 |
| GoogleNet | NB | 8 | M30 | ResNet50 | LR | 8 | M62 | VGG19 | SVM | 8 | M94 |
| GoogleNet | NB | 11 | M31 | ResNet50 | LR | 11 | M63 | VGG19 | SVM | 11 | M95 |
| GoogleNet | NB | 14 | M32 | ResNet50 | LR | 14 | M64 | VGG19 | SVM | 14 | M96 |