Chen Ye1,2, Yuhao Xiao1, Ruoyu Li3, Hongkai Gu3, Xinyu Wang1, Tianyang Lu1, Lingjing Jin3,4.
Abstract
Abnormal movement of the head and neck is a typical symptom of cervical dystonia (CD), and accurate scoring on the severity scale is essential for treatment planning. The traditional scoring method uses a protractor or contact sensors to measure movement angles, but it is time-consuming and interferes with the patient's movement. With the recent outbreak of coronavirus disease, the need for remote diagnosis and treatment of CD has become urgent in clinical practice. To address these problems, we propose a multi-view vision based CD severity scale scoring method, which detects the keypoint positions of the patient in frontal and lateral images and scores the severity scale by calculating head and neck motion angles. We compared the Toronto Western Spasmodic Torticollis Rating Scale (TWSTRS) subscale scores calculated by our vision based method with those assigned by a neurologist trained in movement disorders, and analyzed the correlation coefficients. The intra-class correlation coefficient ICC(3,1) was used to measure absolute accuracy. Our multi-view vision based CD severity scale scoring method demonstrated sufficient validity and reliability. This low-cost, contactless method provides a new potential tool for remote diagnosis and treatment of CD.
Keywords: Azure Kinect; Cervical Dystonia; human motion analysis; human pose estimation; remote diagnosis
Year: 2022 PMID: 35746424 PMCID: PMC9230118 DOI: 10.3390/s22124642
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.847
Figure 1. Diagram of the abnormal movement patterns of CD.
Figure 2. Head positioning cap.
Figure 3. Device diagram for the multi-view vision based method.
Figure 4. The multi-view vision based method.
Figure 5. The human keypoints of the subject.
Figure 6. The scheme of angle calculation.
Figure 7. The scheme of the IMU based method.
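The method detects human keypoints in the frontal and lateral views and derives head and neck angles from them (cf. Figures 5 and 6). A minimal sketch of such 2D angle computation is below; the specific keypoint pairings (eyes for lateral tilt, ear and shoulder for forward/backward lean) are illustrative assumptions, not the paper's exact definitions.

```python
import math

def tilt_from_eyes(left_eye, right_eye):
    """Laterocollis proxy: roll of the inter-ocular line in the frontal image.
    Image coordinates: x grows right, y grows down. 0 deg = level head;
    the sign indicates the tilt direction."""
    dx = right_eye[0] - left_eye[0]
    dy = right_eye[1] - left_eye[1]
    return math.degrees(math.atan2(dy, dx))

def flexion_from_lateral(ear, shoulder):
    """Ante/retrocollis proxy: lean of the shoulder-to-ear segment away from
    vertical in the lateral image. Positive = ear ahead of the shoulder."""
    dx = ear[0] - shoulder[0]
    dy = ear[1] - shoulder[1]
    return math.degrees(math.atan2(dx, -dy))
```

Using two views keeps each angle a simple in-plane measurement: tilt is read from the frontal image, flexion/extension from the lateral image, avoiding the depth ambiguity of a single camera.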
Characteristics of subjects.
| Subject | Age (Years) | Sex | Rotation | Laterocollis | Antecollis/Retrocollis |
|---|---|---|---|---|---|
| 1 | 26 | Female | Mild | Moderate | Moderate |
| 2 | 48 | Female | Severe | None | None |
| 3 | 45 | Female | Mild | Mild | None |
| 4 | 34 | Female | Slight | Moderate | None |
| 5 | 52 | Female | Slight | Moderate | Mild |
| 6 | 37 | Male | Slight | Moderate | Mild |
| 7 | 46 | Male | Slight | Mild | Severe |
| 8 | 42 | male | Mild | Moderate | Moderate |
Scores on the subscales of the TWSTRS severity scale calculated by the multi-view vision based method, a neurologist trained in movement disorders, and the wearable inertial sensor based method.
| Patient | Rotation *: RA | N | M | W | Laterocollis *: RA | N | M | W | Antecollis/Retrocollis **: RA | N | M | W | F |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1 | 31.67 | 2 | 2 | 2 | −34.44 | 2 | 2 | 3 | 23.20 | 2 | 2 | 1 | 1 |
| 2 | 48.10 | 4 | 3 | 2 | 2.30 | 0 | 0 | 1 | 26.57 | 0 | 2 | 1 | 0 |
| 3 | −23.09 | 2 | 2 | 2 | −8.16 | 1 | 1 | 1 | 18.43 | 0 | 1 | 1 | 1 |
| 4 | 17.85 | 1 | 1 | 1 | −11.37 | 2 | 1 | 0 | 3.94 | 0 | 0 | 0 | 1 |
| 5 | 6.57 | 1 | 1 | 1 | −13.86 | 2 | 1 | 1 | 20.56 | 1 | 1 | 0 | 0 |
| 6 | 9.76 | 1 | 1 | 2 | −13.20 | 2 | 1 | 2 | 8.13 | 1 | 1 | 0 | 1 |
| 7 | −17.04 | 1 | 1 | 0 | −7.85 | 1 | 1 | 1 | −90.00 | 3 | 3 | 2 | 2 |
| 8 | 15.56 | 2 | 1 | 1 | −28.87 | 2 | 2 | 1 | −45.00 | 2 | 2 | 1 | 1 |
RA: raw angle by the multi-view vision based method, N: movement disorder-trained neurologist, M: multi-view vision based method, W: wearable IMU based method, F: single-view based method only using the frontal image. * Negative values represent rightward deviation. ** Negative values represent posterior deviation (retrocollis).
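The M columns can be reproduced from the raw angles (RA) by banding them into TWSTRS subscores. A hedged sketch, assuming the standard TWSTRS bands for rotation (quarter-range steps: 1-22, 23-45, 46-67, >67 degrees, scored 1-4) and laterocollis (1-15, 16-35, >35 degrees, scored 1-3), plus an illustrative 5-degree dead zone to absorb measurement noise near zero; the exact thresholds used by the authors are not stated in this record.

```python
def twstrs_rotation_score(angle_deg):
    """Map an absolute rotation angle to a TWSTRS rotation subscore (0-4),
    using quarter-range bands; <5 deg treated as absent (assumed dead zone)."""
    a = abs(angle_deg)
    if a < 5:
        return 0
    if a <= 22:
        return 1
    if a <= 45:
        return 2
    if a <= 67:
        return 3
    return 4

def twstrs_laterocollis_score(angle_deg):
    """Map an absolute lateral-tilt angle to a TWSTRS laterocollis subscore (0-3):
    absent, mild (<=15), moderate (<=35), severe (>35 degrees)."""
    a = abs(angle_deg)
    if a < 5:
        return 0
    if a <= 15:
        return 1
    if a <= 35:
        return 2
    return 3
```

With these bands, the rotation and laterocollis M columns in the table above follow from the RA columns (e.g. 31.67 maps to 2, 48.10 to 3, -28.87 to 2).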
Validity and accuracy of our method and the previous work.
| Items | Method | Correlation | ICC(3,1) |
|---|---|---|---|
| Rotation | Nakamura's | 0.902 * | 0.793 * |
| | Ours | 0.843 * | 0.870 * |
| Laterocollis | Nakamura's | 0.369 * | 0.330 * |
| | Ours | 0.667 | 0.727 * |
| Antecollis/retrocollis | Nakamura's | 0.181 | 0.281 |
| | Ours | 0.701 | 0.739 * |
* p < 0.05.
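ICC(3,1), used throughout these tables, is the two-way mixed-effects, single-rater, consistency form of the intra-class correlation coefficient. A minimal sketch of its computation from an n-subjects-by-k-raters score matrix, assuming the standard ANOVA mean-square formulation (not the authors' code):

```python
import numpy as np

def icc_3_1(ratings):
    """ICC(3,1): two-way mixed effects, single rater, consistency.
    ratings: (n_subjects, k_raters) array of scores."""
    Y = np.asarray(ratings, dtype=float)
    n, k = Y.shape
    grand = Y.mean()
    row_means = Y.mean(axis=1)          # per-subject means
    col_means = Y.mean(axis=0)          # per-rater means
    ss_rows = k * np.sum((row_means - grand) ** 2)
    ss_cols = n * np.sum((col_means - grand) ** 2)
    ss_err = np.sum((Y - grand) ** 2) - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)                 # between-subjects mean square
    ms_err = ss_err / ((n - 1) * (k - 1))       # residual mean square
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)
```

Because it is a consistency (not absolute-agreement) form, a constant offset between two raters does not lower the coefficient: raters scoring (1,2,3) and (2,3,4) across three subjects yield ICC(3,1) = 1.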
Validity and accuracy of the multi-view based method and the wearable IMU based method.
| Items | Method | Correlation | ICC(3,1) |
|---|---|---|---|
| Rotation | M | 0.843 * | 0.870 * |
| | W | 0.564 | 0.484 |
| Laterocollis | M | 0.667 | 0.727 * |
| | W | 0.189 | 0.211 |
| Antecollis/retrocollis | M | 0.701 | 0.739 * |
| | W | 0.474 | 0.525 |
M: multi-view vision based method, W: wearable IMU based method. * p < 0.05.
Validity and accuracy of the multi-view vision based method and the single-view vision based method.
| Items | Method | Correlation | ICC(3,1) |
|---|---|---|---|
| Antecollis/retrocollis | M | 0.701 | 0.739 * |
| | F | 0.550 | 0.532 |
M: multi-view vision based method, F: single-view vision based method only using the frontal image. * p < 0.05.
The costs of implementation and maintenance of the methods.
| Method | Device | Price (USD) | Total Price (USD) |
|---|---|---|---|
| Vision based method | Azure Kinect | 399 | 1518 |
| | HP 320 FHD Webcam | 29 | |
| | Computer | 1090 | |
| Manual measurement | Professional protractor | 100 | 100 |
| IMU based method | LPMS-B2 | 180 × 3 | 1589 |
| | Computer | 990 | |
| | Head positioning cap | 59 | |