Renaud Hage, Fabien Buisseret, Martin Houry, Frédéric Dierick.
Abstract
Understanding neck pain is an important societal issue. Kinematic data from wearable sensors may help gain insight into the pathophysiological mechanisms associated with neck pain through a quantitative sensorimotor assessment of individual patients. The objective of this study was to evaluate the potential usefulness of artificial intelligence, with several machine learning (ML) algorithms, in assessing neck sensorimotor performance. Angular velocity and acceleration measured by an inertial sensor placed on the forehead during the DidRen laser test in thirty-eight acute and subacute non-specific neck pain (ANSP) patients were compared to those of forty-two healthy control participants (HCP). Seven supervised ML algorithms were chosen for the predictions. The most informative kinematic features were computed using Sequential Feature Selection methods. The best-performing algorithm was the Linear Support Vector Machine, with an accuracy of 82% and an Area Under the Curve of 84%. The best discriminative kinematic feature between ANSP patients and HCP was the first quartile of head pitch angular velocity. This study shows that supervised ML algorithms can be used to classify ANSP patients and identify discriminatory kinematic features potentially useful for clinicians in the assessment and monitoring of neck sensorimotor performance in ANSP patients.
Keywords: artificial intelligence; head rotation test; kinematics; neck pain; supervised machine learning
Year: 2022 PMID: 35408420 PMCID: PMC9002899 DOI: 10.3390/s22072805
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
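The abstract describes a pipeline of Sequential Feature Selection feeding a Linear SVM. A minimal sketch of that approach with scikit-learn is shown below; the data, feature count, and number of selected features are synthetic placeholders, not the study's actual kinematic features.

```python
# Sketch only: forward Sequential Feature Selection wrapped around a
# Linear SVM, as outlined in the abstract. All data here are synthetic.
import numpy as np
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_participants, n_features = 80, 12              # 38 ANSP + 42 HCP; feature count is illustrative
X = rng.normal(size=(n_participants, n_features))  # stand-in kinematic features
y = np.array([1] * 38 + [0] * 42)                  # 1 = ANSP patient, 0 = HCP

svm = SVC(kernel="linear", C=10)                   # hyperparameters from the paper's table
sfs = SequentialFeatureSelector(svm, n_features_to_select=3,
                                direction="forward", cv=5)
sfs.fit(StandardScaler().fit_transform(X), y)
selected = sfs.get_support(indices=True)           # indices of the retained features
print("selected feature indices:", selected)
```

In the study, the retained features would be kinematic summaries (e.g., quartiles of head pitch angular velocity) rather than the anonymous columns used here.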
Characteristics of the acute and subacute non-specific neck pain (ANSP) patients and healthy control participants (HCP). p-values were obtained from a t-test for age and BMI, a Mann–Whitney U-test for NDI and NPRS, and a chi-squared test for gender.
| | ANSP Patients (n = 38) | HCP (n = 42) | p-Value |
|---|---|---|---|
| Age (years), mean ± SD | 46.2 ± 16.3 | 24.3 ± 6.8 | <0.001 |
| Gender | 21 (55%)/17 (45%) | 27 (64%)/15 (36%) | 0.55 |
| BMI (kg m−2), mean ± SD | 23.5 ± 3.2 | 21.5 ± 4.2 | 0.014 |
| NDI (/100), median [Q1–Q3] | 22 [16–31.5] | 0 [0–0] | <0.001 |
| NPRS, median [Q1–Q3] | 6 [4–7] | 0 [0–0] | <0.001 |
BMI: body mass index, NDI: neck disability index, NPRS: numeric pain rating scale.
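The group comparisons in the table caption (t-test, Mann–Whitney U, chi-squared) can be sketched with SciPy as follows. The sample data below are synthetic placeholders generated from the table's summary statistics, not the study's raw measurements.

```python
# Sketch only: the three statistical tests named in the table caption,
# applied to synthetic data drawn from the table's summary statistics.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# t-test on age (means and SDs taken from the table)
age_ansp = rng.normal(46.2, 16.3, 38)
age_hcp = rng.normal(24.3, 6.8, 42)
_, p_age = stats.ttest_ind(age_ansp, age_hcp, equal_var=False)

# Mann-Whitney U on NDI (illustrative scores; HCP scored 0 throughout)
ndi_ansp = rng.integers(10, 40, 38)
ndi_hcp = np.zeros(42)
_, p_ndi = stats.mannwhitneyu(ndi_ansp, ndi_hcp)

# Chi-squared on the 2x2 gender contingency table (counts from the table)
gender_counts = np.array([[21, 17], [27, 15]])
chi2, p_gender, dof, _ = stats.chi2_contingency(gender_counts)

print(f"age p={p_age:.2g}, NDI p={p_ndi:.2g}, gender p={p_gender:.2f}")
```

With group differences this large, the age and NDI tests come out highly significant, mirroring the p < 0.001 entries in the table.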
Figure 1. Description of the DidRen laser test. (A) Rear view of head position in front of the targets. (B) Schematic top view of the experimental setup with the three photosensitive sensors. The reference frame of the sensor is displayed when the head is in rest position. The coordinate system used in the study is also shown, with the yaw (X-axis), pitch (Y-axis), and roll (Z-axis) rotations of the head during the test. (C) Helmet worn by an HCP (here RH) with the laser on the top of the head and the DYSKIMOT inertial sensor on the forehead.
Optimal hyperparameter values: Number of neighbors (n_neighbors), Regularization parameter (C-parameter), Kernel coefficient (gamma), maximum depth of the tree (max_depth), number of trees in the forest (n_estimators), and number of features to consider when looking for the best split (max_features).
| ML Algorithm | Hyperparameters |
|---|---|
| BF KNN | n_neighbors = 5, weights = "distance" |
| Linear SVM | kernel = "linear", C = 10 |
| SVM RBF | gamma = 0.001, C = 100 |
| DT | max_depth = 1, criterion = "entropy", splitter = "best" |
| RF | max_depth = 10, n_estimators = 100, max_features = 10 |
BF KNN: Brute-Force K-Nearest Neighbors, SVM: Support Vector Machine, RBF: radial basis function, DT: Decision Tree, RF: Random Forest.
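Translating the hyperparameter table into scikit-learn estimators gives the following sketch. The mapping of table names to constructors is an assumption on my part (e.g., "Brute-Force KNN" rendered as `algorithm="brute"`); the paper does not specify the library calls.

```python
# Sketch only: the tuned classifiers from the hyperparameter table,
# instantiated with scikit-learn. Constructor choices are assumptions.
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier

models = {
    "BF KNN": KNeighborsClassifier(n_neighbors=5, weights="distance",
                                   algorithm="brute"),
    "Linear SVM": SVC(kernel="linear", C=10),
    "SVM RBF": SVC(kernel="rbf", gamma=0.001, C=100),
    "DT": DecisionTreeClassifier(max_depth=1, criterion="entropy",
                                 splitter="best"),
    "RF": RandomForestClassifier(max_depth=10, n_estimators=100,
                                 max_features=10),
}
for name, model in models.items():
    print(f"{name}: {type(model).__name__}")
```

Note that `max_features=10` for the Random Forest implies the feature matrix passed to `fit` has at least ten columns.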
Performance metrics of the selected ML algorithms.
| ML Algorithm | Accuracy | AUC Score |
|---|---|---|
| BF KNN | 0.66 ± 0.03 | 0.51 ± 0.07 |
| Linear SVM | 0.82 ± 0.03 | 0.84 ± 0.04 |
| SVM RBF | 0.65 ± 0.05 | 0.57 ± 0.09 |
| DT | 0.74 ± 0.03 | 0.70 ± 0.04 |
| RF | 0.76 ± 0.03 | 0.76 ± 0.04 |
| AdaBoost | 0.75 ± 0.04 | 0.76 ± 0.05 |
| GaussianNB | 0.77 ± 0.03 | 0.82 ± 0.03 |
BF KNN: Brute-Force K-Nearest Neighbors, SVM: Support Vector Machine, RBF: radial basis function, DT: Decision Tree, RF: Random Forest, AdaBoost: Adaptive Boosting, GaussianNB: Gaussian Naive Bayes, AUC: area under curve.
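The mean ± SD accuracy and AUC values in the metrics table are the kind of estimates produced by repeated cross-validation. The sketch below shows one way to compute them with scikit-learn for the Linear SVM; the data are synthetic, and the fold/repeat counts are assumptions, not the paper's protocol.

```python
# Sketch only: cross-validated accuracy and AUC (mean +/- SD), in the
# style of the metrics table. Data and CV settings are illustrative.
import numpy as np
from sklearn.model_selection import cross_val_score, RepeatedStratifiedKFold
from sklearn.svm import SVC

rng = np.random.default_rng(2)
X = rng.normal(size=(80, 6))             # 38 ANSP + 42 HCP, synthetic features
y = np.array([1] * 38 + [0] * 42)
X[y == 1, 0] += 2.0                      # shift one feature so classes separate

cv = RepeatedStratifiedKFold(n_splits=5, n_repeats=3, random_state=0)
clf = SVC(kernel="linear", C=10)         # hyperparameters from the paper's table
acc = cross_val_score(clf, X, y, cv=cv, scoring="accuracy")
auc = cross_val_score(clf, X, y, cv=cv, scoring="roc_auc")
print(f"accuracy = {acc.mean():.2f} ± {acc.std():.2f}")
print(f"AUC      = {auc.mean():.2f} ± {auc.std():.2f}")
```

Reporting the SD across folds, as the table does, conveys how stable each classifier's performance is across resamplings of the 80 participants.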
Figure 2. Receiver Operating Characteristic (ROC) curve of the Linear SVM (in blue). The dotted red line represents chance-level performance (a random classifier), the worst possible scenario.