| Literature DB >> 27399701 |
Karina Lebel, Patrick Boissy, Hung Nguyen, Christian Duval.
Clinical mobility assessment is traditionally performed in laboratories using complex and expensive equipment. The low accessibility to such equipment, combined with the emerging trend to assess mobility in a free-living environment, creates a need for body-worn sensors (e.g., inertial measurement units-IMUs) that are capable of measuring the complexity in motor performance using meaningful measurements, such as joint orientation. However, accuracy of joint orientation estimates using IMUs may be affected by environment, the joint tracked, type of motion performed and velocity. This study investigates a quality control (QC) process to assess the quality of orientation data based on features extracted from the raw inertial sensors' signals. Joint orientation (trunk, hip, knee, ankle) of twenty participants was acquired by an optical motion capture system and IMUs during a variety of tasks (sit, sit-to-stand transition, walking, turning) performed under varying conditions (speed, environment). An artificial neural network was used to classify good and bad sequences of joint orientation with a sensitivity and a specificity above 83%. This study confirms the possibility to perform QC on IMU joint orientation data based on raw signal features. This innovative QC approach may be of particular interest in a big data context, such as for remote-monitoring of patients' mobility.Entities:
Keywords: 3D orientation tracking; AHRS; IMU; MARG; MIMU; artificial neural network; attitude and heading reference system; inertial motion capture; inertial sensors; joint orientation; quality control
Year: 2016 PMID: 27399701 PMCID: PMC4970086 DOI: 10.3390/s16071037
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Figure 1. Joint orientation estimates quality control process overview.
Table 1. QC Input Features.

| Features Category | Input Feature |
|---|---|
| Environment | Deviation of the mean magnetic field from a reference value, for each module (inputs 1–2) |
| Environment | Variance of the magnetic field signal around each module (inputs 3–4) |
| Motion performed | Mean acceleration per module (inputs 5–6) |
| Motion performed | Mean angular velocity per module (inputs 7–8) |
| Direction of motion | Proportion of angular velocity measured on each axis (inputs 9–11 for module 1, inputs 12–14 for module 2) |
| Environment | Difference between the two modules' magnetic field (input 15) |
| Environment | Difference between the two modules' magnetic field in the previous sequence (input 16) |
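All 16 inputs in Table 1 are simple statistics computed from the raw magnetometer, accelerometer, and gyroscope signals of the two modules spanning a joint. The following is a minimal sketch of how such features could be extracted for one data segment; the array layout, the reference field value `mag_ref`, the use of signal norms, and the function name are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

def qc_features(acc1, gyr1, mag1, acc2, gyr2, mag2, mag_ref, prev_mag_diff):
    """Compute 16 QC input features (as in Table 1) for one data segment.

    acc*, gyr*, mag*: (N, 3) arrays of accelerometer, gyroscope and magnetometer
    samples for modules 1 and 2. mag_ref: reference (undisturbed) magnetic-field
    norm. prev_mag_diff: inter-module magnetic-field difference of the previous
    segment. All specifics here are assumptions for illustration.
    """
    feats = []
    mag_norm1 = np.linalg.norm(mag1, axis=1)
    mag_norm2 = np.linalg.norm(mag2, axis=1)

    # Environment: deviation of the mean magnetic field from the reference (inputs 1-2)
    feats += [mag_norm1.mean() - mag_ref, mag_norm2.mean() - mag_ref]
    # Environment: variance of the magnetic-field signal per module (inputs 3-4)
    feats += [mag_norm1.var(), mag_norm2.var()]
    # Motion performed: mean acceleration and mean angular velocity per module (inputs 5-8)
    feats += [np.linalg.norm(acc1, axis=1).mean(), np.linalg.norm(acc2, axis=1).mean()]
    feats += [np.linalg.norm(gyr1, axis=1).mean(), np.linalg.norm(gyr2, axis=1).mean()]
    # Direction of motion: proportion of angular velocity on each axis (inputs 9-14)
    for gyr in (gyr1, gyr2):
        axis_energy = np.abs(gyr).sum(axis=0)
        feats += list(axis_energy / axis_energy.sum())
    # Environment: inter-module magnetic-field difference, current and previous segment (inputs 15-16)
    feats += [mag_norm1.mean() - mag_norm2.mean(), prev_mag_diff]
    return np.asarray(feats)
```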
Figure 2. Artificial neural network architecture. The 16 inputs correspond to the features based on raw inertial signals defined in Table 1. Features are then processed at the hidden layer level, composed of six neurons. Weights and biases are attributed to all inputs during the training process, and neurons are activated following a symmetric sigmoid function. The resulting activation patterns are then adjusted again (output layer's weights and bias) and summed to determine the final classification.
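The caption describes a small feed-forward classifier: 16 inputs, one hidden layer of six neurons with a symmetric sigmoid (tanh-like) activation, and an output stage whose weighted sum yields the final good/bad decision. Below is a minimal sketch of such an architecture; the random placeholder weights, the output activation, and the decision threshold of 0 are assumptions (in the study, weights and biases were learned during training).

```python
import numpy as np

class QCNet:
    """16-6-1 feed-forward network with symmetric sigmoid (tanh) hidden units,
    mirroring the architecture of Figure 2. Weights here are random placeholders."""

    def __init__(self, n_in=16, n_hidden=6, rng=np.random.default_rng(0)):
        self.W1 = rng.normal(scale=0.1, size=(n_hidden, n_in))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(scale=0.1, size=n_hidden)
        self.b2 = 0.0

    def forward(self, x):
        h = np.tanh(self.W1 @ x + self.b1)      # hidden layer, symmetric sigmoid
        return np.tanh(self.W2 @ h + self.b2)   # scalar output in [-1, 1]

    def classify(self, x, threshold=0.0):
        # Positive output -> sequence accepted as "good", otherwise rejected (assumed rule).
        return self.forward(x) > threshold
```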
Figure 3. Setup and protocol. Joint accuracy validation is accomplished by comparing the orientation data estimated by the AHRS with those obtained from an optical motion capture gold standard, VICON. (A) A subset of the AHRS is solidly affixed to a rigid body created with a minimum of four optical markers; (B) The assembled bundles are then placed on the body segments targeted for evaluation, namely the head, the upper trunk, the pelvis, and the left lower limb (thigh, shank, foot); (C) Twenty participants with a variety of anthropometric characteristics took part in this study, ensuring diverse conditions of realization of the tasks; (D) Participants were asked to perform a 5 m standardized Timed Up and Go (TUG), a recognized clinical test comprising a number of basic mobility tasks. Tests were performed along two different paths and at different velocities.
Figure 4. Overall data processing workflow using an ANN for joint orientation estimates quality control. (A) The 20 participants enrolled in the study were first divided into two groups, the first 10 dedicated to the training of the QC algorithm and the other 10 to its validation; (B) Trials performed by the participants in the training group were segmented into low-level tasks (sitting, sit-to-stand transfer, walking, turning, and turn-to-sit). For each data segment, a set of features based on the IMU raw signals was extracted. The ANN was then trained until satisfactory results in terms of sensitivity and specificity were achieved, and the resulting ANN became the QC algorithm; (C) The performance of the QC algorithm was then verified using the trials from the other set of 10 participants (i.e., the validation group).
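The workflow in (B) and (C) hinges on two performance measures computed on held-out data: sensitivity (proportion of truly good sequences accepted) and specificity (proportion of truly bad sequences rejected). The sketch below illustrates that evaluation step under two assumptions: labels are derived from the 5° RMSD criterion used in Table 2, and "good" is treated as the positive class (the paper's exact definition of the positive class is not stated in this entry).

```python
import numpy as np

def sensitivity_specificity(rmsd_deg, accepted, threshold_deg=5.0):
    """Evaluate the QC classifier on a validation set.

    rmsd_deg : array of per-sequence RMSD between AHRS and reference joint orientation (deg)
    accepted : boolean array, True where the ANN classified the sequence as good
    A sequence is considered truly 'good' when its RMSD is <= threshold_deg (5 deg assumed).
    """
    truly_good = rmsd_deg <= threshold_deg
    tp = np.sum(accepted & truly_good)    # good sequences correctly accepted
    fn = np.sum(~accepted & truly_good)   # good sequences wrongly rejected
    tn = np.sum(~accepted & ~truly_good)  # bad sequences correctly rejected
    fp = np.sum(accepted & ~truly_good)   # bad sequences wrongly accepted
    return tp / (tp + fn), tn / (tn + fp)
```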
Figure 5. Effect of data quality control using a neural network approach on (A) the distribution of sequence quality and (B) joint orientation accuracy for a diversity of tasks.
Table 2. Impact of autonomous quality control of orientation data sequences per task and joint.
| Task | Joint | Ntotal | Ngood | RMSDtotal | Naccepted (ANN) | RMSDaccepted (ANN) |
|---|---|---|---|---|---|---|
| Sitting | Trunk | 180 | 177 | 0.9° (1.4°) | 177 | 0.7° (0.6°) |
| Sitting | Hip | 180 | 177 | 0.7° (2.5°) | 178 | 0.5° (0.6°) |
| Sitting | Knee | 180 | 177 | 0.9° (2.8°) | 178 | 0.7° (1.0°) |
| Sitting | Ankle | 177 | 159 | 3.4° (10.1°) | 157 | 1.2° (2.2°) |
| Sit-to-stand | Trunk | 175 | 120 | 4.2° (2.5°) | 113 | 3.6° (2.0°) |
| Sit-to-stand | Hip | 175 | 165 | 2.1° (1.9°) | 177 | 2.1° (1.9°) |
| Sit-to-stand | Knee | 180 | 167 | 2.4° (2.3°) | 169 | 2.2° (1.7°) |
| Sit-to-stand | Ankle | 176 | 122 | 5.8° (9.4°) | 124 | 3.8° (3.7°) |
| Walking | Trunk | 349 | 266 | 4.3° (4.5°) | 325 | 4.2° (4.0°) |
| Walking | Hip | 349 | 301 | 3.5° (4.0°) | 331 | 3.3° (3.3°) |
| Walking | Knee | 359 | 260 | 4.3° (2.5°) | 275 | 4.2° (2.4°) |
| Walking | Ankle | 351 | 51 | 14.3° (12.4°) | 7 | 5.1° (2.4°) |
| Turning | Trunk | 166 | 117 | 4.1° (2.5°) | 141 | 4.0° (2.3°) |
| Turning | Hip | 166 | 133 | 3.8° (1.8°) | 161 | 3.8° (1.9°) |
| Turning | Knee | 180 | 108 | 4.7° (2.6°) | 81 | 4.5° (2.3°) |
| Turning | Ankle | 176 | 18 | 15.2° (10.9°) | 2 | 6.4° (0.2°) |
| Turn-to-sit | Trunk | 168 | 118 | 4.4° (3.5°) | 157 | 4.5° (3.6°) |
| Turn-to-sit | Hip | 168 | 126 | 3.9° (2.0°) | 161 | 3.8° (1.9°) |
| Turn-to-sit | Knee | 165 | 108 | 5.6° (5.7°) | 62 | 4.1° (2.5°) |
| Turn-to-sit | Ankle | 159 | 53 | 12.5° (15.2°) | 26 | 6.0° (4.5°) |
Note: Ntotal: number of data sequences; Ngood: number of good sequences (i.e., RMSD ≤ 5°); RMSDtotal: mean (std dev.) root-mean-square difference between AHRS joint orientation and reference joint orientation (all sequences); Naccepted: number of sequences classified as good by the ANN; RMSDaccepted: mean (std dev.) root-mean-square difference between the AHRS joint orientations accepted by the ANN and their reference joint orientations.
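The RMSD values above quantify the angular difference between the AHRS joint orientation and the optical reference over a sequence. One plausible way to obtain such a per-sample angular difference is through the relative quaternion between the two estimates, as sketched below; the quaternion convention, the helper functions, and the use of the total relative rotation angle (rather than individual anatomical angles) are assumptions, not the authors' exact computation.

```python
import numpy as np

def quat_conj(q):
    """Conjugate of a quaternion stored as [w, x, y, z]."""
    return q * np.array([1.0, -1.0, -1.0, -1.0])

def quat_mult(q1, q2):
    """Hamilton product of two [w, x, y, z] quaternions."""
    w1, x1, y1, z1 = q1
    w2, x2, y2, z2 = q2
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def orientation_rmsd_deg(q_ahrs, q_ref):
    """RMSD (deg) of the angular difference between two orientation time series.

    q_ahrs, q_ref : (N, 4) arrays of unit quaternions [w, x, y, z].
    """
    angles = []
    for qa, qr in zip(q_ahrs, q_ref):
        dq = quat_mult(quat_conj(qr), qa)                          # relative rotation reference -> AHRS
        angle = 2.0 * np.arccos(np.clip(abs(dq[0]), 0.0, 1.0))     # total rotation angle in rad
        angles.append(np.degrees(angle))
    return float(np.sqrt(np.mean(np.square(angles))))
```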