Robbin Romijnders, Elke Warmerdam, Clint Hansen, Gerhard Schmidt, Walter Maetzler.
Abstract
Many algorithms use 3D accelerometer and/or gyroscope data from inertial measurement unit (IMU) sensors to detect gait events (i.e., initial and final foot contact). However, these algorithms often require knowledge about sensor orientation and use empirically derived thresholds. As alignment cannot always be controlled for in ambulatory assessments, methods are needed that require little knowledge of sensor location and orientation, e.g., a convolutional neural network-based deep learning model. Therefore, 157 participants from healthy and neurologically diseased cohorts walked 5 m distances at slow, preferred, and fast walking speed, while data were collected from IMUs on the left and right ankle and shank. Gait events were detected and stride parameters were extracted using a deep learning model and an optoelectronic motion capture (OMC) system for reference. The deep learning model consisted of convolutional layers using dilated convolutions, followed by two independent fully connected layers to predict whether a time step corresponded to the event of initial contact (IC) or final contact (FC), respectively. Results showed a high detection rate for both initial and final contacts across sensor locations (recall ≥92%, precision ≥97%). Time agreement was excellent as witnessed by the median time error (0.005 s) and corresponding inter-quartile range (0.020 s). The extracted stride-specific parameters were in good agreement with parameters derived from the OMC system (maximum mean difference 0.003 s and corresponding maximum limits of agreement (-0.049 s, 0.051 s) for a 95% confidence level). Thus, the deep learning approach was considered a valid approach for detecting gait events and extracting stride-specific parameters with little knowledge of the exact IMU location and orientation in conditions with and without walking pathologies due to neurological diseases.
Keywords: deep learning; gait; gait events; inertial measurement unit
Year: 2022 PMID: 35632266 PMCID: PMC9143761 DOI: 10.3390/s22103859
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.847
Demographic data of the study participants. Age (years), height (cm), and weight (kg) are presented as mean (standard deviation).
| Group | Gender | Number | Age (years) | Height (cm) | Weight (kg) |
|---|---|---|---|---|---|
| YA | F | 21 | 27 (7) | 173 (5) | 67 (9) |
| YA | M | 21 | 29 (9) | 185 (8) | 80 (12) |
| OA | F | 12 | 70 (6) | 167 (6) | 72 (17) |
| OA | M | 10 | 73 (6) | 180 (6) | 83 (12) |
| PD | F | 12 | 67 (6) | 168 (7) | 70 (15) |
| PD | M | 19 | 61 (11) | 178 (7) | 86 (14) |
| MS | F | 12 | 37 (10) | 174 (9) | 75 (9) |
| MS | M | 9 | 42 (16) | 189 (9) | 96 (32) |
| stroke | F | 4 | 66 (11) | 160 (7) | 65 (13) |
| stroke | M | 17 | 67 (18) | 178 (7) | 84 (15) |
| cLBP | F | 3 | 64 (12) | 166 (6) | 65 (6) |
| cLBP | M | 6 | 66 (17) | 177 (8) | 86 (14) |
| other | F | 3 | 60 (16) | 166 (4) | 79 (19) |
| other | M | 8 | 68 (19) | 182 (7) | 85 (14) |
Group: YA: younger adults, OA: older adults, PD: Parkinson’s Disease, MS: multiple sclerosis, cLBP: chronic low back pain; Gender: F: female, M: male.
Figure 1. Schematic depiction of the current study (picture from: https://www.vecteezy.com/free-vector/man-walking, accessed on 11 November 2021). Study participants wore IMUs on the ankle and shanks, and reflective markers were attached to the heel and toe of usual footwear (illustrated on the left). Marker data were used to obtain reference values for the timings of initial and final contacts (top), while accelerometer and gyroscope data from each tracked point were input to a neural network that predicted the timings of the same initial and final contacts (bottom).
Overview of the total number of participants, walking trials, and instances in the training, validation, and test sets. A detailed overview of exactly which trials and sensor locations had valid data is available at https://github.com/rmndrs89/my-gait-events-tcn, accessed on 1 April 2022.
| Dataset | No. of Participants | No. of Trials | No. of Instances |
|---|---|---|---|
| Train | 61 | 749 | 3366 |
| Validation | 48 | 564 | 2570 |
| Test | 48 | 620 | 620 |
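The instance counts above come from segmenting each walking trial into fixed-length windows matching the model's 400-sample input (2 s at 200 Hz). A minimal sketch of such segmentation; the non-overlapping step and the `make_windows` helper are illustrative assumptions, as the paper does not state the windowing used:

```python
import numpy as np

def make_windows(signal, win_len=400, step=400):
    """Split a (T, 6) IMU recording into fixed-length instances.

    win_len=400 matches the model's input (2 s at 200 Hz); the
    non-overlapping step is an assumption, not taken from the paper.
    """
    n_windows = (signal.shape[0] - win_len) // step + 1
    return np.stack([signal[i * step:i * step + win_len]
                     for i in range(n_windows)])

# A 10 s trial at 200 Hz with 6 channels (3D accelerometer + 3D gyroscope)
trial = np.zeros((2000, 6))
windows = make_windows(trial)
print(windows.shape)  # (5, 400, 6)
```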
Figure 2. The generic model architecture of the deep learning model to predict initial contacts (ICs) and final contacts (FCs). The inputs are the accelerometer and gyroscope data from a single inertial measurement unit, which are fed to a temporal convolutional network (TCN) (left). The TCN consisted of repeating residual blocks (ResBlocks) with exponentially increasing dilation factor (middle). Each ResBlock was built from two sequences of a convolutional layer (Conv), batch normalization layer (BatchNorm), a rectified linear unit activation layer (ReLU), and a dropout layer (DropOut) (right).
Model hyperparameters that were optimized, and the corresponding sets of possible values.
| Description | Possible Values |
|---|---|
| Number of filters | 8, |
| Kernel size | 3, |
| Dilations | [1, 2], |
The hyperparameter values that were selected for the trained model to make predictions on the test set are shown in bold.
Model layer hyperparameters.
| Layer # | Layer Type | Hyperparameters | Output Shape |
|---|---|---|---|
| 0 | inputs | | batch size × 400 × 6 |
| 1a | conv | no. of filters: 16, kernel size: 5, stride: 1, padding: same, dilation: 1 | batch size × 400 × 16 |
| 1b | conv | no. of filters: 16, kernel size: 1, stride: 1, padding: same, dilation: 1 | batch size × 400 × 16 |
| 2 | conv | no. of filters: 16, kernel size: 5, stride: 1, padding: same, dilation: 1 | batch size × 400 × 16 |
| 3 | conv | no. of filters: 16, kernel size: 5, stride: 1, padding: same, dilation: 2 | batch size × 400 × 16 |
| 4 | conv | no. of filters: 16, kernel size: 5, stride: 1, padding: same, dilation: 2 | batch size × 400 × 16 |
| 5 | conv | no. of filters: 16, kernel size: 5, stride: 1, padding: same, dilation: 4 | batch size × 400 × 16 |
| 6 | conv | no. of filters: 16, kernel size: 5, stride: 1, padding: same, dilation: 4 | batch size × 400 × 16 |
| 7a | dense | no. of units: 1 | batch size × 400 × 1 |
| 7b | dense | no. of units: 1 | batch size × 400 × 1 |
conv: convolutional layer.
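One consequence of the dilation scheme in the table is the model's temporal receptive field, i.e., how much context each prediction sees. A small sketch computing it from the kernel-5 layers (1a and 2 through 6); the formula assumes stacked "same"-padded convolutions, and `receptive_field` is an illustrative helper, not from the paper:

```python
def receptive_field(kernel_size, dilations):
    # Each stacked conv layer adds (kernel_size - 1) * dilation
    # samples of temporal context around a given time step.
    return 1 + sum((kernel_size - 1) * d for d in dilations)

# Dilations of the kernel-5 layers 1a and 2-6 in the table; the 1x1
# residual conv (layer 1b) adds no temporal context.
rf = receptive_field(5, [1, 1, 2, 2, 4, 4])
print(rf, "samples =", rf / 200, "s at 200 Hz")  # 57 samples = 0.285 s
```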
Overall detection performance for initial contacts and final contacts as quantified by recall, precision, and F1 score.
| Tracked Point | TP (IC) | FN (IC) | FP (IC) | Recall (IC) | Precision (IC) | F1 (IC) | TP (FC) | FN (FC) | FP (FC) | Recall (FC) | Precision (FC) | F1 (FC) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Left ankle | 624 | 19 | 5 | 97% | 99% | 98% | 606 | 32 | 10 | 95% | 98% | 97% |
| Right ankle | 599 | 42 | 8 | 93% | 99% | 96% | 614 | 17 | 12 | 97% | 98% | 98% |
| Left shank | 605 | 38 | 15 | 94% | 98% | 96% | 585 | 53 | 18 | 92% | 97% | 94% |
| Right shank | 603 | 36 | 15 | 94% | 98% | 96% | 595 | 30 | 9 | 95% | 99% | 97% |
TP: true positives, FN: false negatives, FP: false positives, F1: F1 score; IC: initial contact, FC: final contact.
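The recall, precision, and F1 values in the table follow from the standard definitions applied to the TP/FN/FP counts. A sketch reproducing the left-ankle initial-contact row (`detection_metrics` is an illustrative helper, not code from the paper):

```python
def detection_metrics(tp, fn, fp):
    recall = tp / (tp + fn)        # fraction of reference events found
    precision = tp / (tp + fp)     # fraction of detections that are real events
    f1 = 2 * precision * recall / (precision + recall)
    return recall, precision, f1

# Left-ankle initial contacts from the table: TP=624, FN=19, FP=5
recall, precision, f1 = detection_metrics(624, 19, 5)
print(f"recall {recall:.0%}, precision {precision:.0%}, F1 {f1:.0%}")
```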
Figure 3. Time errors for initial (left) and final (right) contact detection, for each of the different tracked points.
Time errors for the correctly detected gait events. Note that 0.005 s corresponds to 1 sample period, given the sampling frequency of 200 Hz.
| Tracked Point | Median IC (s) | IQR IC (s) | Median FC (s) | IQR FC (s) |
|---|---|---|---|---|
| Left ankle | 0.000 | 0.020 | 0.000 | 0.010 |
| Right ankle | 0.000 | 0.020 | −0.005 | 0.015 |
| Left shank | −0.005 | 0.020 | −0.005 | 0.020 |
| Right shank | −0.003 | 0.020 | −0.005 | 0.020 |
IQR: inter-quartile range; IC: initial contact, FC: final contact.
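Time errors like those above are typically obtained by matching each reference event to the nearest detected event and keeping matches within some tolerance. A minimal sketch with synthetic timestamps; the 0.05 s matching tolerance and the `time_errors` helper are assumptions, not taken from the paper:

```python
import numpy as np

def time_errors(detected, reference, tol=0.05):
    """Signed time errors (detected - reference) for reference events
    that have a detected event within `tol` seconds."""
    errors = []
    for t_ref in reference:
        diffs = detected - t_ref
        i = np.argmin(np.abs(diffs))
        if abs(diffs[i]) <= tol:
            errors.append(diffs[i])
    return np.asarray(errors)

ref = np.array([1.000, 2.100, 3.250])         # reference event times (s)
det = np.array([1.005, 2.095, 3.250, 5.000])  # detections incl. one false positive
err = time_errors(det, ref)
median = np.median(err)
iqr = np.percentile(err, 75) - np.percentile(err, 25)
print(f"median {median:.3f} s, IQR {iqr:.3f} s")
```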
Agreement between the stride-specific temporal parameters extracted from the proposed sensor-based approach and the marker-based (OMC) reference; mean differences and limits of agreement (95% confidence level) are given in seconds.
| Tracked Point | Parameter | Mean Difference (s) | Limits of Agreement (s) |
|---|---|---|---|
| Left ankle | stride time | 0.001 | (−0.035, 0.036) |
| Left ankle | stance time | 0.002 | (−0.039, 0.042) |
| Left ankle | swing time | −0.001 | (−0.045, 0.043) |
| Right ankle | stride time | 0.000 | (−0.039, 0.040) |
| Right ankle | stance time | −0.002 | (−0.048, 0.044) |
| Right ankle | swing time | 0.003 | (−0.046, 0.051) |
| Left shank | stride time | 0.001 | (−0.039, 0.041) |
| Left shank | stance time | 0.002 | (−0.043, 0.046) |
| Left shank | swing time | −0.001 | (−0.049, 0.047) |
| Right shank | stride time | −0.000 | (−0.031, 0.031) |
| Right shank | stance time | 0.002 | (−0.046, 0.049) |
| Right shank | swing time | −0.002 | (−0.049, 0.046) |
Figure 4. The agreement of extracted gait parameters between the sensor-based and marker-based methods. The differences between the stride-specific temporal gait parameters extracted from the marker-based and the proposed approach are plotted against their means.
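The mean differences and 95% limits of agreement reported above follow the standard Bland-Altman approach (mean difference ± 1.96 standard deviations of the differences). A sketch on illustrative data; `bland_altman` and the sample stride times are not from the paper:

```python
import numpy as np

def bland_altman(a, b):
    """Mean difference and 95% limits of agreement (mean +/- 1.96 SD)
    between two paired sets of measurements."""
    d = np.asarray(a) - np.asarray(b)
    mean_diff = d.mean()
    sd = d.std(ddof=1)
    return mean_diff, (mean_diff - 1.96 * sd, mean_diff + 1.96 * sd)

# Illustrative stride times (s); not data from the paper.
sensor = np.array([1.01, 1.12, 0.98, 1.05, 1.10])
marker = np.array([1.00, 1.13, 0.99, 1.04, 1.11])
mean_diff, (lo, hi) = bland_altman(sensor, marker)
print(f"mean difference {mean_diff:.3f} s, LoA ({lo:.3f}, {hi:.3f}) s")
```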