Wei Chen1,2, Jun Li1,2, Shanying Zhu1,2, Xiaodong Zhang1,2, Yutao Men1,2, Hang Wu3.
Abstract
In recent decades, although the research on gait recognition of lower limb exoskeleton robot has been widely developed, there are still limitations in rehabilitation training and clinical practice. The emergence of interactive information fusion technology provides a new research idea for the solution of this problem, and it is also the development trend in the future. In order to better explore the issue, this paper summarizes gait recognition based on interactive information fusion of lower limb exoskeleton robots. This review introduces the current research status, methods, and directions for information acquisition, interaction, fusion, and gait recognition of exoskeleton robots. The content involves the research progress of information acquisition methods, sensor placements, target groups, lower limb sports biomechanics, interactive information fusion, and gait recognition model. Finally, the current challenges, possible solutions, and promising prospects are analysed and discussed, which provides a useful reference resource for the study of interactive information fusion and gait recognition of rehabilitation exoskeleton robots.Entities:
Year: 2022 PMID: 35378794 PMCID: PMC8976668 DOI: 10.1155/2022/9933018
Source DB: PubMed Journal: Appl Bionics Biomech ISSN: 1176-2322 Impact factor: 1.781
Figure 1. Tree diagram displaying available gait analyses.
Figure 2. Schematic diagram of a standard procedure implemented for gait recognition using data fusion. The acronyms used in this diagram correspond to the following: SVM: support vector machine; NB: Naive Bayes; LR: logistic regression; KNN: K-nearest neighbors; DT: decision tree; DA: discriminant analysis; GMM: Gaussian mixture model; ANN: artificial neural network; MLP: multilayer perceptron; PNN: probabilistic neural network; TDNN: time delay neural network; NLR: negative likelihood ratio; AUC: area under the curve.
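The procedure in Figure 2 (segment the signal into windows, extract features, classify) can be sketched minimally with one of the classifiers listed there, KNN. The example below is an illustrative, self-contained toy on synthetic data; the window length, the two features, and the "stance"/"swing" class names are assumptions for demonstration, not the paper's setup.

```python
# Minimal sketch of the Figure 2 flow: per-window features, then a KNN
# classifier. All signals and class names are synthetic placeholders.
import math
import random

def features(window):
    """Two simple per-window features: mean and root mean square."""
    mean = sum(window) / len(window)
    rms = math.sqrt(sum(x * x for x in window) / len(window))
    return (mean, rms)

def knn_predict(train, query, k=3):
    """K-nearest-neighbours majority vote over labelled feature vectors."""
    ranked = sorted(train, key=lambda t: sum((a - b) ** 2 for a, b in zip(t[0], query)))
    votes = [label for _, label in ranked[:k]]
    return max(set(votes), key=votes.count)

random.seed(0)
# Synthetic low-amplitude "stance" vs. high-amplitude "swing" windows.
stance = [[random.gauss(0.0, 0.1) for _ in range(50)] for _ in range(20)]
swing = [[random.gauss(0.0, 1.0) for _ in range(50)] for _ in range(20)]
train = ([(features(w), "stance") for w in stance]
         + [(features(w), "swing") for w in swing])

query = [random.gauss(0.0, 1.0) for _ in range(50)]  # unseen high-amplitude window
print(knn_predict(train, features(query)))  # expected: swing
```

Any of the other listed models (SVM, NB, DT, ...) slots into the same windowing-and-features skeleton; only the classifier step changes.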
Figure 3. Interactive bioelectricity information acquisition methods. (a) Brain-machine interface system block diagram [43]. (b) Surface electromyography (sEMG) information acquisition [19].
Figure 4. Sensor positions [10, 17, 19, 48–61]. (a) Typical attachment positions of multiple sensors. (b) Type and location of the sensors applied in the papers (IMU: inertial measurement unit; FSR: force sensitive resistor; sEMG: surface electromyography; EEG: electroencephalography; MPJ: metatarsophalangeal joint).
Classification of the target populations for information collection.
| Target population | Sex | Age range | Average weight | Reference |
|---|---|---|---|---|
| Healthy people (6) | M (7) F (2) | 21-34 | 61.25 kg | [ |
| Healthy people (5) | M (3) F (2) | 24-28 | 54.2 kg | [ |
| Healthy people (4) | NA | 21-27 | 73 kg | [ |
| Healthy people (10) | M (10) F (0) | 24-28 | 67 kg | [ |
| Healthy people (4) | NA | 20-27 | NA | [ |
| Healthy people (20) | NA | 20-42 | NA | [ |
| Healthy people bearing loads (10) | M (10) | 25-37 | 77.25 kg | [ |
| Young people (41) | M (35) F (47) | 24-76 | NA | [ |
| Older people (41) | M (16) F (25) | 70-80 | NA | [ |
| Stroke patients (16) | M (12) F (4) | 45-75 | NA | [ |
| Parkinson's patients (11) | M (8) F (3) | 56-78 | NA | [ |
Figure 5. Human gait movement cycle division [57].
Feature-level extraction and fusion.
| Original information | Method | Formula | Reference |
|---|---|---|---|
| sEMG | Root mean square (RMS) | $\mathrm{RMS} = \sqrt{(1/N)\sum_{i=1}^{N} X_i^{2}}$ | [ |
| Accelerometer | Mean ($\bar{X}$) | $\bar{X} = (1/N)\sum_{i=1}^{N} X_i$ | [ |
| Accelerometer | Standard deviation ($\sigma$) | $\sigma = \sqrt{(1/N)\sum_{i=1}^{N} (X_i - \bar{X})^{2}}$ | [ |
| Accelerometer | Maximum ($X_{\max}$) | $X_{\max} = \max_{i} X_i$ | [ |
| Accelerometer | Mean amplitude of peaks ($\bar{X}_{\mathrm{peak}}$) | $\bar{X}_{\mathrm{peak}} = (1/\lvert P\rvert)\sum_{i \in P} X_i$ | [ |
| Accelerometer | Standard deviation magnitude ($\lvert\sigma\rvert$) | $\lvert\sigma\rvert = \sqrt{\sigma_x^{2} + \sigma_y^{2} + \sigma_z^{2}}$ | [ |
| Accelerometer | Correlation coefficients ($\rho$) | $\rho_{xy} = \mathrm{cov}(X_x, X_y)/(\sigma_x \sigma_y)$ | [ |
N: number of data samples; i: data sample index; $X_i$: observation at sample i; P: set of detected peak indices within the window; $\sigma_x$, $\sigma_y$, and $\sigma_z$: standard deviation values along the x-, y-, and z-axes, respectively.
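The feature-level extraction listed in the table can be computed directly. The sketch below applies the standard definitions (mean, standard deviation, RMS, per-axis standard-deviation magnitude, pairwise correlation) to one synthetic tri-axial accelerometer window; the sample values are invented for illustration.

```python
# Sketch of the tabulated feature extraction on one synthetic tri-axial
# accelerometer window; names follow the table footnote (N samples,
# sigma_x/y/z per-axis standard deviations).
import math

def mean(xs):
    return sum(xs) / len(xs)

def std(xs):
    m = mean(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / len(xs))

def rms(xs):
    return math.sqrt(sum(x * x for x in xs) / len(xs))

def corr(xs, ys):
    """Pearson correlation coefficient between two axes."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
    return cov / (std(xs) * std(ys))

# One synthetic window of x-, y-, z-axis acceleration samples.
ax = [0.0, 0.5, 1.0, 0.5, 0.0]
ay = [0.0, 0.25, 0.5, 0.25, 0.0]
az = [1.0, 1.0, 1.0, 1.0, 1.0]

sigma_mag = math.sqrt(std(ax) ** 2 + std(ay) ** 2 + std(az) ** 2)  # |sigma|
print(mean(ax), std(ax), rms(ax), max(ax), sigma_mag, corr(ax, ay))
```

Concatenating such per-window values into one vector is what the table calls feature-level fusion.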
Figure 6. Schematic of the method proposed in [88] (ECG: electrocardiography).
Fusion of information from multiple sensors for exoskeletons.
| Fusion method | Sensor type | Sampling rate | Fusion level | Real time/postprocessing | Purpose | Reference |
|---|---|---|---|---|---|---|
| NN | Encoder | NA | Feature level | Real time | Ensure the success rate and reliability of intention detection | [ |
| Filter selection | Vicon | 400 Hz | Feature level | Postprocessing | Effectively select a specified number of sensors | [ |
| ZVU | IMU | 100 Hz | Decision level | Real time | Determine the 3D attitude | [ |
| Arbitration-based score-level fusion | Accelerometer | 100 Hz | Decision level | Postprocessing | Improve recognition accuracy | [ |
| EKF | Accelerometer | 100 Hz | Data level | Real time | Solve sensor installation errors and path integral errors caused by sensor variance | [ |
| RNN | ECG | 50 Hz | Feature level | Postprocessing | Improve robustness | [ |
| NN | MMG | 2048 Hz | Feature level | Real time | Joint torque prediction | [ |
| CNN | Video | NA | Feature level | Postprocessing | Improve the effectiveness and robustness of gait recognition | [ |
| Markovian KF | Potentiometer | 50 Hz | Data level | Postprocessing | Reduce the IMU errors | [ |
| Conjugate gradient algorithm | Accelerometer | 100 Hz | Data level | Postprocessing | Find the minimum of the objective function | [ |
| Multi-modality sensor fusion based on DL | FS | 20 Hz | Feature level | Postprocessing | Overcome the challenge of gait classification from wearable sensors | [ |
| LR | Accelerometer | 40 Hz | Feature level | Postprocessing | Improve the detection and classification of STS | [ |
Note: The abbreviations in the table are reported in the Abbreviation section.
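The table distinguishes data-, feature-, and decision-level fusion. As a minimal illustration of the decision level (cf. the arbitration-based score-level entry), the sketch below averages per-class scores from several hypothetical sensor-specific classifiers and picks the top class; the sensor names, classes, and score values are invented for demonstration.

```python
# Hedged sketch of score-level (decision-level) fusion: each sensor's
# classifier emits class scores; fusion averages them and takes the argmax.
def fuse_scores(score_lists):
    """Average class-score dicts from several sensors, return winning class."""
    classes = score_lists[0].keys()
    fused = {c: sum(s[c] for s in score_lists) / len(score_lists) for c in classes}
    return max(fused, key=fused.get)

imu_scores = {"walk": 0.6, "stand": 0.3, "sit": 0.1}  # from an IMU classifier
fsr_scores = {"walk": 0.5, "stand": 0.4, "sit": 0.1}  # from foot-pressure sensors
emg_scores = {"walk": 0.7, "stand": 0.2, "sit": 0.1}  # from an sEMG classifier
print(fuse_scores([imu_scores, fsr_scores, emg_scores]))  # expected: walk
```

Data-level fusion would instead combine raw samples (e.g. the EKF and Markovian KF rows), and feature-level fusion would concatenate feature vectors before a single classifier.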
Figure 7. Distribution of detection algorithms across the reviewed papers.
Gait recognition methods of lower limb exoskeleton robots.
| Recognition algorithm | Sensor type | Wearable/nonwearable | Real time/postprocessing | Gait recognition | Accuracy rate | Reference |
|---|---|---|---|---|---|---|
| SVMBP | FSR | Wearable | Postprocessing | Phase recognition | 97.4593% | [ |
| HMM | IMU | Wearable | Postprocessing | Phase recognition | 91.88% | [ |
| LDA | sEMG | Wearable | Postprocessing | Phase recognition | 98.56 ± 1.34% | [ |
| QDA | Accelerometer | Wearable | Postprocessing | Phase recognition | >96.5% | [ |
| DCNN | IMU | Wearable | Postprocessing | Phase recognition | 97% | [ |
| BLDA | Position sensor | Wearable | Postprocessing | Phase recognition | 97.8% | [ |
| ED-FNN | IMU | Wearable | Postprocessing | Phase recognition | 97.9 ± 0.1% | [ |
| Ensemble learning-based hybrid deep learning framework | IMU | Wearable | Postprocessing | Behavior recognition | 99.34% | [ |
| GMM | IMU | Wearable | Postprocessing | Behavior recognition | 99.33%/ | [ |
| DDLMI | Accelerometer | Wearable | Real time | Behavior recognition | 97.64% | [ |
| SA-SVM | IMU | Wearable | Real time | Behavior recognition | 97.47 ± 1.16% | [ |
| IGPG | Motion capture system | Wearable | Real time | Behavior recognition | >97% | [ |
| LMR | Onboard encoders | Wearable | Real time | Behavior recognition | 99.4% | [ |
| MvGGAN | Video | Nonwearable | Postprocessing | Behavior recognition | Higher accuracy | [ |
| MCNN | Video | Nonwearable | Postprocessing | Behavior recognition | Higher accuracy | [ |
| SCN | Video | Nonwearable | Postprocessing | Behavior recognition | 89.8% | [ |
| eMSM | Video | Nonwearable | Postprocessing | Behavior recognition | 88% | [ |
Note: The abbreviations in the table are reported in the Abbreviation section.
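For contrast with the learned phase-recognition models in the table (SVM, HMM, LDA, ...), a rule-based baseline using an FSR-style foot-pressure signal can be sketched as follows; the threshold value and pressure trace are illustrative assumptions, not the reviewed systems' parameters.

```python
# Minimal rule-based gait-phase baseline: threshold a heel-pressure signal
# to label each sample 'stance' (foot loaded) or 'swing' (foot off ground).
def phases(pressure, threshold=0.2):
    """Label each pressure sample as a stance or swing phase."""
    return ["stance" if p > threshold else "swing" for p in pressure]

# Synthetic heel-pressure trace over one stride (arbitrary units).
trace = [0.9, 0.8, 0.7, 0.1, 0.0, 0.05, 0.8, 0.9]
print(phases(trace))
```

Such thresholding is brittle under noise and gait variability, which is why the reviewed systems reach their reported accuracies with the learned models above rather than fixed rules.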