| Literature DB >> 28420135 |
Simone Benatti, Bojan Milosevic, Elisabetta Farella, Emanuele Gruppioni, Luca Benini.
Abstract
Polyarticulated prosthetic hands are a powerful tool to restore functionality and improve quality of life for upper-limb amputees. Such devices offer, on the same wearable node, sensing and actuation capabilities that are not yet matched by equally natural interaction and control strategies. Control in state-of-the-art solutions is still performed mainly through complex encodings of gestures as bursts of contractions of the residual forearm muscles, resulting in a non-intuitive Human-Machine Interface (HMI). Recent research efforts explore myoelectric gesture recognition for more natural interaction; however, a considerable gap persists between research evaluations and complete, successful systems. In this paper, we present the design of a wearable prosthetic hand controller based on intuitive gesture recognition and a custom control strategy. The wearable node directly actuates a polyarticulated hand and wirelessly interacts with a personal gateway (i.e., a smartphone) for training and personalization of the recognition algorithm. Throughout the system development, we address the challenge of integrating an efficient embedded gesture classifier with a control strategy tailored for intuitive interaction between the user and the prosthesis. We demonstrate that this combined approach outperforms systems based on pattern recognition alone, which target the accuracy of a classification algorithm rather than the control of a gesture. The system was fully implemented, tested on healthy and amputee subjects, and compared against benchmark repositories. The proposed approach achieves an error rate of 1.6% in the end-to-end, real-time control of commonly used hand gestures, while complying with the power and performance budget of a low-cost microcontroller.
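The abstract describes an embedded gesture classifier driven by windowed EMG features. As a minimal illustrative sketch (the Mean Absolute Value feature, the window length, and the channel layout are assumptions for illustration, not details taken from the paper), the feature-extraction front end of such a classifier could be structured like this:

```python
# Hypothetical sketch (not the authors' code): sliding-window feature
# extraction for an EMG gesture classifier. Mean Absolute Value (MAV)
# is a common myoelectric feature; one MAV per channel per window
# becomes the feature vector fed to the classifier (e.g., an SVM).

def mav(window):
    """Mean Absolute Value of one analysis window (one channel)."""
    return sum(abs(s) for s in window) / len(window)

def extract_features(samples, n_channels, win_len):
    """Split an interleaved multi-channel sample stream into
    non-overlapping windows and compute one MAV per channel."""
    features = []
    n_frames = len(samples) // n_channels
    for start in range(0, n_frames - win_len + 1, win_len):
        feats = []
        for ch in range(n_channels):
            window = [samples[(start + i) * n_channels + ch]
                      for i in range(win_len)]
            feats.append(mav(window))
        features.append(feats)
    return features

# Toy stream: 2 channels interleaved, channel 0 active, channel 1 quiet.
stream = []
for i in range(8):
    stream += [(-1) ** i * 4, 1]
print(extract_features(stream, n_channels=2, win_len=4))
# -> [[4.0, 1.0], [4.0, 1.0]]
```

Each feature vector is then classified independently per window; the control strategy downstream decides when a classified gesture actually actuates the hand.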
Keywords: BSN; EMG; gesture recognition; human machine interaction; prosthetics
Year: 2017 PMID: 28420135 PMCID: PMC5424746 DOI: 10.3390/s17040869
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Figure 1. Diagram of the system architecture.
Figure 2. Block diagram of the interface between the MCU and the DC motor driver (left) and example motor current absorption curves (right).
Figure 3. SVM algorithm block diagram (a) and memory allocation (b).
Figure 4. Diagram of the communication between the personal gateway and the wearable node.
Figure 5. Diagram of the FSM for classification mode.
Figure 6. Hand control strategy as executed during a gesture sequence.
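Figures 5 and 6 outline how per-window classifier decisions are turned into hand commands through an FSM. A hypothetical sketch (the state names, the "rest" label, and the three-window confirmation threshold are illustrative assumptions, not the paper's actual FSM): a gesture is actuated only after several consecutive windows agree, which suppresses spurious single-window misclassifications:

```python
# Hypothetical debouncing FSM for classification mode (illustrative;
# the paper's actual states and thresholds may differ). The hand holds
# its current gesture until a new label is confirmed over n_confirm
# consecutive classification windows.

REST, ACTIVE = "REST", "ACTIVE"

class GestureFSM:
    def __init__(self, n_confirm=3):
        self.state = REST
        self.gesture = None      # last confirmed (actuated) gesture
        self.candidate = None    # label currently being confirmed
        self.count = 0
        self.n_confirm = n_confirm

    def step(self, label):
        """Feed one per-window classifier label; return the actuated
        gesture, or None while the hand is at rest."""
        if label == self.candidate:
            self.count += 1
        else:
            self.candidate, self.count = label, 1
        if self.count >= self.n_confirm:       # candidate confirmed
            if self.candidate == "rest":
                self.state, self.gesture = REST, None
            else:
                self.state, self.gesture = ACTIVE, self.candidate
        return self.gesture

fsm = GestureFSM()
out = [fsm.step(l) for l in
       ["rest", "grasp", "grasp", "grasp", "rest", "rest", "rest"]]
print(out)
# -> [None, None, None, 'grasp', 'grasp', 'grasp', None]
```

Note how the single "rest" windows after the confirmed grasp do not release the hand; only a confirmed run of rest labels does, which is the behavior a control-oriented (rather than purely accuracy-oriented) evaluation rewards.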
Figure 7. Hand controller during the real-time experiment with healthy subjects: open hand (a); precision grasp (b); point index (c); and power grip (d). Panels (a,c) show the armband with the EMG sensors.
Gesture recognition accuracy and errors in hand movement decoding of the proposed controller.
| S1 | 88.01 | 178 | 0 | 124 | 0 |
| S2 | 89.76 | 312 | 0 | 209 | 0 |
| S3 | 89.21 | 229 | 0 | 155 | 0 |
| S4 | 86.34 | 404 | 0 | 204 | 0 |
| S5 | 83.49 | 311 | 0 | 206 | 0 |
| MEAN | 87.37 | 296 | 0 | 179 | 0 |
| S1 | 94.86 | 166 | 0 | 55 | 0 |
| S2 | 93.65 | 262 | 0 | 38 | 0 |
| S3 | 81.38 | 393 | 1 | 290 | 0 |
| S4 | 86.43 | 1316 | 1 | 729 | 1 |
| MEAN | 89.09 | 534 | <1 | 278 | <1 |
| S1 | 90.46 | 142 | 0 | 116 | 0 |
| S2 | 93.49 | 116 | 4 | 107 | 4 |
| S3 | 92.89 | 132 | 0 | 67 | 0 |
| S4 | 92.15 | 226 | 0 | 169 | 0 |
| S5 | 90.15 | 158 | 0 | 114 | 0 |
| MEAN | 91.83 | 155 | <1 | 112 | <1 |
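The MEAN rows above can be cross-checked against the per-subject accuracies. A quick arithmetic check (plain Python, values copied from the table) shows each reported mean matches the per-subject values to within 0.01, consistent with the table entries themselves being rounded:

```python
# Cross-check the three MEAN rows against their per-subject accuracies.
groups = [
    ([88.01, 89.76, 89.21, 86.34, 83.49], 87.37),  # first block
    ([94.86, 93.65, 81.38, 86.43], 89.09),         # second block
    ([90.46, 93.49, 92.89, 92.15, 90.15], 91.83),  # third block
]
for accs, reported in groups:
    mean = sum(accs) / len(accs)
    # computed means: 87.362, 89.080, 91.828 -- each within 0.01
    # of the reported MEAN row
    print(f"{mean:.3f} vs reported {reported}")
```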
Figure 8. Memory footprint of the SVM models in kB (a) and computation requirements in clock cycles (b), as the number of SVM features varies.
Figure 9. Computation times expressed in ms for the two analyzed setups.