| Literature DB >> 32680530 |
Michael Tschiedel, Michael Friedrich Russold, Eugenijus Kaniusas.
Abstract
Modern lower limb prostheses can replace missing body parts and improve patients' quality of life. However, missing environmental information often makes seamless adaptation to transitions between different forms of locomotion challenging. The aim of this review is to identify the progress made in this area over the last decade, addressing two main questions: which types of novel sensors for environmental awareness are used in lower limb prostheses, and how they enhance device control towards more comfort and safety. A literature search was conducted on two online databases, PubMed and IEEE Xplore. Based on the inclusion and exclusion criteria, 32 papers were selected for the review analysis; 18 of those relate to explicit environmental sensing and 14 to implicit environmental sensing. Sensor characteristics were discussed with a focus on update rate and resolution as well as on computing power and energy consumption. Our analysis identified numerous state-of-the-art sensors, some of which are able to "look through" clothing or cosmetic covers. Five control categories were identified along which "next generation prostheses" could be extended. There is a clear trend towards concepts that predict upcoming objects or terrain using all types of distance- and depth-based sensors. Other advanced strategies, such as bilateral gait segmentation from unilateral sensors, could also play an important role in movement-dependent control applications. The reviewed studies demonstrated promising accuracy in well-controlled laboratory settings, but it is unclear how the systems will perform in real-world environments, both indoors and outdoors. At the moment, the main limitation proves to be the necessity of an unobstructed field of view.
Keywords: Artificial limb; Contralateral; Environment; Locomotion mode estimation; Prosthesis control; Systematic review; Terrain
Year: 2020 PMID: 32680530 PMCID: PMC7368691 DOI: 10.1186/s12984-020-00726-x
Source DB: PubMed Journal: J Neuroeng Rehabil ISSN: 1743-0003 Impact factor: 4.262
Fig. 1 Control framework. Dynamics between a prosthetic device, a user, and their environment. The hierarchical controller estimates the patient's intent at the high level, translates it into device states at the mid level, and finally executes these commands at the low level. Environmental awareness is achieved by observing the user (implicit environmental sensing, IES) or the environment (explicit environmental sensing, EES). Adapted from Tucker et al. 2015 [6]
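To make the three layers of Fig. 1 concrete, the following minimal Python sketch mirrors the hierarchy. All names, thresholds, and the stiffness rule are illustrative assumptions for this record, not taken from Tucker et al. [6] or from any reviewed device.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Mode(Enum):
    """Hypothetical high-level locomotion modes."""
    LEVEL_WALK = auto()
    STAIR_ASCENT = auto()

@dataclass
class SensorFrame:
    """One synchronized reading: user-worn (IES) and environment-facing (EES) signals."""
    shank_pitch_deg: float   # IES: user kinematics
    ground_range_m: float    # EES: distance to the terrain ahead

def high_level_intent(frame: SensorFrame) -> Mode:
    """High level: estimate the patient's intent (a real system would classify here)."""
    # Illustrative threshold only, not from the reviewed studies.
    return Mode.STAIR_ASCENT if frame.ground_range_m < 1.2 else Mode.LEVEL_WALK

def mid_level_state(mode: Mode) -> dict:
    """Mid level: translate the estimated intent into a device state."""
    return {"knee_stiffness": 1.4 if mode is Mode.STAIR_ASCENT else 1.0}

def low_level_execute(state: dict) -> None:
    """Low level: track the commanded state in the actuator loop (placeholder)."""
    print(f"knee stiffness -> {state['knee_stiffness']:.1f}")

low_level_execute(mid_level_state(high_level_intent(SensorFrame(5.0, 1.5))))
```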
Fig. 2 Search process. Flow diagram of the database search and paper selection based on inclusion and exclusion criteria throughout the different phases of the literature review process
Overview of records reviewed
| Study | Sensing / control category | Sensors | Placement | Key contribution |
| --- | --- | --- | --- | --- |
| Vallery et al. | IES / 1 | 2 x angle & angular velocity sensors | | Mapping function for control of a knee prototype with estimated contralateral limb motion data. |
| Bernal-Torres et al. | IES / 1 | 1 x IMU | | Active biomimetic polycentric knee prototype with contralateral echo-control strategy. |
| Su et al. | IES / 1 | 3 x IMUs | Ankle | Intent recognition system based on convolutional neural network classification. |
| CYBERLEGs project series¹ | IES / 1 | 2 x pressure insoles, 7 x IMUs | Feet & trunk | Finite-state control of a powered ankle-knee coupled prototype using whole-body-aware, noninvasive, distributed wireless sensor control. |
| Hu et al. | IES / 2 | 4 x IMUs, 4 x goniometers, 14 x EMGs | | Classification error reduction through fusion of bilateral lower-limb neuromechanical signals, providing feasibility & benchmark datasets. |
| Krausz et al. (extends Hu et al.) | EES / 2 | 1 x IMU, 1 x depth camera | On the waist in a belt construction | Adding vision features to the prior concept, improving the classification. |
| Hu et al. | IES / 3 | 1 x IMU, 1 x depth camera | | Bilateral gait segmentation from ipsilateral depth sensor with the contralateral leg in field of view. |
| Zhang et al. | IES / 3 | 1 x depth camera | On the waist with tilt angle | Depth signal from legs as input to an oscillator-based gait phase estimator. |
| Scandaroli et al. | EES / 4 | 2 x gyroscopes, 4 x infrared sensors | Built into a foot prototype | Infrared distance sensor setup for estimation of foot orientation with respect to the ground. |
| Ishikawa et al. | EES / 4 | 2 x infrared sensors, 1 x IMU | Left & right on one normal shoe | Infrared distance sensor setup for estimation of foot clearance with respect to the ground. |
| Kleiner et al. | EES / 5 | 1 x motion tracking, 1 x laser scanner | Ankle & knee joint | Concept and prototype of a foresighted control system using a 2D laser scanner. |
| Huang's group² | EES / 5 | 1 x IMU, 1 x laser sensor | On the trunk | Terrain recognition based on laser distance, motion estimation and geometric constraints. |
| Carvalho et al. | EES / 5 | 1 x laser sensor | On the waist with 45° tilt angle | Terrain recognition based on laser distance information and geometric constraints. |
| Sahoo et al. | EES / 5 | 3-4 x range sensors, 1 x force resistor | On the heel of the foot | Array of distance sensors for geometry-based obstacle recognition in front of the user. |
| Varol et al. and Massalin et al. | EES / 5 | 1 x depth camera | | Intent recognition framework using a single depth camera and a cubic kernel support vector machine for real-time classification. |
| Laschowski et al. | EES / 5 | 1 x color camera | Wearable, chest-mounted | Terrain identification based on color images and deep convolutional network classification. |
| Yan et al. | EES / 5 | 1 x depth camera | On the trunk at 1.06 m height | Locomotion mode estimation based on depth feature extraction and finite-state classification. |
| Diaz et al. | EES / 5 | 1 x IMU, 1 x color camera | | Terrain context identification and inclination estimation based on color image classification. |
| Krausz et al. | EES / 5 | 1 x depth camera, 1 x accelerometer | Fixed at 1.5 m height with -50° tilt angle | Stair segmentation strategy from depth sensing information of the environment. |
| Kleiner et al. | EES / 5 | 1 x IMU, 1 x radar sensor | | Stair detection algorithm through fusion of motion trajectory and radar distance data. |
| Zhang et al. | EES / 5 | 1 x IMU, 1 x depth camera | | Environmental feature extraction based on neural network depth scene classification. |
¹ Publications through CYBERLEGs: Ambrozic et al. [15, 16], Gorsic et al. [17] and through CYBERLEGs++: Parri et al. [18]
² Huang's research group: F. Zhang et al. [30], X. Zhang et al. [31], Wang et al. [32] and Liu et al. [33]
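Several of the EES / 5 entries above (e.g. Carvalho et al.'s waist-mounted laser with a 45° tilt angle) recover terrain geometry from a single range measurement plus geometric constraints. A minimal sketch of that geometry, assuming a known mounting height and tilt angle; the numeric values below are illustrative, not taken from the reviewed papers.

```python
import math

def terrain_height_ahead(sensor_height_m: float, tilt_deg: float, range_m: float) -> tuple:
    """Return (horizontal distance ahead, terrain elevation relative to current ground).

    Geometry: the sensor sits sensor_height_m above the ground, tilted tilt_deg
    below the horizontal; range_m is the measured distance along the beam.
    """
    tilt = math.radians(tilt_deg)
    ahead = range_m * math.cos(tilt)                        # horizontal reach of the beam
    elevation = sensor_height_m - range_m * math.sin(tilt)  # > 0: rise, < 0: drop
    return ahead, elevation

# On level ground, a waist-mounted sensor at 1.0 m with a 45° tilt should read
# about 1.0 / sin(45°) = 1.41 m; a shorter range implies rising terrain ahead.
print(terrain_height_ahead(1.0, 45.0, 1.41))  # ~ (1.00, 0.00): level ground
print(terrain_height_ahead(1.0, 45.0, 1.20))  # ~ (0.85, +0.15): step up ahead
```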
Fig. 3 Sensor comparison. The different sensors used within the retrieved publications were divided into three categories: Distance & depth, Kinematic, and Other. Update rate describes the number of measurements per second; the rating scale (low), (medium) and (high) is used instead of absolute values, representing a scale from approximately 10 Hz up to 100 Hz for real-time applications. Resolution is the smallest change that can still be detected by a sensor; the same rating scale represents a scale from several centimeters down to the millimeter range. Unobstructed field of view indicates whether the sensor functionality requires an unobstructed field of view (yes/no); if it is not applicable, this is indicated by (n/a)
Fig. 4 Control landscape. Control strategy landscape overview based on the required resolution and update rate of the underlying sensor modality
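Read together, Figs. 3 and 4 suggest a simple matching rule: a sensor modality can serve a control strategy only if its update rate and resolution meet the strategy's requirements. A hedged sketch of that rule using the ordinal (low/medium/high) scale from Fig. 3; the example ratings below are placeholders, the actual per-sensor and per-strategy ratings are given in the figures.

```python
from enum import IntEnum

class Rating(IntEnum):
    """Ordinal scale from Fig. 3: roughly 10-100 Hz (update rate), cm-mm (resolution)."""
    LOW = 1
    MEDIUM = 2
    HIGH = 3

def sensor_fits_strategy(sensor: dict, strategy: dict) -> bool:
    """A sensor supports a strategy if it meets or exceeds both requirements."""
    return (sensor["update_rate"] >= strategy["update_rate"]
            and sensor["resolution"] >= strategy["resolution"])

# Placeholder ratings for illustration only.
depth_camera = {"update_rate": Rating.MEDIUM, "resolution": Rating.HIGH}
gait_phase_estimation = {"update_rate": Rating.HIGH, "resolution": Rating.MEDIUM}

print(sensor_fits_strategy(depth_camera, gait_phase_estimation))  # False: rate too low
```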