Daniel Ayo Oladele1, Elisha Didam Markus1, Adnan M Abu-Mahfouz2.
Abstract
BACKGROUND: With the projected upsurge in the percentage of people with some form of disability, there has been a significant increase in the need for assistive mobility devices. However, for mobility aids to be effective, such devices should be adapted to the user's needs. This can be achieved by improving the confidence of the acquired information (interaction between the user, the environment, and the device) following design specifications. Therefore, there is a need for a literature review on the adaptability of assistive mobility devices.
Keywords: adaptability; assistive mobility devices; internet of medical things framework; internet of things; mobile phone; mobility aids; multisensor fusion; user system interface
Year: 2021 PMID: 34779786 PMCID: PMC8663709 DOI: 10.2196/29610
Source DB: PubMed Journal: JMIR Rehabil Assist Technol ISSN: 2369-2529
Figure 1. Flow diagram of search results. IoT: internet of things.
Summary of the Society of Automotive Engineers (SAE) automation levels.
| SAE level | DDTa: vehicle controls | DDTa: environment monitoring (OEDRc) | Driving supervision (DDTa fallback) | Scenarios (ODDb) |
| 0: no driver automation | Driver | Driver | Driver | N/Ad |
| 1: driver assistant | Driver | Driver | Driver | Limited |
| 2: partial driving automation | Driver and vehicle | Driver | Driver | Limited |
| 3: conditional driving automation | Vehicle | Vehicle | Driver and vehicle | Limited |
| 4: high driving automation | Vehicle | Vehicle | Vehicle | Limited |
| 5: full driving automation | Vehicle | Vehicle | Vehicle | Unlimited |
aDDT: dynamic driving task.
bODD: operational design domain.
cOEDR: object and event detection and response.
dN/A: not applicable.
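The responsibility split in the SAE table can be captured as a small lookup, for example to query which agent performs the DDT fallback at a given automation level. This is an illustrative sketch; the dictionary and function names are not from the source.

```python
# Encoding of the SAE automation-level table above. Each level maps to
# (vehicle controls, environment monitoring/OEDR, DDT fallback, ODD).
# All names are illustrative, not part of the SAE standard's API.
SAE_LEVELS = {
    0: ("driver", "driver", "driver", "n/a"),
    1: ("driver", "driver", "driver", "limited"),
    2: ("driver+vehicle", "driver", "driver", "limited"),
    3: ("vehicle", "vehicle", "driver+vehicle", "limited"),
    4: ("vehicle", "vehicle", "vehicle", "limited"),
    5: ("vehicle", "vehicle", "vehicle", "unlimited"),
}

def fallback_responsible(level: int) -> str:
    """Return who performs the DDT fallback at this automation level."""
    return SAE_LEVELS[level][2]
```

Such a lookup makes the key distinction of the table explicit: only at levels 4 and 5 does the vehicle alone handle the fallback.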
Figure 2. Summary of an autonomous system.
Brain-computer interface (BCI) technologies for adaptive assistive mobility devices.
| Brain signals and auxiliary sensors | Classifier for feature extraction | Output command | Contributions | Drawbacks |
| P300 (laser scanner) | Stepwise linear discriminant analysis | A predefined set of locations and stops | High accuracy, no training required, and autonomous navigation after successful selection | Low information transfer rate, predefined paths, limited testing scenarios, and possible fatigue after a long focus period of the eye on the target stimulus |
| P300 (odometer, barcode scanner, and a proximity sensor) | Support vector machine | A predefined set of locations and stops | Same as Rebsamen et al | Same as Rebsamen et al |
| MIa-based mu rhythm and the P300 | One versus the rest common spatial patterns transformation matrix | Left, right, accelerate, and decelerate | Improved performance | Limited testing scenarios and possible fatigue after a long focus period of the eye on the target stimulus |
| MI-based BCI (10 sonar sensors and 2 webcams) | Gaussian classifier | Left, right, and keep moving forward | Spontaneous and shared control | Limited testing scenarios, requires extensive training, and limited classes (typically three) |
| Steady-state visual evoked potentials (camera and adaptive fuzzy controller) | Frequency recognition algorithm based on the multivariable synchronization index | Left, right, upwards, and downwards | Teleoperation control of an exoskeleton using a brain-machine interface | Possible fatigue after a long focus period of the eye on a target stimulus and a significant reduction in recognition accuracy for inexperienced subjects |
| Steady-state somatosensory evoked potential | Regularized linear discriminant analysis | Turn left, turn right, and move forward | Spontaneous, first of its kind, and addresses the possible fatigue after a long focus period of the eye on a target stimulus | Only healthy subjects were used, with limited testing scenarios (two) |
aMI: motor imagery.
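Two of the BCI systems above rely on variants of linear discriminant analysis to separate target from non-target P300 epochs. The following is a minimal Fisher LDA on synthetic two-class "feature" data, intended only to illustrate the classifier family; the data, dimensions, and names are invented, not taken from any cited system.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 2D feature vectors standing in for EEG epoch features:
# target (P300 present) vs. non-target epochs. Real systems use
# many more features extracted from multi-channel EEG.
target = rng.normal(loc=[2.0, 2.0], scale=0.5, size=(50, 2))
nontarget = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(50, 2))

# Fisher LDA: project onto w = Sw^-1 (mu_target - mu_nontarget).
mu_t, mu_n = target.mean(axis=0), nontarget.mean(axis=0)
Sw = np.cov(target.T) + np.cov(nontarget.T)  # within-class scatter
w = np.linalg.solve(Sw, mu_t - mu_n)
threshold = w @ (mu_t + mu_n) / 2            # midpoint decision rule

def is_target(x):
    """Classify a feature vector as a target (P300) epoch."""
    return w @ x > threshold

accuracy = (sum(is_target(x) for x in target)
            + sum(not is_target(x) for x in nontarget)) / 100
```

On well-separated synthetic classes like these, the projection cleanly splits the two clusters; the stepwise and regularized variants in the table differ mainly in how features are selected and how `Sw` is conditioned.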
Computer vision (CV) interface technologies for adaptive assistive mobility devices.
| Technology name (type): additional sensors | Machine learning tools | Contributions | Drawbacks |
| See ColOr (CV, auditory, and haptic): 3D Kinect, iPad, and Bone-Phones | Multilayer artificial neural network | A framework for coupling optical sensors in the context of range and color image registration, and the development of a sonic code that maps colors and depth into musical instruments | Extensive training was required, and testing was limited to certain scenarios. |
| Wearable mobility aid for patients with visual impairments (visual, auditory, and haptic): RGBDa camera, vibrotactile glove, and bone-conductive headsets | Stereo vision and semiglobal matching algorithms; detection: random sample consensus algorithm and Kalman filter; categorization: convolutional neural network | Improves on a preliminary prototype by Mattoccia | Patient feedback from the Mattoccia prototype |
| Visual servoing-controlled wheelchair (vision): 1 camera for corridor following and 2 cameras for ADPb | Classic Gaussian sphere projection framework, door detection and tracking framework, and a 2D edge tracker inspired by the moving edge algorithm | Securely addresses the autonomous stabilization of the wheelchair's position along corridors, and detects and passes through doorways using visual data | Human input in the control was not considered. |
| iChair (vision, auditory, and haptic): high-definition camera, 3D scanner, 10 LEDsc, touch screen and voice recognition app, and head mouse | Light communication algorithm, collision avoidance algorithm, and an emergency and stress detection algorithm | A multimodal-input smart wheelchair that identifies and classifies objects, builds 3D maps, and eventually facilitates autonomous navigation | A bug-free human trial has not yet been documented. |
| CV for patients with visual impairment (vision, auditory, and haptic): a stereo RGBd camera, a depth-of-field camera, and an IMUe | Detection and tracking algorithm, support vector machine classifier, and class-specific extremal regions for text detection | Addresses the pervasiveness requirement and offers sensory substitution via sound feedback to patients with visual impairment | Outdoor performance showed clustering of several objects into a single one and errors in identifying lower parts of objects; no outdoor usability test was documented. |
| Autonomous scooter navigation (vision): MPU-9250 IMU, long-range laser, and stereo vision camera | Graph-based simultaneous localization and mapping algorithm | Cost-effective; addresses the navigation and localization challenges of an unknown environment with a new hybrid far-field and near-field mapping solution | Extensive human testing has not been documented. |
| User-adaptive intelligent robotic walker (vision): laser range finder | Interacting multiple model particle filters with a probabilistic data association framework, Viterbi algorithm (human gait estimation), support vector machine classifier, and unscented Kalman filter | Human state estimation, pathological gait parametrization, and characterization for classifying users at risk of falling | A test evaluating the performance of the control strategy with the robotic mobility assistive device and patients was not documented. |
aRGBD: red green blue and depth.
bADP: autonomous doorway passing.
cLED: light emitting diode.
dRGB: red green blue.
eIMU: inertial measurement unit.
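Several of the vision pipelines above (e.g., the wearable aid's detection stage) pair a detector with a Kalman filter to smooth and track obstacle positions over time. A minimal one-dimensional constant-velocity Kalman filter over noisy range measurements could look like the following; the noise values, time step, and function name are illustrative assumptions, not from any cited system.

```python
import numpy as np

def kalman_track(measurements, dt=0.1, q=0.01, r=0.25):
    """Track obstacle distance (and closing speed) from noisy ranges.

    Constant-velocity model: state = [distance, velocity].
    q: process noise, r: measurement noise variance (assumed values).
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition
    H = np.array([[1.0, 0.0]])              # only distance is measured
    Q = q * np.eye(2)
    x = np.array([measurements[0], 0.0])    # initial state
    P = np.eye(2)                           # initial covariance
    estimates = []
    for z in measurements:
        x = F @ x                           # predict state
        P = F @ P @ F.T + Q                 # predict covariance
        y = z - H @ x                       # innovation
        S = H @ P @ H.T + r                 # innovation covariance
        K = (P @ H.T) / S                   # Kalman gain
        x = x + (K * y).ravel()             # update state
        P = (np.eye(2) - K @ H) @ P         # update covariance
        estimates.append(x[0])
    return estimates
```

In a detection pipeline the update step would be fed by the detector's per-frame range estimate, with the velocity state indicating whether an obstacle is approaching.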
User system interface (USI) technologies for adaptive assistive mobility devices.
| Technology name (type): additional sensors | Machine learning tools | Contributions | Drawbacks |
| TDS-iPhone-PWCa (haptic): magnetic sensors | Sensor signal processing algorithm | An alternative USIb for people with spinal cord injury or upper limb paralysis | Tongue piercing can be painful and uncomfortable for some users, and extensive training is required for calibration. |
| Intelligent smart walker (haptic): force or torque sensor | N/Ac | An intuitive rule-based speed controller for a smart walker | Young, healthy subjects were used, so the results do not represent the walker's typical users. |
| EyeCane (CVId, haptic, and auditory): infrared emitters, auditory frequency actuator, and tactile actuator | N/A | Low-cost, lightweight, small, easy-to-use electronic travel aid for distance estimation and navigational assistance; long battery life (a whole day); intuitive to the user; short training time (<5 minutes) | Only an indoor experiment was conducted. |
| Electronic mobility cane (CVI, haptic, and auditory): liquid detection, 6 ultrasonic sensors, a metal detector, a microvibration motor, and a mono earphone | A novel algorithm | Offers real-time detection of multiple obstacles and simultaneous way-finding assistance to patients with visual impairments through auditory (voice message) and tactile (vibration) feedback | Extensive training time (20 hours); the cognitive and perceptual load has not been ascertained. |
| Jet Propulsion Laboratory BioSleeve (haptic): electromyography and IMUe sensors | A multiclass support vector machine classifier | Intuitive control of robotic platforms by decoding as many as 20 discrete hand and finger gestures | Has not yet been integrated and tested with assistive mobility aids to determine its applicability. |
| Smart cane (haptic): IMU and FSRf sensors | C4.5 decision tree, artificial neural network, support vector machine, and naive Bayes | Monitors and distinguishes between different walk-related activities during gait rehabilitation | Fall and near-fall detection was not considered in its design and implementation. |
| An ARTAg power wheelchair platform (CVI and haptic): haptic controller, laser scanner, SICK laser measurement sensor, and IMU sensor | Gaussian process regression model | Implementation of a shared control policy learned from human-to-human interaction | The efficiency of the learning process depends on the human assistant, who is prone to errors and might miss certain intents of the user. |
| Multiple controlled interfaces smart wheelchair (haptic and auditory): microphone, joystick, Leap Motion, and ultrasonic sensor | An algorithm for the control and execution of commands | Multiple control interfaces | Lack of detail on the performance of each interface and limited testing scenarios |
| MyoSuit (haptic): IMU sensor and two electric motors | N/A | Lightweight, soft wearable robot that aids users with residual mobility during locomotion tasks | Only one participant with incomplete spinal cord injury was tested, so it is difficult to validate performance. |
aTDS-iPhone-PWC: tongue drive system to iPhone electric-powered wheelchair.
bUSI: user system interface.
cN/A: not applicable.
dCVI: computer vision interface.
eIMU: inertial measurement unit.
fFSR: force sensitive resistor.
gARTA: assistive robotic transport for adults.
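The "intuitive rule-based speed controller" attributed to the intelligent smart walker above can be illustrated with a simple mapping from the sensed handle force to a commanded speed. The thresholds, gain, and function name below are illustrative assumptions, not the published controller.

```python
def walker_speed(force_n: float, max_speed: float = 1.0) -> float:
    """Map forward handle force (N) to walker speed (m/s).

    Rule-based sketch: a dead zone rejects sensor noise and light
    touches, then speed grows linearly with force up to a cap.
    All constants are illustrative, not from the cited study.
    """
    DEAD_ZONE = 5.0      # N: ignore forces below this
    FULL_FORCE = 30.0    # N: force that yields maximum speed
    if force_n <= DEAD_ZONE:
        return 0.0
    gain = max_speed / (FULL_FORCE - DEAD_ZONE)
    return min(max_speed, gain * (force_n - DEAD_ZONE))
```

The dead zone is what makes such a controller feel "intuitive": the walker stays still until the user leans in deliberately, then speed follows effort.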
Internet of medical things technologies for adaptive assistive mobility devices.
| Name of framework | Management system or algorithms | Contributions and functions | Drawbacks |
| NBRa system framework | Modified linear quadratic Gaussian algorithm | Distributes control of a mobility device between the patient's side and the physiotherapist's side; brings convenience to patients and therapists | Only simulations and experiments have been conducted. |
| Global concept SEESb framework | Intelligent transportation system | Designed to address the walking and orientation problem | Only one simple experiment has been conducted. |
| SHSc framework | Hybrid sensing network, the IoTd smart gateway, and user interfaces for data visualization and management | Monitors and tracks patients, personnel, and biomedical devices in real time; collects both environmental conditions and patients' physiological parameters and delivers them to a control center | Use-case scenario testing has been conducted only for fall detection with 1 patient. |
| ROSe framework | Navigation, localization, and pick-and-place algorithms | Enables cooperation between the SWCf and RWg, letting the user interact with and control the SWC as well as any object connected to the RW | At present, the whole architecture has been tested only in simulation. |
aNBR: network-based rehabilitation system.
bSEES: Smart Environment Explorer Stick.
cSHS: smart health care system.
dIoT: internet of things.
eROS: robotic operating system.
fSWC: smart wheelchairs.
gRW: robotic workstations.
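Frameworks such as NBR distribute control of the device between the patient and a remote physiotherapist. A common way to express such shared control is a weighted blend of the two command inputs; the function below is a generic illustration of that idea, not the cited modified linear quadratic Gaussian controller.

```python
def blend_commands(patient_cmd, therapist_cmd, alpha):
    """Blend patient and remote therapist velocity commands.

    alpha in [0, 1]: 1.0 gives the patient full authority, 0.0 hands
    control to the therapist. Commands are (v, w) pairs: linear and
    angular velocity. Illustrative only, not the NBR control law.
    """
    if not 0.0 <= alpha <= 1.0:
        raise ValueError("alpha must be in [0, 1]")
    return tuple(alpha * p + (1.0 - alpha) * t
                 for p, t in zip(patient_cmd, therapist_cmd))
```

In practice the authority weight would be set (or adapted) by the therapist's side, which is what makes the framework convenient for remote rehabilitation.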