Makrem Mrabet, Yassine Rabhi, Farhat Fnaiech.
Abstract
Despite the diversity of electric wheelchairs, many people with physical limitations and many seniors have difficulty using the standard joystick, so they cannot meet their mobility needs or travel safely. Recent assistive technologies can help give them autonomy and independence. This work deals with the real-time implementation of an artificial intelligence device to overcome these problems. Following a review of previous work, we present the methodology and process for implementing our intelligent control system on an electric wheelchair. The system is based on a neural algorithm that overcomes problems with standard joystick maneuvers, such as the inability to move correctly in one direction; this, however, requires an appropriate methodology for mapping the position of the joystick handle. Experiments on a real wheelchair were carried out with real patients of the Mohamed Kassab National Institute of Orthopedic, Physical and Functional Rehabilitation Hospital of Tunis. The proposed intelligent system gives good results compared with the standard joystick.
Year: 2018 PMID: 29765462 PMCID: PMC5885488 DOI: 10.1155/2018/2063628
Source DB: PubMed Journal: Appl Bionics Biomech ISSN: 1176-2322 Impact factor: 1.781
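The abstract describes correcting the raw joystick position with a trained neural model before it reaches the wheelchair controller. A minimal, hypothetical sketch of that loop is shown below; the helper names (read_joystick, write_outputs) and the identity stand-in for the trained corrector are assumptions, not the authors' code.

```python
# Minimal sketch of the correction loop implied by the abstract: read the raw
# handle position, map it through the trained neural corrector, and emit the
# two analog command voltages. read_joystick/write_outputs and the identity
# stand-in for the trained network are illustrative assumptions.
import numpy as np

def read_joystick() -> np.ndarray:
    """Placeholder: return the raw handle position (x, y), each in [-1, 1]."""
    return np.array([0.3, 0.8])

def write_outputs(v1: float, v2: float) -> None:
    """Placeholder: send the two analog command voltages to the power module."""
    print(f"Output1 = {v1:.2f} V, Output2 = {v2:.2f} V")

def neural_corrector(p: np.ndarray) -> np.ndarray:
    """Stand-in for the trained network mapping raw to intended positions."""
    return p  # identity here; a real corrector would compensate tremor/offsets

raw = read_joystick()
intended = np.clip(neural_corrector(raw), -1.0, 1.0)
# Scale to the controller's 1.1 V-3.9 V range centred on the 2.5 V rest point.
write_outputs(2.5 + 1.4 * intended[0], 2.5 + 1.4 * intended[1])
```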
Figure 1. The posture of a dystonia patient's hand.
Figure 2. The proposed system structure.
Intelligent wheelchair voltage ranges.
| Command | Output 1 | Output 2 |
|---|---|---|
| Stop | 2.5 V | 2.5 V |
| Forward | 2.5 V | 2.5 V~3.9 V |
| Backward | 2.5 V | 1.1 V~2.5 V |
| Turn right | 2.5 V~3.9 V | 2.5 V |
| Turn left | 1.1 V~2.5 V | 2.5 V |
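For reference, a small lookup that reproduces the voltage pairs in the table above could look like the following sketch; the function name and the magnitude scaling are illustrative assumptions rather than the authors' firmware.

```python
# Lookup mirroring the voltage table: Output 1 steers (left/right) and Output 2
# drives (forward/backward), both resting at 2.5 V. command_to_voltages and the
# magnitude argument are illustrative, not from the paper.

def command_to_voltages(command: str, magnitude: float = 1.0) -> tuple[float, float]:
    """Return (Output 1, Output 2) in volts for a direction command.

    magnitude in [0, 1] scales the deflection within the 1.1 V-3.9 V span.
    """
    rest = 2.5
    span = 1.4 * max(0.0, min(1.0, magnitude))   # 2.5 V +/- 1.4 V => 1.1 V..3.9 V
    table = {
        "stop":     (rest,        rest),
        "forward":  (rest,        rest + span),
        "backward": (rest,        rest - span),
        "right":    (rest + span, rest),
        "left":     (rest - span, rest),
    }
    return table[command.lower()]

print(command_to_voltages("forward", 0.5))   # (2.5, 3.2)
```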
Figure 3. The desired positions of the joystick.
Figure 4. Synoptic diagram of the smart joystick.
Figure 5. Learning phase of the intelligent joystick.
Characteristics of the proposed intelligent wheelchair.
| Wheelchair parameters | | Processing device (Raspberry Pi 2 Model B) | |
|---|---|---|---|
| Height | 89 cm | Price | $39.99 |
| Width | 61 cm | Chip | Broadcom BCM2836 |
| Frame weight with batteries | 58 kg | Processor | ARMv7 quad-core |
| Load capacity | 110 kg | Processor speed | 900 MHz |
| Linear velocity | 8 km/h | Voltage and power draw | 650 mA @ 5 V |
| Ø front wheels | 20 cm | GPU | Dual Core VideoCore IV Multimedia Co-Processor |
| Ø rear wheels | 30 cm | Size | 85 × 56 mm |
| Stopping distance | 1 m | Memory | 1 GB SDRAM @ 400 MHz |
| Noise | 65 dBA | GPIO | 40 |
| Range per charge | 20 km | USB 2.0 ports | 4 |
| Battery | 2 × 12 V, 28 Ah | Ethernet | 10/100 Mbps Ethernet (RJ45 jack) |
| Engine power | 2 × 220 W 24 V | Audio | Multichannel HD audio over HDMI, analog stereo from 3.5 mm headphone jack |
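The joystick signals are digitised by a data acquisition board before reaching the Raspberry Pi. As a purely illustrative stand-in, the sketch below samples two joystick channels through an MCP3008 ADC over SPI; the chip, wiring, and channel assignment are assumptions, not the paper's hardware.

```python
# Hypothetical sketch of sampling the two joystick channels on the Raspberry Pi,
# assuming an MCP3008 ADC on SPI bus 0 (wiring and channel numbers are assumed).
import spidev

spi = spidev.SpiDev()
spi.open(0, 0)                 # SPI bus 0, chip-select 0
spi.max_speed_hz = 1_350_000

def read_channel(channel: int) -> float:
    """Read one MCP3008 channel (0-7) and convert to volts (3.3 V reference)."""
    raw = spi.xfer2([1, (8 + channel) << 4, 0])
    value = ((raw[1] & 3) << 8) | raw[2]       # 10-bit conversion result
    return value * 3.3 / 1023.0

x_volts = read_channel(0)      # joystick X axis (assumed wiring)
y_volts = read_channel(1)      # joystick Y axis (assumed wiring)
print(f"x = {x_volts:.2f} V, y = {y_volts:.2f} V")
```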
Patient characteristics.
| Characteristics | Patient I | Patient II | Patient III |
|---|---|---|---|
| Sex, age (years), mass (kg) | Male, 15, 54 | Male, 16, 45 | Male, 66, 75 |
| Diagnosis | Posttraumatic tetraplegia | Cerebral palsy | Cervical myelopathy |
| Motor disability and clinical symptoms | Spastic tetraplegia C5 | Dyskinetic tetraplegia | Spastic tetraplegia |
| Handedness | Left-handed | Right-handed | Left-handed |
| Functional level | FIM: 82 | GMFCS: V | FIM: 86 |
Figure 6. Real participants in the trial phase.
Figure 7. The training trajectory is divided into three parts: driving in a straight line, right turn, and left turn.
Structure and training results for the neural network models.
| Patient | Net structure | Activation function | Momentum | Test MSE | Iterations | Training time |
|---|---|---|---|---|---|---|
| Patient I | 2-16-16-2 | Hyperbolic-tangent sigmoid | 0.5 | 0.00816 | 1437 | 0:05:51 |
| Patient II | 2-14-14-2 | Hyperbolic-tangent sigmoid | 0.5 | 0.00898 | 1511 | 0:05:17 |
| Patient III | 2-17-17-2 | Hyperbolic-tangent sigmoid | 0.5 | 0.009238 | 1848 | 0:08:53 |
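A rough reconstruction of the Patient I model in the table (2-16-16-2 structure, hyperbolic-tangent activations, momentum 0.5) might look like the following scikit-learn sketch; the synthetic data, learning rate, and stopping criteria are assumptions, and the authors' actual training pipeline may differ.

```python
# Sketch of a 2-16-16-2 corrector trained with SGD + momentum, as a stand-in for
# the setup reported in the table. The synthetic dataset is purely illustrative.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
raw = rng.uniform(-1.0, 1.0, size=(2000, 2))    # recorded handle positions (x, y)
intended = np.tanh(1.5 * raw)                   # placeholder "desired" positions

net = MLPRegressor(hidden_layer_sizes=(16, 16), # 2-16-16-2 structure
                   activation="tanh",           # hyperbolic-tangent sigmoid
                   solver="sgd", momentum=0.5,  # momentum as in the table
                   learning_rate_init=0.01,
                   max_iter=2000, tol=1e-6)
net.fit(raw, intended)

mse = np.mean((net.predict(raw) - intended) ** 2)
print(f"training MSE: {mse:.5f}")
```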
Figure 8. Data recorded during the first patient's data collection phase, with the recurrent neural network corrector, in polar coordinates.
Figure 9. Trajectories driven by the patients in standard driving mode during the experimental test runs.
Figure 10. Trajectories driven by the first test patient in assisted driving mode during the experimental test runs.
Figure 11. Trajectories driven by the test patients in assisted driving mode during the experimental test runs.
Figure 12. Recorded data from the first patient during the manoeuvre in test number 5.
Performance indices from the users' paths.
| | Path length (m) | Time (s) | Mean velocity (m/s) | NC | TC (s) | NR | HD |
|---|---|---|---|---|---|---|---|
| Path made by an expert | 15.8983 | 71.39407 | 0.2227 | 0 | 0 | 18 | 0.6737 |
| Disabled path (patient I) | 38.0598 | 249.3485 | 0.1526 | 5 | 5.8832 | 11 | 1.4203 |
| Intelligent joystick path (patient I) | 18.1164 | 102.6936 | 0.1764 | 0 | 0 | 15 | 0.9215 |
| Disabled path (patient II) | 25.2091 | 197.7362 | 0.1275 | 3 | 4.5483 | 8 | 1.6728 |
| Intelligent joystick path (patient II) | 16.4844 | 111.8486 | 0.1474 | 0 | 0 | 14 | 0.8633 |
| Disabled path (patient III) | 36.2845 | 225.3885 | 0.1610 | 5 | 5.9554 | 8 | 1.9099 |
| Intelligent joystick path (patient III) | 21.1313 | 128.2292 | 0.1648 | 1 | 2.6652 | 13 | 1.0678 |
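The first three indices in the table (and, as the numbers confirm, mean velocity = path length / time) can be computed from a logged trajectory as sketched below; the (t, x, y) log format is an assumption, and the NC, TC, NR, and HD indices are not reproduced here.

```python
# Sketch: derive path length, elapsed time, and mean velocity from a sampled
# trajectory. The (t, x, y) arrays are an assumed log format.
import numpy as np

def path_indices(t, x, y):
    """Return (path length in m, elapsed time in s, mean velocity in m/s)."""
    steps = np.hypot(np.diff(x), np.diff(y))   # segment lengths between samples
    length = float(steps.sum())
    duration = float(t[-1] - t[0])
    return length, duration, length / duration

# Example with a synthetic straight-line run
t = np.linspace(0.0, 10.0, 101)
x = np.linspace(0.0, 5.0, 101)
y = np.zeros_like(x)
print(path_indices(t, x, y))                   # ~(5.0, 10.0, 0.5)
```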
Figure 13. Comparison between the trajectories of the test patients with and without the proposed intelligent joystick: (a) virtual environment; (b) trajectories of the first user; (c) trajectories of the second user; (d) trajectories of the third user.
Figure 14. Intelligent wheelchair controls in real environments.
Performance indices for assessing a simulated wheelchair driven by the proposed joystick.
| User | Collisions | Incomplete assignments | Path length (m) | Time (s) | Mean velocity (m/s) | Static time (s) | Total time (s) |
|---|---|---|---|---|---|---|---|
| Expert | 0 | 0 | 178 | 143 | 1.24 | 0 | 143 |
| Patient I (PI) | 0 | 0 | 183.1 | 158 | 1.158 | 9 | 167 |
| Patient II (PII) | 0 | 0 | 183.8 | 159 | 1.155 | 7 | 166 |
| Patient III (PIII) | 1 | 1 | 186 | 169 | 1.100 | 14 | 183 |
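Similarly, the static-time and total-time columns could be separated from a sampled velocity log as in the sketch below; the stillness threshold and the uniform sampling period are assumptions for illustration.

```python
# Sketch: split a velocity log into static and moving portions. The 0.05 m/s
# stillness threshold and 0.1 s sampling period are assumed values.
import numpy as np

def time_indices(velocity, dt, still_threshold=0.05):
    """Return (static time, moving time, total time) in seconds."""
    static = float(np.sum(velocity < still_threshold) * dt)
    total = float(len(velocity) * dt)
    return static, total - static, total

# Example: 10 s standing still, then 150 s at ~1.2 m/s
v = np.concatenate([np.zeros(100), np.full(1500, 1.2)])
print(time_indices(v, dt=0.1))     # (10.0, 150.0, 160.0)
```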
Wheelchair controls in the literature.
| | | Wheelchair | Feature | Device | Commands |
|---|---|---|---|---|---|
| Intrusive interfaces | | Chen et al. [ | Head orientation | Tilt sensors, microprocessor | Go, back, left, and right |
| | | SIAMO project [ | Eye gaze | Electrode | Go, back, left, and right |
| | | Wheelesley [ | Eye gaze | Electrodes (EOG) | Go, back, left, and right |
| Nonintrusive interfaces | Voice | SIAMO project [ | Voice | Microphone | Go, back, left, and right |
| | | ROB Chair [ | Voice | Head microphone | Go, stop, speed up, speed down, and rotate |
| | | NAVChair [ | Voice | Computer | Go, stop, back, left, and right |
| | | TAO project [ | Voice | Microphone | Go, stop, back, left, right, and speed down |
| | Vision | Yoshida et al. [ | Face | Two video cameras | Go, stop, left, and right |
| | | HGI [ | Head & nose | Webcam, data acquisition board | Go, left, right, speed up, and speed down |
| | | SIAMO [ | Head | CCD camera | Go, left, right, speed up, and speed down |
| | | Rabhi et al. [ | Hand | Webcam, data acquisition board | Analog commands |
| Proposed smart joystick | | | Joystick | Data acquisition board | Analog commands |