Aliaksei L. Petsiuk, Joshua M. Pearce.
Abstract
Nineteen million Americans have significant vision loss. Over 70% of these are not employed full-time, and more than a quarter live below the poverty line. Globally, there are 36 million blind people, but fewer than half use white canes or more costly commercial sensory substitutions. The quality of life of visually impaired people is hampered by the resultant lack of independence. To help alleviate these challenges, this study reports on the development of a low-cost, open-source ultrasound-based navigational support system in the form of a wearable bracelet that allows people with lost vision to navigate, orient themselves in their surroundings, and avoid obstacles when moving. The system can be largely made with digitally distributed manufacturing using low-cost 3-D printing/milling. It conveys point-distance information using a natural active sensing approach and modulates the measurements into haptic feedback with various vibration patterns within a four-meter range. It does not require complex calibration or training, consists of a small number of readily available and inexpensive components, and can be used as an independent addition to traditional tools. Sighted blindfolded participants successfully demonstrated the device on nine primary everyday navigation and guidance tasks, including indoor and outdoor navigation and avoiding collisions with other pedestrians.
Keywords: 3-D printing; additive manufacturing; assistive devices; blind; obstacle avoidance; sensors; sensory substitution; ultrasonic sensing; ultrasound sensing; visually impaired
Year: 2019 PMID: 31480451 PMCID: PMC6749373 DOI: 10.3390/s19173783
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Figure 1 Parts of an open-source navigational support with 3-D printable case components: (a) 3-D prototype; (b) Components; (c) Model 1 with one vibration motor; (d) Model 2 with two vibration motors; (e) Locking rings; (f) Case; (g) Vibration pad; (h) Sensor core; (i) Back cap; (j) Bracelet; (k) Assembly.
Bill of materials for the open-source ultrasound-based navigational support.
| Component | Quantity | Cost, USD |
|---|---|---|
| 3-D printed case | 1 | 0.65 |
| 3-D printed back cap | 1 | 0.25 |
| 3-D printed bracelet | 1 | 0.40 |
| 3-D printed vibration motor pad | 1 | 0.05 |
| 3-D printed locking rings | 2 | 0.05 |
| Arduino Nano | 1 | 3.80 |
| Ultrasonic Sensor HC-SR04 | 1 | 1.83 |
| Flat 10 mm 3 V vibration motor | 1 | 1.40 |
| 400 mAh lithium polymer battery | 1 | 7.49 |
| Micro USB 5 V 1 A 18650 TP4056 lithium battery charger | 1 | 1.20 |
| DC-DC 5 V boost step-up module (optional) | 1 | 5.99 |
| Slide switch | 1 | 0.40 |
| 0.25 W 1 kΩ resistor | 2 | <0.01 |
| Ceramic 0.1 µF capacitor | 1 | 0.07 |
| 1N4007 diode | 1 | 0.08 |
| 2N2222 transistor | 1 | 0.07 |
| 5 mm LED | 1 | 0.07 |
Figure 2 Electrical circuit.
Figure 3 The ultrasonic sensor operating principles: (a) The principal distances (not to scale); (b) Calibration of the optimal duty cycle equation for the distance range of 35 cm to 150 cm, where (c) MDC = 127 + 127·tanh(−(D − 70)/35); (d) MDC = 127 + 127·tanh(−(D − 150)/35); (e) MDC = 296 − 1.5·D; (f) MDC = 335 − 1.3·D; (g) MDC = −77 + 2.2·D; (h) MDC = −48 + 1.2·D.
Figure 4 Calibration procedure of the duty cycle modulation based on the hyperbolic tangent function (4): (a) Hand swinging; (b) Wall following; (c) Obstacle detection; and (d) Curb tracking.
Figure 5 Human haptic sensitivity and vibration motor characteristics: (a) Psychophysically determined thresholds for detection of different frequencies of vibrotactile stimulation (adapted from [82]); (b) Vibration motor performance (adapted from [70]).
Figure 6 Testing procedure: (a) Walk along the corridor with an unknown obstacle; (b) Bypass several corners indoors; (c) Navigate the staircase; (d) Wall following; (e) Detect an open door; (f) Detect an obstacle on the street; (g) Bypass an obstacle on the street; (h) Avoid collisions with pedestrians; (i) Interact with known objects.
Results of the experiments. For each test (a)–(i), the two marks give the outcome with Device Model 1 / Device Model 2.

| Participant | a | b | c | d | e | f | g | h | i |
|---|---|---|---|---|---|---|---|---|---|
| 1 | •/‒ | •/• | •/• | •/• | •/• | •/• | •/• | •/• | •/• |
| 2 | •/• | •/• | •/• | •/• | •/• | •/• | •/• | •/• | •/• |
| 3 | •/• | •/• | •/• | •/• | •/• | •/• | •/• | •/• | •/‒ |
| 4 | •/• | •/• | ‒/‒ | •/• | •/• | •/• | •/‒ | •/• | •/• |
| 5 | •/• | •/• | •/• | •/• | •/• | •/• | •/‒ | •/• | •/• |

• Success, ‒ Failure.