Wafa Elmannai, Khaled Elleithy.
Abstract
The World Health Organization (WHO) reported that there are 285 million visually-impaired people worldwide. Among these individuals, 39 million are totally blind. Several systems have been designed to support visually-impaired people and to improve the quality of their lives. Unfortunately, most of these systems are limited in their capabilities. In this paper, we present a comparative survey of wearable and portable assistive devices for visually-impaired people in order to show the progress of assistive technology for this group. The contribution of this literature survey is to discuss in detail the most significant devices presented in the literature to assist this population, highlighting their improvements, advantages, disadvantages, and accuracy. Our aim is to address and present most of the issues of these systems to pave the way for other researchers to design devices that ensure safety and independent mobility for visually-impaired people.
Keywords: assistive devices; navigation and orientation systems; obstacles avoidance; obstacles detection; visually-impaired people
Year: 2017 PMID: 28287451 PMCID: PMC5375851 DOI: 10.3390/s17030565
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Figure 1. Classification of electronic devices for visually-impaired people.
The most important features that correspond to the user’s needs.
| Feature | Description |
|---|---|
| Analysis Type | The system needs to provide fast processing of the information exchanged between the user and the sensors. For example, a system that detects an obstacle 2 m in front of the user in 10 s cannot be considered a real-time system [ |
| Coverage | The system needs to provide its services both indoors and outdoors to improve the quality of visually-impaired people’s lives |
| Time | The system should perform as well in the daytime as at night |
| Range | The distance between the user and the object to be detected by the system. The ideal minimum range is 0.5 m, whereas the maximum range should exceed 5 m; a longer range is better |
| Object Type | The system should handle suddenly appearing obstacles, i.e., it should detect dynamic objects as well as static ones |
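The feature requirements above can be expressed as simple checks. The sketch below is illustrative only (the helper names and the assumed 1.4 m/s walking speed are not from the paper); the thresholds are the ones listed in the table:

```python
# Hypothetical checker for the feature thresholds listed in the table above.
# Function names and the walking-speed assumption are illustrative, not from
# the original paper.

def meets_range_requirement(min_range_m: float, max_range_m: float) -> bool:
    """The ideal minimum range is 0.5 m; the maximum should exceed 5 m."""
    return min_range_m <= 0.5 and max_range_m > 5.0

def is_real_time(detection_distance_m: float, response_s: float,
                 walking_speed_mps: float = 1.4) -> bool:
    """A warning is only useful if it arrives before the user reaches the
    obstacle; 1.4 m/s is an assumed average walking speed."""
    return response_s < detection_distance_m / walking_speed_mps

# Example from the table: detecting an obstacle 2 m ahead in 10 s is too slow.
print(is_real_time(2.0, 10.0))            # False
print(meets_range_requirement(0.5, 8.0))  # True
```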
Figure 2. The Smart Cane prototype [19].
Figure 3. The prototype of the eye substitution device [20].
Figure 4. Reflection of a sequence of ultrasonic pulses between the sender and receiver.
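The ultrasonic principle illustrated in Figure 4 reduces to a time-of-flight computation: the echo's round-trip time, multiplied by the speed of sound and halved, gives the obstacle distance. A minimal sketch (values illustrative; not code from any surveyed system):

```python
# Minimal sketch of ultrasonic time-of-flight ranging as in Figure 4:
# distance = (round-trip echo time * speed of sound) / 2.

SPEED_OF_SOUND_MPS = 343.0  # dry air at roughly 20 degrees C (assumption)

def distance_from_echo(round_trip_s: float) -> float:
    """The pulse travels to the obstacle and back, so halve the path."""
    return round_trip_s * SPEED_OF_SOUND_MPS / 2.0

# A 10 ms round trip corresponds to an obstacle about 1.7 m away.
print(distance_from_echo(0.01))
```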
Figure 5. Ranges covered by ultrasonic sensors [20].
Figure 6. An assistive device for blind people based on a map-matching approach and artificial vision [22].
Figure 7. P1 is the result of mapping both commercial Geographical Information System (GIS) and Global Positioning System (GPS) signals; P2 is the result of mapping the GPS signals with the adapted GIS [22].
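The core of the map-matching idea in Figures 6 and 7 is snapping a raw GPS fix to the nearest point on a road segment from the GIS layer. A hedged sketch, assuming planar coordinates for simplicity (real systems operate on geodetic data and many candidate segments):

```python
# Illustrative map-matching primitive: project a GPS point onto a road
# segment, clamped to the segment's endpoints. Planar coordinates assumed.

def snap_to_segment(p, a, b):
    """Return the point on segment a-b closest to p."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    seg_len_sq = dx * dx + dy * dy
    if seg_len_sq == 0.0:          # degenerate segment: a and b coincide
        return a
    # Parameter t in [0, 1] along the segment for the orthogonal projection.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len_sq))
    return (ax + t * dx, ay + t * dy)

# A fix 2 m off a straight east-west road snaps back onto the road.
print(snap_to_segment((1.0, 2.0), (0.0, 0.0), (4.0, 0.0)))  # (1.0, 0.0)
```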
Figure 8. The design of the antenna at (a) the front and (b) the back; the fabricated antenna at (c) the front and (d) the back [30].
Figure 9. Tongue-placed electro-tactile system, with sunglasses carrying an object-detection camera [28]: (a) sunglasses with the object-detection camera; (b) tongue electro-tactile device.
Figure 10. (a) Matrix of electrodes; (b) the eight different directions for the matrix of electrodes [30].
Figure 11. The overall design of the system [30].
Figure 12. Design of the sensor module [31].
Figure 13. Distance of frequency detection on the sidewalk [32].
Figure 14. The prototype of the proposed device [34].
Figure 15. (a) The device’s orientation results in a residential area; (b) the device’s orientation results in a civilian area [34].
Figure 16. The prototype of the grip [35].
Figure 17. The proposed device for enhanced spatial sensitivity [35].
Figure 18. (a) The prototype of the device; (b) the detection process for obstacles from 5 cm to 150 cm [38].
Figure 19. (a) The prototype of the proposed system; (b) calculating the threshold value and the distance of the closest object [39].
Figure 20. The proposed system mounted on the special electronic wheelchair [41].
Figure 21. The process of the detection and recognition algorithm [43].
Figure 22. The proposed system attached to a silicon glove [45].
Figure 23. The prototype of the Path Force Feedback belt design [46].
Figure 24. The detection process of the force feedback belt [46].
Figure 25. (a) The prototype of the EyeRing; (b) the process of the EyeRing device for detection and interaction applications [48].
Figure 26. The prototype of the FingerReader [47].
Figure 27. The process of extraction and detection of a printed text line [47].
Figure 28. The proposed device [51].
Figure 29. The process of extracting text and expanding the text-detection range [51].
Figure 30. The implemented app [53].
Figure 31. The proposed application’s dataflow [53].
Figure 32. The proposed crutch with its detection ranges displayed [54].
Figure 33. Placement of three ultrasonic sensors on the cane [54].
Figure 34. The design of the ultrasonic headset [55].
Figure 35. (a,b) The proposed ultrasonic headset, with the circuit and the solar panels illustrated [55].
Figure 36. The proposed system to be mounted on the head [56].
Figure 37. The accumulation of interval time for forming a visual frame, with the entire system illustrated (event distance is differentiated via colors) [56].
Figure 38. The prototype of the proposed system [60].
Figure 39. The process of the proposed navigation system [60].
Figure 40. The system’s installation inside a room [61].
Figure 41. The proposed architecture [61].
Score and evaluation for each system.
| System | Real Time/Not Real Time | Coverage (Indoor, Outdoor, Both) | Time (Day, Night, Both) | Range (R ≤ 1 m, 1 m < R ≤ 5 m, R > 5 m) | Object Type (Static, Dynamic, Both) | Total Score |
|---|---|---|---|---|---|---|
| Weight of 10 per feature | | | | | | |
| *Smart Cane | 10 | 5 | 5 | 5 | 5 | 62 |
| *Eye Subs | 10 | 5 | 10 | 5 | 5 | 72 |
| *FAV&GPS | 10 | 5 | 5 | - | 10 | 62 |
| *BanknotRec | 10 | 5 | 5 | - | 5 | 52 |
| *TED | 10 | 5 | 10 | - | 5 | 62 |
| *CASBlip | 10 | 10 | 10 | 5 | 5 | 82 |
| *RFIWS | - | 5 | 10 | 5 | 5 | 52 |
| *LowCost Nav | 10 | 5 | 10 | - | 5 | 62 |
| *ELC | 10 | 5 | 10 | 2.5 | 5 | 67 |
| *CG System | 10 | 5 | 5 | 5 | 5 | 62 |
| *UltraCane | 10 | 5 | 10 | 5 | 5 | 72 |
| *Obs Avoid using Thresholding | 10 | 5 | 5 | 5 | 10 | 72 |
| *Obs Avoid using Haptics&Laser | 10 | 5 | 5 | 10 | 5 | 72 |
| *ComVis Sys | 10 | 10 | 5 | 10 | 10 | 92 |
| *Sili Eyes | - | 5 | - | 5 | 5 | 32 |
| *PF belt | - | 5 | - | 2.5 | 10 | 37 |
| *EyeRing | 10 | 10 | 5 | Specific case 10 | 5 | 82 |
| *FingReader | 10 | 10 | 5 | Specific case 10 | 5 | 82 |
| *Nav RGB-D | 10 | 5 | 5 | 5 | 5 | 62 |
| *Mobile Crowd Ass Nav | 10 | 5 | 10 | - | 5 | 62 |
| *DBG Crutch Based MSensors | 10 | 5 | 5 | 5 | 5 | 62 |
| *Ultra Ass Headset | 10 | 10 | 10 | 5 | 5 | 82 |
| *MobiDevice Improved VerticleResolution | 10 | 5 | 5 | 10 | 10 | 82 |
| *Ultrasonic for ObstDetectRec | 10 | 10 | 5 | 5 | 10 | 82 |
| *SUGAR System | 10 | 5 | 5 | 10 | 5 | 72 |
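As a quick way to compare the systems above, the published total scores can be tabulated and ranked programmatically. This is an illustrative Python sketch, not tooling from the paper; the values are copied from the table for a subset of systems, with names abbreviated as in the table:

```python
# Illustrative sketch: rank a subset of the surveyed systems by the total
# scores published in the table above (values copied verbatim).

totals = {
    "ComVis Sys": 92, "CASBlip": 82, "EyeRing": 82, "FingReader": 82,
    "Ultra Ass Headset": 82, "SUGAR System": 72, "UltraCane": 72,
    "Eye Subs": 72, "Smart Cane": 62, "PF belt": 37, "Sili Eyes": 32,
}

# Sort descending by total score.
ranking = sorted(totals.items(), key=lambda kv: kv[1], reverse=True)
print(ranking[0])  # ('ComVis Sys', 92): the highest-scoring system
```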
Evaluation of the reviewed systems based on additional features that cause the limitations of each system.
| System Name/Weight/Type of Usage | Type of the Sensors | Accuracy | Analysis Type | Coverage | Measuring Angle | Cost | Limitation | Day/Night | Object Detection Range (Max/Min) | Classification Objects (Dynamic/Static) | Used Techniques for Detection, Recognition or Localization |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Ultrasonic sensors | N/A | Real time | Outdoor (only areas with RFID tags) | N/A | High | The water sensor cannot detect water that is less than 0.5 deep. | Day | 1 m–1.5 m | Static | Ultrasonic technology | |
| 2 Ultrasonic | N/A | Real time | Outdoor | Each sensor has a cone angle of 15° | $1790 | The design of the system is uncomfortable due to the wooden foundation, which the user must carry most of the time, as well as the finger holes. | Day/Night | 2 m–3 m | Static | GPS, GSM, and GPRS | |
| Optical Sensors | Accurate results for user position | Real time | Outdoor | 6° of visual angle (320 × 240 pixels) and 100° field of view (640 × 480 pixels) | low | The system was tested only for its obstacle-avoidance function. It has not been tested or integrated with navigation systems to ensure its performance; whether it will enhance navigation systems as the authors promised is unknown. | Day | 2 m–10 m | Static/dynamic | Global Positioning System (GPS), modified Geographical Information System (GIS), and vision-based positioning | |
| iV-CAM | 80% | Real time | N/A | N/A | low | This device was tested only on Thai banknotes and coins; it cannot work with other currencies that have similar banknote colors or similar coin sizes. | Day | Close-up view | Static | RGB model | |
| Detection camera | Responses are based on sensation on the dorsal part of the tongue; 100% for electrodes (1,2,3,4) | Real time | Outdoor | N/A | low | The antenna is not omni-directional. | Day/Night | N/A | Static | Tongue-Placed Electro-tactile Display | |
| 3D CMOS sensor | 80% within a range of 0.5 m–5 m, and less than 80% at further distances | Real time | Indoor/outdoor | 64° in azimuth | N/A | Small detection range | Day/Night | 0.5 m–5 m | Static | Binaural acoustic module | |
| None | N/A | Not-Real time | Outdoor | N/A | N/A | Collision of RFID | Day/Night | 1 m–3 m | Static | Ultra-high frequency (UHF) | |
| 3 Axial accelerometer sensors | Good accuracy within a residential area, but not in an urban environment | Real time | Outdoor | N/A | $138 | The accuracy of the GPS receiver is degraded among high-rise buildings. | Day | N/A | Static | GPS technology | |
| Ultrasonic sensor | N/A | Real time | Outdoor | N/A | N/A | It is a detection device for physical obstacles above the waistline, but navigation still relies on the blind person. | Day/Night | Close objects above the waistline | Static | Ultrasonic sensor technology | |
| Kinect sensor | N/A | Real time | Indoor | 180° | N/A | Only 49 fuzzy rules were implemented, covering 80 different configurations. | Day | 1.5 m–4.0 m | Static | The Canny filter for edge detection | |
| Ultrasonic sensor (trans-receiver) | N/A | Real time | Indoor | 30° | N/A | Just an object detector | Day/Night | 5–150 cm | Static | Ultrasonic Technology | |
| Kinect’s depth camera | N/A | Real time | Indoor | Horizontal | N/A | The accuracy of the Kinect depth image decreases as the distance between the scene and the sensor increases. | Day | 0.8 m–4 m | Static/dynamic | Auto-adaptive | |
| Laser rangefinder with a Novint Falcon haptic device | N/A | Real time | Indoor | Horizontal 270° in front of the chair | N/A | Precise locations of obstacles and angles were difficult to determine. | Day | 20 m with 3 cm error | Static | Haptics and a laser rangefinder | |
| Monocular camera | High accuracy | Real time | Indoor/outdoor | Angular camera field of view of 69° | The fixed image sizes per category can make detecting the same object at different sizes a challenge. | Day | Up to 10 m | Static/Dynamic | The Lucas–Kanade and RANSAC algorithms are used for detection. | |
| 24-bit color sensor | N/A | Not-Real time | Not tested | N/A | N/A | A power supply meter reading needs to be installed to track the status. | Not tested | 2.5 cm–3.5 m | Static | GPS & GSM technology | |
| IR sensor | N/A | Not-Real Time | Outdoor | 360° above the user’s waist | N/A | The detection range for this design is too small. | Not tested | Short | Static/dynamic | Infrared technology and GPS | |
| Atmel 8 bit microcontroller | N/A | Real time | Indoor/outdoor | Not Applicable | N/A | The system does not provide a real time video feedback. | Day | Close up view | Static | Roving Networks | |
| Atmel 8 bit microcontroller | 93.9% | Real-time tactile feedback (20 ms processing time) | Indoor/outdoor | Not Applicable | N/A | The audio feedback responds in real time, but there are long pauses between instructions. The prototype also consists of two pieces, the ring and the computation element, which the user must carry at all times for speech I/O; otherwise, the user cannot receive feedback. | Day | Close-up view | Static | Roving Networks | |
| RGB-D sensor | 95% | Real time | Indoor | low | The infrared sensor’s sensitivity to sunlight can negatively affect the system’s performance outdoors and during the daytime. | Night | Up to 3 m using the range-information technique, and from 3 m onward using the vision information | Static | RANdom SAmple Consensus (RANSAC) detection algorithm | |
| Camera | 20.5% improvement in crowdsourced navigation | Real time | Indoor | N/A | N/A | The collected information depends on the volunteers’ availability. | Day/Night | N/A | Dynamic | Crowdsourcing service through the Google engine for navigation | |
| 3 Ultrasonic sensors | N/A | Real time | Outdoor | 30° detection range for 2 sensors, 80° detection range for overhead | N/A | The detection range is small. | Day | 0 m–2 m in front | Static | Ultrasonic distance measurement approach | |
| 4 Ultrasonic type (DYP-ME007) sensor obstacle detector | N/A | Real time | Indoor/outdoor | 60° between ultrasonic distance sensors | N/A | Limited directions are provided. | Day/Night | 3 cm–4 m | Static | Ultrasonic technology | |
| 2 retina-inspired dynamic vision sensors (DVS) | 99% object detection, 90% ± 8% horizontal localization, 96% ± 5.3% size discrimination | Real time | Indoor | N/A | low | The modules are very expensive. | Day | 0.5 m–8 m | Dynamic/static | Event-based algorithm | |
| 4 ultrasonic sensors (Maxsonar LV EZ-0) | N/A | Real time | Indoor/outdoor | ±40° | Low | The system cannot detect obstacles above waist level. | Day | 2 < R ≤ 5 m | Static/dynamic | Vision-based object detection module. | |
| Ultra-wide band (UWB) sensors | High accuracy | Real time | Indoor | N/A | N/A | Sensors would have to be deployed in every room. | Day | 50 m–60 m | Static | UWB positioning technique |
*: system not available online; N/A: Not Available.
Figure 42. Systems’ evaluation, presenting the total score for each system.
Figure 43. Features’ overview for each system.