Salam Dhou, Ahmad Alnabulsi, A. R. Al-Ali, Mariam Arshi, Fatima Darwish, Sara Almaazmi, Reem Alameeri.
Abstract
Visually impaired people face many challenges that limit their ability to perform daily tasks and interact with the surrounding world. Navigation is one of the biggest of these challenges, especially for people with complete loss of vision. As the Internet of Things (IoT) begins to play a major role in smart-city applications, visually impaired people stand to be among its beneficiaries. In this paper, we propose a smart IoT-based mobile sensors unit that can be attached to an off-the-shelf cane, hereafter a smart cane, to facilitate independent movement for visually impaired people. The proposed mobile sensors unit consists of a six-axis accelerometer/gyroscope, ultrasonic sensors, a GPS sensor, cameras, a digital motion processor, and a credit-card-sized single-board microcomputer. The unit collects information about the cane user and the surrounding obstacles while on the move. An embedded machine learning algorithm, stored in the microcomputer memory, identifies the detected obstacles and alerts the user to their nature. In addition, in case of emergencies such as a cane fall, the unit alerts the cane user and their guardian. Moreover, a mobile application allows the guardian to track the cane user via Google Maps on a mobile handset to ensure safety. To validate the system, a prototype was developed and tested.
Keywords: IoT; machine learning; sensors; smartphone; visually impaired people; walking assistants
Year: 2022 PMID: 35890881 PMCID: PMC9316426 DOI: 10.3390/s22145202
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.847
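The abstract's cane-fall alert relies on the six-axis accelerometer/gyroscope, but the record does not spell out the detection rule. The sketch below illustrates one common approach, a free-fall phase followed by an impact spike in the acceleration magnitude; the threshold values and the `detect_fall` helper are illustrative assumptions, not taken from the paper.

```python
import math

# Illustrative thresholds (assumptions, not from the paper):
FREE_FALL_G = 0.3   # magnitude well below 1 g suggests the cane is falling
IMPACT_G = 2.5      # a spike above this suggests the cane hit the ground

def accel_magnitude(ax, ay, az):
    """Magnitude of the acceleration vector, in g."""
    return math.sqrt(ax * ax + ay * ay + az * az)

def detect_fall(samples):
    """Return True if a free-fall phase is followed by an impact spike.

    `samples` is a sequence of (ax, ay, az) tuples in g, oldest first.
    """
    free_fall_seen = False
    for ax, ay, az in samples:
        m = accel_magnitude(ax, ay, az)
        if m < FREE_FALL_G:
            free_fall_seen = True
        elif free_fall_seen and m > IMPACT_G:
            return True
    return False

# Cane at rest (~1 g), then falling (~0 g), then an impact spike:
trace = [(0.0, 0.0, 1.0), (0.0, 0.1, 0.2), (0.5, 2.8, 1.0)]
print(detect_fall(trace))  # True
```

A real implementation would also debounce over time windows and distinguish a dropped cane from a fallen user, which the paper handles via guardian notification.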
Summary of the different types of walking assistant systems.
| Technology | Sensing/Capturing Devices | Computing Devices | Advantages | Disadvantages |
|---|---|---|---|---|
| Sensor-based | Ultrasonic (US), IR, motion, laser radar, RFID sensors | Microcontroller | Low cost | Bulky |
| Computer vision-based | Depth, vision, RGB cameras, other sensors | Single-board computer, laptop, PC | Detects and identifies objects, real-time data acquisition and training, lightweight | High cost, cameras perform poorly outdoors (because of sunlight), classification performs poorly in crowded areas, slow response time due to processing |
| Mobile phone-based | Phone cameras, accelerometers, gyroscopes, magnetometers, GPS, proximity sensors | Mobile phone processor | Low cost, efficient, compact, covers indoor and outdoor areas | Not configurable, performs poorly in elevated areas, short detection range |
Summary of visually impaired walking assistance systems.
| Ref. No. | Technology | Device | Detects Obstacles | Identifies Obstacles | Alerts | Remote Tracking | Fall Detection | Obstacle Identification Method |
|---|---|---|---|---|---|---|---|---|
| [ | Sensor Based | Cane | Yes | No | Yes | No | No | - |
| [ | Sensor Based | Wearable | Yes | No | Yes | SMS | Yes | - |
| [ | Sensor Based | Cane | Yes | No | No | No | No | - |
| [ | Sensor Based | Cane | Yes | No | Yes | No | No | - |
| [ | Sensor Based | Hand-held | Yes | No | Yes | No | No | - |
| [ | Sensor Based | Shoe | Yes | No | Yes | No | No | - |
| [ | Sensor Based | Walker | Yes | No | Yes | Mobile App | No | - |
| [ | Computer Vision Based | Wearable | No | Yes | No | No | No | GLCM-based crosswalk identification |
| [ | Computer Vision + Mobile Phone Based | Smartphone | Yes | Yes | Yes | No | No | Neural network–based object identification |
| [ | Computer Vision + Mobile Phone Based | Smartphone | Yes | Yes | No | No | No | TensorFlow-based object identification |
| [ | Computer Vision Based | Cane | Yes | Yes | Yes | No | No | GMM-based object identification |
| [ | Computer Vision Based | Hand-held | No | Yes | No | No | No | CNN-based street sign identification |
| [ | Computer Vision Based | Hand-held | Yes | Yes | No | No | No | CNN- and LSTM-based object identification |
| [ | Sensor Based + Computer Vision Based | Cane and Cap | Yes | Yes | Yes | Mobile App | Yes | Mask R-CNN–based object identification |
| [ | Sensor Based + Computer Vision Based | Eyeglasses | Yes | Yes | Yes | No | No | TensorFlow-based object identification |
Figure 1. System hardware architecture.
The specifications of the sensors used in the proposed system.
| Device Name | Full-Scale Range | Sensitivity/Resolution |
|---|---|---|
| Accelerometer | ±2 g, ±4 g, ±8 g, ±16 g | 16,384 LSB/g, 8192 LSB/g, 4096 LSB/g, 2048 LSB/g |
| Gyroscope | ±250°/s, ±500°/s, ±1000°/s, ±2000°/s | 131 LSB/°/s, 65.5 LSB/°/s, 32.8 LSB/°/s, 16.4 LSB/°/s |
| Ultrasonic sensor | 2 cm to 400 cm | 3 mm |
| Camera | 8 megapixels | 1080p at 30 fps |
| GPS | 2 m accuracy | 1–10 Hz refresh rate |
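The LSB sensitivities above map raw signed 16-bit IMU counts to physical units by simple division, following the usual convention for MPU-series six-axis sensors. The sketch below assumes the narrowest full-scale ranges (±2 g, ±250°/s); the helper names are illustrative, not from the paper.

```python
# LSB sensitivities from the table above, narrowest-range settings:
ACCEL_LSB_PER_G = 16384.0   # ±2 g full-scale range
GYRO_LSB_PER_DPS = 131.0    # ±250 °/s full-scale range

def raw_to_g(raw):
    """Raw signed 16-bit accelerometer count -> acceleration in g."""
    return raw / ACCEL_LSB_PER_G

def raw_to_dps(raw):
    """Raw signed 16-bit gyroscope count -> angular rate in deg/s."""
    return raw / GYRO_LSB_PER_DPS

print(raw_to_g(16384))            # 1.0 (g)
print(round(raw_to_dps(131), 3))  # 1.0 (deg/s)
```

Selecting a wider full-scale range halves the sensitivity at each step (e.g., ±4 g gives 8192 LSB/g), trading resolution for measurable range.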
Figure 2. Layered software architecture.
Figure 3. Proposed obstacle classification algorithm.
Figure 4. Sequence diagram of the proposed system.
Figure 5. Actual prototype model.
Figure 6. Ultrasonic (US) sensors and cameras.
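An ultrasonic sensor of the kind in Figure 6 (2 cm–400 cm range, per the specifications table) measures the round-trip time of an echo pulse and converts it to distance using the speed of sound. The conversion can be sketched as follows; the GPIO pulse-timing code is omitted and the example timing value is illustrative.

```python
# Speed of sound in air at ~20 °C, in cm/s:
SPEED_OF_SOUND_CM_PER_S = 34300.0

def echo_time_to_distance_cm(echo_seconds):
    """Round-trip echo time -> one-way distance to the obstacle, in cm."""
    return echo_seconds * SPEED_OF_SOUND_CM_PER_S / 2.0

# A ~5.83 ms round trip corresponds to an obstacle roughly 1 m away:
print(round(echo_time_to_distance_cm(0.00583), 1))
```

The division by two accounts for the pulse travelling to the obstacle and back; readings outside the sensor's 2–400 cm window should be discarded as noise.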
Figure 7. HOG feature descriptor before and after feature extraction.
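The core idea behind the HOG descriptor in Figure 7 is to bin per-pixel gradient orientations into a histogram weighted by gradient magnitude. The minimal sketch below handles a single cell only; a full HOG pipeline (as in, e.g., scikit-image's `skimage.feature.hog`) additionally divides the image into cells and applies block normalization, and the 9-bin choice here is just the common default, not necessarily the paper's setting.

```python
import math

def hog_cell_histogram(img, n_bins=9):
    """Unsigned-orientation (0-180 deg) gradient histogram for one cell.

    `img` is a 2D list of grayscale values; border pixels are skipped
    because central differences need both neighbours.
    """
    h, w = len(img), len(img[0])
    hist = [0.0] * n_bins
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = img[y][x + 1] - img[y][x - 1]   # horizontal gradient
            gy = img[y + 1][x] - img[y - 1][x]   # vertical gradient
            mag = math.hypot(gx, gy)
            ang = math.degrees(math.atan2(gy, gx)) % 180.0
            hist[int(ang / 180.0 * n_bins) % n_bins] += mag
    return hist

# A vertical edge produces purely horizontal gradients, so bin 0 dominates:
img = [[0, 0, 255, 255]] * 4
print(hog_cell_histogram(img))
```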
Performance metrics for machine learning algorithms used in model 1.
| Performance Metric | Decision Tree | Naïve Bayes | SVM | K-Nearest Neighbor |
|---|---|---|---|---|
| Accuracy | 0.89 | 0.93 | 0.99 | 0.98 |
| Recall | 0.91 | 0.93 | 0.99 | 0.98 |
| Precision | 0.91 | 0.94 | 0.99 | 0.98 |
| F-Score | 0.91 | 0.93 | 0.99 | 0.98 |
Performance metrics for machine learning algorithms used in model 2.
| Performance Metric | Naïve Bayes | SVM | K-Nearest Neighbor |
|---|---|---|---|
| Accuracy | 0.83 | 1.00 | 0.83 |
| Recall | 0.83 | 1.00 | 0.83 |
| Precision | 0.88 | 1.00 | 0.88 |
| F-Score | 0.83 | 1.00 | 0.83 |
Figure 8. Confusion matrix of model 1 using SVM (doors vs. upward stairs).
Figure 9. Confusion matrix of model 2 using SVM (downward stairs vs. hollow pits).
Figure 10. GPS tracking in the mobile app.
Figure 11. Notifications in the mobile app.