Literature DB >> 36236546

LidSonic V2.0: A LiDAR and Deep-Learning-Based Green Assistive Edge Device to Enhance Mobility for the Visually Impaired.

Sahar Busaeed; Iyad Katib; Aiiad Albeshri; Juan M Corchado; Tan Yigitcanlar; Rashid Mehmood

Abstract

Over a billion people worldwide live with a disability, among whom 253 million are visually impaired or blind, and this number is rising rapidly due to ageing, chronic disease, and poor environmental and health conditions. Despite many proposals, current devices and systems lack maturity and do not fully meet user requirements or satisfaction. Increased research activity in this field is needed to encourage the development, commercialization, and widespread acceptance of low-cost, affordable assistive technologies for visual impairment and other disabilities. This paper proposes a novel approach that uses a LiDAR with a servo motor and an ultrasonic sensor to collect data and recognize objects using deep learning for environment perception and navigation. We embodied this approach in a pair of smart glasses, called LidSonic V2.0, that identify obstacles for the visually impaired. The LidSonic system consists of an Arduino Uno edge computing device integrated into the smart glasses and a smartphone app that communicates with it via Bluetooth. The Arduino gathers data, operates the sensors on the smart glasses, detects obstacles using simple data processing, and provides buzzer feedback to visually impaired users. The smartphone app collects the data from the Arduino, detects and classifies objects in the spatial environment, and gives spoken feedback to the user on the detected objects. Compared with image-processing-based glasses, LidSonic requires far less processing time and energy to classify obstacles, since it operates on simple LiDAR data comprising a small set of integer distance measurements. We comprehensively describe the proposed system's hardware and software design, having built prototype implementations and tested them in real-world environments. Built with the open platforms WEKA and TensorFlow, the entire LidSonic system uses affordable off-the-shelf sensors and a microcontroller board costing less than USD 80.
Essentially, we provide the design of an inexpensive, miniature green device that can be built into, or mounted on, any pair of glasses or even a wheelchair to help the visually impaired. Our approach enables faster inference and decision-making at relatively low energy with smaller data sizes, as well as faster communication across edge, fog, and cloud computing.
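The on-device pipeline described above (servo-swept LiDAR readings plus a forward ultrasonic reading, with simple threshold processing driving buzzer feedback) can be sketched as follows. This is an illustrative model only, not the authors' code: the function name, the treatment of zero readings as dropouts, and the 150 cm alert threshold are all assumptions.

```python
# Illustrative sketch of the Arduino-side "simple data processing" described
# in the abstract, modeled in Python. A servo sweeps the LiDAR across the
# field of view, producing one integer distance reading (cm) per step; the
# ultrasonic sensor contributes a single forward reading. Any reading inside
# the alert threshold would trigger the buzzer. Names and threshold assumed.

OBSTACLE_THRESHOLD_CM = 150  # assumed alert distance, not from the paper

def detect_obstacle(lidar_sweep_cm, ultrasonic_cm, threshold=OBSTACLE_THRESHOLD_CM):
    """Return True if any reading falls inside the alert threshold.

    lidar_sweep_cm: list of integer distances from one servo sweep.
    ultrasonic_cm:  single forward distance from the ultrasonic sensor.
    Readings of 0 are treated as sensor dropouts and ignored (assumption).
    """
    readings = [d for d in lidar_sweep_cm + [ultrasonic_cm] if d > 0]
    return any(d <= threshold for d in readings)

# One hypothetical sweep: a clear path except a close object near the centre.
sweep = [400, 380, 120, 390, 410]
print(detect_obstacle(sweep, 300))  # True: 120 cm is inside the threshold
```

In the actual system, the per-sweep vector of integer readings is also forwarded over Bluetooth to the smartphone app, where the deep-learning model classifies the detected object.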

Keywords:  Arduino Uno; LiDAR; assistive tools; deep learning; edge computing; green computing; obstacle detection; obstacle recognition; sensors; smart app; smart mobility; sustainability; ultrasonic; visually impaired

Year:  2022        PMID: 36236546      PMCID: PMC9570831          DOI: 10.3390/s22197435

Source DB:  PubMed          Journal:  Sensors (Basel)        ISSN: 1424-8220            Impact factor:   3.847


References (17 in total)

1.  Safe Local Navigation for Visually Impaired Users With a Time-of-Flight and Haptic Feedback Device.

Authors:  Robert K Katzschmann; Brandon Araki; Daniela Rus
Journal:  IEEE Trans Neural Syst Rehabil Eng       Date:  2018-03       Impact factor: 3.802

2.  Why vision is important to how we navigate. (Review)

Authors:  Arne D Ekstrom
Journal:  Hippocampus       Date:  2015-04-02       Impact factor: 3.899

3.  Iktishaf+: A Big Data Tool with Automatic Labeling for Road Traffic Social Sensing and Event Detection Using Distributed Machine Learning.

Authors:  Ebtesam Alomari; Iyad Katib; Aiiad Albeshri; Tan Yigitcanlar; Rashid Mehmood
Journal:  Sensors (Basel)       Date:  2021-04-24       Impact factor: 3.576

4.  The National and Regional Prevalence Rates of Disability, Type of Disability and Severity in Saudi Arabia-Analysis of 2016 Demographic Survey Data.

Authors:  Saad M Bindawas; Vishal Vennu
Journal:  Int J Environ Res Public Health       Date:  2018-02-28       Impact factor: 3.390

5.  Fuzzy Logic Type-2 Based Wireless Indoor Localization System for Navigation of Visually Impaired People in Buildings.

Authors:  Basem Al-Madani; Farid Orujov; Rytis Maskeliūnas; Robertas Damaševičius; Algimantas Venčkauskas
Journal:  Sensors (Basel)       Date:  2019-05-07       Impact factor: 3.576

6.  Blindness and the Reliability of Downwards Sensors to Avoid Obstacles: A Study with the EyeCane.

Authors:  Maxime Bleau; Samuel Paré; Ismaël Djerourou; Daniel R Chebat; Ron Kupers; Maurice Ptito
Journal:  Sensors (Basel)       Date:  2021-04-12       Impact factor: 3.576

7.  Spatial navigation with horizontally spatialized sounds in early and late blind individuals.

Authors:  Samuel Paré; Maxime Bleau; Ismaël Djerourou; Vincent Malotaux; Ron Kupers; Maurice Ptito
Journal:  PLoS One       Date:  2021-02-26       Impact factor: 3.240

8.  COVID-19: Detecting Government Pandemic Measures and Public Concerns from Twitter Arabic Data Using Distributed Machine Learning.

Authors:  Ebtesam Alomari; Iyad Katib; Aiiad Albeshri; Rashid Mehmood
Journal:  Int J Environ Res Public Health       Date:  2021-01-01       Impact factor: 3.390

9.  Improvements in the learnability of smartphone haptic interfaces for visually impaired users.

Authors:  F J González-Cañete; J L López Rodríguez; P M Galdón; A Díaz-Estrella
Journal:  PLoS One       Date:  2019-11-11       Impact factor: 3.240

10.  Imtidad: A Reference Architecture and a Case Study on Developing Distributed AI Services for Skin Disease Diagnosis over Cloud, Fog and Edge.

Authors:  Nourah Janbi; Rashid Mehmood; Iyad Katib; Aiiad Albeshri; Juan M Corchado; Tan Yigitcanlar
Journal:  Sensors (Basel)       Date:  2022-02-26       Impact factor: 3.576


Beijing Coyote Bioscience Co., Ltd. © 2022-2023.