
A blind mobility aid modeled after echolocation of bats.

T Ifukube, T Sasaki, C Peng

Abstract

A new model of a mobility aid for the blind was designed using a microprocessor and ultrasonic devices. This mobility aid was evaluated in psychophysical experiments. In this model, a downswept FM ultrasound signal is emitted from a transmitting array with broad directional characteristics in order to detect obstacles. The ultrasound reflections from the obstacles are picked up by a two-channel receiver. The frequency of the emitted ultrasound is swept from 70 to 40 kHz within 1 ms, so it has almost the same characteristics as the ultrasound a bat produces for echolocation. The frequency of the reflected ultrasound wave is down-converted by a ratio of about 50:1 using a microcomputer with A/D and D/A converters. The resulting audible waves are then presented binaurally through earphones. In this method, obstacles may be perceived as localized sound images corresponding to the direction and the size of the obstacles. The psychophysical experiments showed that downswept FM ultrasound was superior to other ultrasonic schemes for the recognition of small obstacles: with it, a blind person can recognize a 1-mm-diameter wire. It was also shown that blind subjects could discriminate between several obstacles at the same time without any virtual images. This mobility aid, modeled after the bat's echolocation system, is very effective at detecting small obstacles placed in front of the head.
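The emitted chirp and the ~50:1 down-conversion described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the 70-to-40 kHz range, 1 ms duration, and ~50:1 ratio come from the abstract, while the ultrasonic sample rate and the linear sweep shape are assumptions.

```python
import numpy as np

FS_ULTRA = 400_000               # ultrasonic sampling rate in Hz (assumed)
SWEEP_S = 1e-3                   # 1 ms sweep duration (from the abstract)
F_START, F_END = 70_000, 40_000  # downswept FM range in Hz (from the abstract)
RATIO = 50                       # down-conversion ratio, ~50:1 (from the abstract)

# Linear downswept FM chirp, a stand-in for the signal sent to the
# transmitting array. Instantaneous frequency falls linearly from
# F_START to F_END over SWEEP_S.
t = np.arange(int(FS_ULTRA * SWEEP_S)) / FS_ULTRA
phase = 2 * np.pi * (F_START * t + (F_END - F_START) / (2 * SWEEP_S) * t**2)
chirp = np.sin(phase)

# Down-conversion by time stretching: replaying samples captured at
# FS_ULTRA at a rate RATIO times slower divides every frequency
# component by RATIO, mapping the echo into the audible band.
fs_audio = FS_ULTRA / RATIO      # 8 kHz playback rate
audible_start = F_START / RATIO  # 70 kHz -> 1.4 kHz
audible_end = F_END / RATIO      # 40 kHz -> 800 Hz
```

Because the same ratio is applied to every frequency, the audible sweep preserves the shape of the ultrasonic one, which is what lets the listener hear obstacle structure in the stretched echo.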


Year:  1991        PMID: 1874528     DOI: 10.1109/10.81565

Source DB:  PubMed          Journal:  IEEE Trans Biomed Eng        ISSN: 0018-9294            Impact factor:   4.538


Similar articles (9 in total)

1.  A Device for Human Ultrasonic Echolocation.

Authors:  Jascha Sohl-Dickstein; Santani Teng; Benjamin M Gaub; Chris C Rodgers; Crystal Li; Michael R DeWeese; Nicol S Harper
Journal:  IEEE Trans Biomed Eng       Date:  2015-01-16       Impact factor: 4.538

2. (Review) On the role of auditory feedback in robot-assisted movement training after stroke: review of the literature.

Authors:  Giulio Rosati; Antonio Rodà; Federico Avanzini; Stefano Masiero
Journal:  Comput Intell Neurosci       Date:  2013-12-08

3.  Stereosonic vision: Exploring visual-to-auditory sensory substitution mappings in an immersive virtual reality navigation paradigm.

Authors:  Daniela Massiceti; Stephen Lloyd Hicks; Joram Jacob van Rheede
Journal:  PLoS One       Date:  2018-07-05       Impact factor: 3.240

4.  Visual Echolocation Concept for the Colorophone Sensory Substitution Device Using Virtual Reality.

Authors:  Patrycja Bizoń-Angov; Dominik Osiński; Michał Wierzchoń; Jarosław Konieczny
Journal:  Sensors (Basel)       Date:  2021-01-01       Impact factor: 3.576

5. (Review) Update on Potentially Zoonotic Viruses of European Bats.

Authors:  Claudia Kohl; Andreas Nitsche; Andreas Kurth
Journal:  Vaccines (Basel)       Date:  2021-06-23

6.  People's Ability to Detect Objects Using Click-Based Echolocation: A Direct Comparison between Mouth-Clicks and Clicks Made by a Loudspeaker.

Authors:  Lore Thaler; Josefina Castillo-Serrano
Journal:  PLoS One       Date:  2016-05-02       Impact factor: 3.240

7. (Review) Designing sensory-substitution devices: Principles, pitfalls and potential.

Authors:  Árni Kristjánsson; Alin Moldoveanu; Ómar I Jóhannesson; Oana Balan; Simone Spagnol; Vigdís Vala Valgeirsdóttir; Rúnar Unnthorsson
Journal:  Restor Neurol Neurosci       Date:  2016-09-21       Impact factor: 2.406

8. (Review) Acoustic Sensors for Air and Surface Navigation Applications.

Authors:  Rohan Kapoor; Subramanian Ramasamy; Alessandro Gardi; Ron Van Schyndel; Roberto Sabatini
Journal:  Sensors (Basel)       Date:  2018-02-07       Impact factor: 3.576

9.  A Comparative Study in Real-Time Scene Sonification for Visually Impaired People.

Authors:  Weijian Hu; Kaiwei Wang; Kailun Yang; Ruiqi Cheng; Yaozu Ye; Lei Sun; Zhijie Xu
Journal:  Sensors (Basel)       Date:  2020-06-05       Impact factor: 3.576

