| Literature DB >> 30188901 |
Itamar Eliakim1, Zahi Cohen1, Gabor Kosa2, Yossi Yovel3,4.
Abstract
Echolocating bats rely on active sound emission (echolocation) for mapping novel environments and navigating through them. Many theoretical frameworks have been suggested to explain how they do so, but few attempts have been made to build an actual robot that mimics their abilities. Here, we present the 'Robat', a fully autonomous bat-like terrestrial robot that relies on echolocation to move through a novel environment while mapping it solely based on sound. Using the echoes reflected from the environment, the Robat delineates the borders of objects it encounters and classifies them using an artificial neural network, thus creating a rich map of its environment. Unlike most previous attempts to apply sonar in robotics, we focus on a biological bat-like approach, which relies on a single emitter and two ears, and we apply a biologically plausible signal-processing approach to extract information about objects' position and identity.
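As a rough illustration of the echo-ranging step described in the abstract, the sketch below matched-filters a recording against the emitted call, takes the envelope of the correlation signal, and converts peak delays into object ranges. All parameters (sample rate, call shape, detection threshold) and function names are illustrative assumptions, not the Robat's actual values.

```python
import numpy as np
from scipy.signal import chirp, correlate, hilbert

# Illustrative parameters only; the Robat's actual call and hardware differ.
FS = 250_000   # sample rate (Hz)
C = 343.0      # speed of sound (m/s)

def emitted_call(duration=0.003):
    """Down-sweeping FM chirp as a stand-in for the emitted signal."""
    t = np.arange(0, duration, 1 / FS)
    return chirp(t, f0=100_000, t1=duration, f1=20_000)

def echo_ranges(recording, call, threshold=0.3):
    """Matched-filter the recording with the call, take the envelope of
    the correlation signal, and convert peak delays to object ranges."""
    corr = correlate(recording, call, mode="valid")
    env = np.abs(hilbert(corr))
    env /= env.max()                       # the emitted pulse is the loudest peak
    inner = env[1:-1]
    peaks = np.flatnonzero((inner > env[:-2]) & (inner > env[2:]) &
                           (inner > threshold)) + 1
    return peaks / FS * C / 2.0            # two-way delay -> one-way range (m)
```

A simulated recording containing the call plus one attenuated copy delayed by the two-way travel time of a 2 m target yields a single detected range near 2 m.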
Year: 2018 PMID: 30188901 PMCID: PMC6126821 DOI: 10.1371/journal.pcbi.1006406
Source DB: PubMed Journal: PLoS Comput Biol ISSN: 1553-734X Impact factor: 4.475
Fig 1. The system: the Robat, signal processing and mapping.
(A) Image of the Robat. Inset shows the Robat’s sensory unit, including a speaker and two receivers. (B) An example of a single echo received by the Robat’s two ears. Top row shows right- and left-ear echo spectrograms, in which the emitted signal and 3 consecutive echoes can be observed. The first and loudest peak is the emitted signal. Bottom row shows the correlation signals, with the peaks detected as returning from the same object numbered in each ear (1-3). (C) The Robat’s acoustic mapping. Objects recognized by the Robat are shown as turquoise points. Yellow shading shows the inflation of objects into a map delineating the borders of the route. Note that turquoise points ahead of the robot have not yet been inflated, because inflation does not occur at each acquisition. Green dots show the locations of echo acquisition points (every 0.5 m). The three directions of acquisition are depicted by the three-part beam.
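Matching the same echo peak across the two ears (panel B) allows a direction estimate in addition to a range estimate. A minimal far-field sketch, assuming a hypothetical receiver spacing and function name (the paper's actual localization method may differ), converts the interaural time difference of a matched peak into an azimuth:

```python
import numpy as np

C = 343.0            # speed of sound (m/s)
EAR_SPACING = 0.04   # assumed distance between the two receivers (m)

def azimuth_from_itd(t_left, t_right, spacing=EAR_SPACING):
    """Far-field azimuth (radians) from the interaural time difference
    of the same echo peak; positive angles mean the object is to the
    right (the echo reaches the right ear first)."""
    itd = t_left - t_right
    s = np.clip(C * itd / spacing, -1.0, 1.0)   # clip against noisy ITDs
    return np.arcsin(s)
```

For example, an object 30 degrees to the right produces an ITD of spacing * sin(30°) / c, which the function maps back to 30 degrees.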
Fig 2. Mapping and obstacle avoidance.
(A) Robat’s mapping of the greenhouse overlaid on drone images. Black line depicts the trajectory taken by the Robat. Yellow shaded areas show the objects mapped by the Robat, and the blue line shows their borders. Turquoise points depict the centers of the objects as they were localized by the Robat. (B) Passing a cylindrical obstacle (D = 0.8 m) that was placed in the Robat’s way. Yellow arrows show the trajectory taken by the Robat to pass the obstacle (doing so fully autonomously). Turquoise points depict points on the obstacle as they were localized by the Robat. (C) Examples of object classification. Two correct classifications and two wrong ones are presented. Note that the ‘non-plant’ classified as a plant includes a bamboo mesh. Such mesh objects create echoes that have plant-like acoustic characteristics.
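The "inflation" of localized object points into the shaded map regions (yellow areas above and in Fig 1C) can be illustrated with a simple occupancy grid. The grid geometry, inflation radius, and function name below are hypothetical stand-ins, not the paper's actual procedure:

```python
import numpy as np

def inflate(points, grid_shape, cell_size, radius):
    """Mark every grid cell whose center lies within `radius` (m) of a
    localized object point; a stand-in for the map-inflation step that
    turns isolated echo points into solid object borders."""
    grid = np.zeros(grid_shape, dtype=bool)
    ys, xs = np.mgrid[0:grid_shape[0], 0:grid_shape[1]]
    for px, py in points:
        d = np.hypot(xs * cell_size - px, ys * cell_size - py)
        grid |= d <= radius
    return grid
```

A single localized point then occupies a disk of cells around it, and overlapping disks from neighboring points merge into a continuous border.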
Classification performance on the test set, for the plant vs. non-plant task.
| True Label / Predicted Label | Plant | Non-Plant |
|---|---|---|
| Plant | 77% | 23% |
| Non-Plant | 42% | 58% |
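A useful single-number summary of the confusion matrix above is the balanced accuracy, the mean of the per-class recalls: (77% + 58%) / 2 = 67.5%. A minimal sketch (the table layout and label names are taken from above; the helper name is ours):

```python
# Rows: true class; columns: predicted class; values are the
# percentages of each true class, copied from the table above.
confusion = {
    "Plant":     {"Plant": 77, "Non-Plant": 23},
    "Non-Plant": {"Plant": 42, "Non-Plant": 58},
}

def balanced_accuracy(cm):
    """Mean of the per-class recalls; unlike raw accuracy, it is not
    inflated by class imbalance in the test set."""
    recalls = [row[label] / sum(row.values()) for label, row in cm.items()]
    return sum(recalls) / len(recalls)
```

Here balanced_accuracy(confusion) evaluates to 0.675, i.e. 67.5%, above the 50% chance level for a two-class task.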
Classification features.
| Feature Name | Total |
|---|---|
| ZCR | 23 |
| Energy | 46 |
| Entropy | 23 |
| Spectral Centroid | 23 |
| Spectral Spread | 23 |
| Spectral Flux | 23 |
| Spectral Rolloff | 23 |
| Chroma Vector | 299 |
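The features listed above (zero-crossing rate, spectral centroid, spectral spread, etc.) are standard short-time audio descriptors computed per analysis window. As a minimal sketch of two of them, assuming simple non-overlapping frames (the paper's exact framing and feature definitions may differ):

```python
import numpy as np

def zcr(frame):
    """Zero-crossing rate of one analysis frame (crossings per sample)."""
    return np.mean(np.abs(np.diff(np.sign(frame)))) / 2.0

def spectral_centroid(frame, fs):
    """Magnitude-weighted mean frequency (Hz) of one frame."""
    mag = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), 1 / fs)
    return np.sum(freqs * mag) / (np.sum(mag) + 1e-12)

def frame_features(signal, fs, frame_len):
    """One (ZCR, spectral centroid) pair per consecutive frame, as in
    the per-window feature counts of the table above."""
    n = len(signal) // frame_len
    frames = signal[: n * frame_len].reshape(n, frame_len)
    return np.array([(zcr(f), spectral_centroid(f, fs)) for f in frames])
```

For a pure 1 kHz tone sampled at 8 kHz, each frame gives a ZCR of about 0.25 (two crossings per 8-sample cycle) and a centroid of about 1000 Hz.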