Dražen Brščić, Rhys Wyn Evans, Matthias Rehm, Takayuki Kanda.
Abstract
We studied the use of a rotating multi-layer 3D Light Detection And Ranging (LiDAR) sensor (specifically the Velodyne HDL-32E) mounted on a social robot for estimating features of people around the robot. While LiDARs are often used for robot self-localization and people tracking, we were interested in the possibility of using them to estimate people's features (states or attributes), which are important in human-robot interaction. In particular, we tested the estimation of a person's body orientation and their gender. As collecting data in the real world and labeling them is laborious and time-consuming, we also looked into other ways of obtaining data for training the estimators: using simulations, or using LiDAR data collected in the lab. We trained convolutional neural network-based estimators and tested their performance on actual LiDAR measurements of people in a public space. The results show that with a rotating 3D LiDAR a usable estimate of the body angle can indeed be achieved (mean absolute error 33.5°), and that using simulated data for training the estimators is effective. For estimating gender, the results are satisfactory (accuracy above 80%) when the person is close enough; however, simulated data do not work well and training needs to be done on actual measurements of people.
Keywords: LiDAR; body angle estimation; gender recognition; social robotics
Year: 2020 PMID: 32708707 PMCID: PMC7411808 DOI: 10.3390/s20143964
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Figure 1. Rotating multi-layer laser scanner (Light Detection And Ranging (LiDAR)) used in this work.
Figure 2. Using a simulator to collect artificial LiDAR measurements of people.
Figure 3. Data collection using the actual LiDAR sensor.
Figure 4. Flow of feature estimation.
Figure 5. Transformation of the LiDAR point cloud into a 2D depth image.
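The point-cloud-to-depth-image conversion in Figure 5 can be sketched as follows. The vertical field of view used below matches the Velodyne HDL-32E specification (32 rings spanning +10.67° to −30.67°), but the horizontal resolution, range cap, and the function itself are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def pointcloud_to_depth_image(points, h_res_deg=0.5, n_rings=32,
                              elev_min_deg=-30.67, elev_max_deg=10.67,
                              max_range=20.0):
    """Project an (N, 3) LiDAR point cloud into a 2D depth image.

    Rows correspond to the sensor's vertical rings, columns to azimuth
    bins; the pixel value is the normalized range. The vertical FOV
    matches the HDL-32E spec; other parameter values are illustrative.
    """
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    r = np.sqrt(x**2 + y**2 + z**2)
    azimuth = np.degrees(np.arctan2(y, x))                 # [-180, 180)
    elevation = np.degrees(np.arcsin(z / np.maximum(r, 1e-9)))

    n_cols = int(360.0 / h_res_deg)
    col = ((azimuth + 180.0) / h_res_deg).astype(int) % n_cols
    # Map elevation onto one of the vertical rings.
    row = ((elevation - elev_min_deg)
           / (elev_max_deg - elev_min_deg) * (n_rings - 1))
    row = np.clip(np.round(row).astype(int), 0, n_rings - 1)

    depth = np.zeros((n_rings, n_cols), dtype=np.float32)
    # Write far points first so the closest return per pixel survives.
    order = np.argsort(-r)
    depth[row[order], col[order]] = np.minimum(r[order], max_range) / max_range
    return depth
```

For example, a single point 5 m straight ahead of the sensor lands in the central azimuth column with a normalized depth of 5/20 = 0.25.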
Table 1. Mean absolute error (in degrees) of the body angle estimate for different training and testing datasets.

| Training Data | Lab (Instantaneous) | Lab (Sequential) | Real World (Instantaneous) | Real World (Sequential) |
|---|---|---|---|---|
| Simulated | 26.01 | 26.78 | 40.88 | 35.46 |
| Lab | - | - | 47.35 | 39.23 |
| Simulated + Lab | - | - | | |
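A mean absolute error over body angles has to use the smallest angular difference, since a 359° estimate of a 1° target is only 2° off, not 358°. A minimal sketch of such a metric (the function name and implementation are ours; the paper does not publish its evaluation code):

```python
import numpy as np

def angular_mae(pred_deg, true_deg):
    """Mean absolute error between angle lists in degrees,
    accounting for wraparound at 0/360."""
    diff = (np.asarray(pred_deg, dtype=float)
            - np.asarray(true_deg, dtype=float) + 180.0) % 360.0 - 180.0
    return float(np.mean(np.abs(diff)))
```

For example, predictions of 359° and 10° against targets of 1° and 20° give errors of 2° and 10°, hence an MAE of 6°.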
Figure 6. Distribution of body angle mean absolute errors.
Figure 7. Dependence of the body angle error on the distance to the person (solid line: instantaneous estimate; dashed line: sequential estimate). The figure also shows the distribution of distances in the testing (real world) dataset.
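One intuition for why a sequential estimate can beat an instantaneous one is that combining estimates from several scans averages out per-scan noise; because angles wrap around, they must be combined with a circular mean rather than an arithmetic one. The sketch below is purely illustrative post-hoc averaging; it is our assumption for exposition, not the sequential estimator used in the paper.

```python
import numpy as np

def circular_mean_deg(frame_angles_deg):
    """Combine per-frame body angle estimates (degrees) into a single
    estimate via the circular mean. Illustrative only: an arithmetic
    mean of 350 and 10 would wrongly give 180; the circular mean
    correctly gives 0."""
    a = np.radians(np.asarray(frame_angles_deg, dtype=float))
    mean = np.degrees(np.arctan2(np.sin(a).mean(), np.cos(a).mean()))
    return float(mean % 360.0)
```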
Table 2. Accuracy of gender classification for different training and testing datasets.

| Training Data | Lab (Instantaneous) | Lab (Sequential) | Real World (Instantaneous) | Real World (Sequential) |
|---|---|---|---|---|
| Simulated | 0.7543 | 0.7407 | 0.5383 | 0.5559 |
| Lab | - | - | | |
| Simulated + Lab | - | - | 0.6099 | 0.6513 |
Figure 8. Dependence of the gender estimation accuracy on the distance to the person (solid line: instantaneous estimate; dashed line: sequential estimate). The figure also shows the distribution of distances in the testing dataset (first day of the real world dataset).
Figure 9. Dependence of the gender estimation accuracy (instantaneous estimate) on the body angle of the person.