| Literature DB >> 32585864 |
Balakrishnan Ramalingam, Jia Yin, Mohan Rajesh Elara, Yokhesh Krishnasamy Tamilselvam, Madan Mohan Rayguru, M A Viraj J Muthugala, Braulio Félix Gómez.
Abstract
The role of mobile robots for cleaning and sanitation purposes is increasing worldwide. Disinfection and hygiene are two integral parts of any safe indoor environment, and these factors become more critical in COVID-19-like pandemic situations. Door handles are highly sensitive contact points that are prone to contamination. Automating the door-handle cleaning task is important not only for ensuring safety, but also for improving efficiency. This work proposes an AI-enabled framework for automating cleaning tasks through a Human Support Robot (HSR). The overall cleaning process involves mobile base motion, door-handle detection, and control of the HSR manipulator for the completion of the cleaning tasks. The detection part exploits a deep-learning technique to classify the image space and provides a set of coordinates for the robot. The cooperative control between the spraying and wiping actions is developed in the Robot Operating System (ROS). The control module uses the information obtained from the detection module to generate a task/operational space for the robot, along with evaluating the desired position to actuate the manipulators. The complete strategy is validated through numerical simulations and experiments on a Toyota HSR platform.
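As a rough illustration of the detection-to-control handoff described in the abstract, the sketch below assumes a ROS 1 (rospy) setup. The topic names, the `PoseStamped` message choice, and the 5 cm stand-off offset are hypothetical, not taken from the paper.

```python
# Minimal sketch of the detection-to-control handoff (assumed topics/offsets).
import rospy
from geometry_msgs.msg import PoseStamped

class HandleCleaningNode:
    def __init__(self):
        # Goal pose for the manipulator, derived from the detection result.
        self.goal_pub = rospy.Publisher(
            "/arm_controller/goal_pose", PoseStamped, queue_size=1)
        # The detection module is assumed to publish the handle pose in the
        # robot's base frame after classifying the image space.
        rospy.Subscriber(
            "/handle_detector/pose", PoseStamped, self.on_detection)

    def on_detection(self, handle_pose):
        # Offset the end-effector slightly in front of the handle so the
        # spraying/wiping tool approaches rather than collides with it.
        goal = PoseStamped()
        goal.header = handle_pose.header
        goal.pose = handle_pose.pose
        goal.pose.position.x -= 0.05  # 5 cm stand-off (assumed value)
        self.goal_pub.publish(goal)

if __name__ == "__main__":
    rospy.init_node("handle_cleaning_control")
    HandleCleaningNode()
    rospy.spin()
```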
Keywords: HSR; cleaning; deep learning; door handle cleaning; human service robot; object detection
Year: 2020 PMID: 32585864 PMCID: PMC7349910 DOI: 10.3390/s20123543
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Figure 1. HSR platform with cleaning module.
Figure 2. System architecture.
Figure 3. CNN architecture.
Figure 4. Disinfectant liquid spraying unit.
Figure 5. Block diagram of experimental procedures.
Figure 6. Offline test results.
Figure 7. Localization results for test bed 1 and test bed 2.
Figure 8. Real-time door-handle cleaning demonstration.
Figure 9. Real-time door-handle detection: (1, 2) test bed 1; (3, 4) test bed 2.
Statistical measures for door-handle detection.
| Handle Type | Test | Prec. (%) | Recall (%) | F1 (%) | Accuracy (%) |
|---|---|---|---|---|---|
| Lever | Offline | 97.2 | 95.8 | 95.1 | 95.5 |
| Lever | Real time | 91.2 | 90.6 | 89.7 | 90.4 |
| Circle | Offline | 95.5 | 93.8 | 93.5 | 94.3 |
| Circle | Real time | 92.9 | 91.7 | 91.3 | 92.0 |
| Bar | Offline | 96.5 | 94.4 | 93.9 | 93.9 |
| Bar | Real time | NA | NA | NA | NA |
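The precision, recall, F1, and accuracy values above follow the standard definitions from the detection confusion counts. A minimal sketch, with purely illustrative counts (the tp/fp/fn/tn values are not from the paper):

```python
# Standard detection metrics from confusion counts (illustrative only).
def detection_metrics(tp, fp, fn, tn):
    """Return (precision, recall, f1, accuracy) as percentages."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return tuple(round(100 * m, 1) for m in (precision, recall, f1, accuracy))

# Example with hypothetical counts:
print(detection_metrics(tp=172, fp=5, fn=8, tn=15))
```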
Execution time analysis.
| Task | Execution Time |
|---|---|
| Inference time (offline, 180 images) | 11.07 s |
| Test bed 1 (4 doors) | 18 min |
| Test bed 2 (3 doors) | 10 min |
Comparison with other object detection frameworks.
| Object Detection Framework | Precision (%) | Recall (%) | F1 (%) | Accuracy (%) | Computation Time (s) |
|---|---|---|---|---|---|
| SSD MobileNet | 96.05 | 95.80 | 95.49 | 95.22 | 15.88 |
| SSD Inception | 97.55 | 97.13 | 97.07 | 97.00 | 26.03 |
| Proposed (180 images) | 97.20 | 95.80 | 95.10 | 95.50 | 11.07 |
Comparison with existing door-handle detection schemes.
| Case Study | Application | Algorithm | Detection Accuracy (%) |
|---|---|---|---|
| Jauregi et al. | Tartalo robot, door-opening task | Circle Hough transform + Oblique Classifier 1 | 85.00 |
| Liang et al. | Assistance for the visually impaired | YOLOv2 | 80.00 |
| Maurin et al. | Door-handle opening task, iRobot ATRV-Jr | 7 × 7 × 12 multi-layer CNN + k-means clustering | 92.00 |
| Ellen et al. | Door-opening case study (stair-climbing robot) | 2D sliding window | 93.20 |
| Proposed system | Door-handle cleaning | 16-layer CNN | 94.56 |
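The proposed detector is described here only as a 16-layer CNN classifying handle types. As a loose sketch of that style of classifier (the layer widths, 224 × 224 input size, and three handle classes are assumptions, not the paper's published architecture):

```python
# Illustrative CNN classifier sketch; not the paper's exact 16-layer network.
import torch
import torch.nn as nn

class HandleCNN(nn.Module):
    def __init__(self, num_classes=3):  # lever / circle / bar (assumed)
        super().__init__()
        layers, in_ch = [], 3
        # Four conv blocks (conv + ReLU + pooling) over the input image.
        for out_ch in (32, 64, 128, 256):
            layers += [
                nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
                nn.ReLU(inplace=True),
                nn.MaxPool2d(2),
            ]
            in_ch = out_ch
        self.features = nn.Sequential(*layers)
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(256 * 14 * 14, 512),  # assumes 224x224 RGB inputs
            nn.ReLU(inplace=True),
            nn.Linear(512, num_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# Example forward pass on a dummy 224x224 RGB image batch.
model = HandleCNN()
logits = model(torch.randn(1, 3, 224, 224))
```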