| Literature DB >> 35808427 |
Archana Semwal, Lee Ming Jun Melvin, Rajesh Elara Mohan, Balakrishnan Ramalingam, Thejus Pathmakumar.
Abstract
Mosquito-borne diseases pose serious risks to human health, so mosquito surveillance and control programs are essential for the wellbeing of the community. However, human-assisted mosquito surveillance and population-mapping methods are time-consuming, labor-intensive, and require skilled manpower. This work presents an AI-enabled mosquito surveillance and population mapping framework using our in-house-developed robot, named 'Dragonfly', which uses the You Only Look Once (YOLO) V4 Deep Neural Network (DNN) algorithm and a two-dimensional (2D) environment map generated by the robot. The Dragonfly robot was designed with a differential drive mechanism and a mosquito trapping module to attract mosquitoes in the environment. The YOLO V4 model was trained on three mosquito classes, namely Aedes aegypti, Aedes albopictus, and Culex, to detect and classify the mosquito species caught on the mosquito glue trap. The efficiency of the mosquito surveillance framework was evaluated in terms of mosquito classification accuracy and detection confidence level in offline and real-time field tests in a garden, a drain perimeter area, and a covered car parking area. The experimental results show that the trained YOLO V4 DNN model detects and classifies the mosquito classes with an 88% confidence level on offline mosquito test image datasets and an average 82% confidence level in the real-time field trial. Further, to generate the mosquito population map, the detection results are fused into the robot's 2D map, which helps in understanding mosquito population dynamics and species distribution.
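The YOLO V4 detector reports each trapped mosquito as a bounding box with a class label and a confidence score. The paper does not publish its post-processing code; as a hedged illustration, detections of this kind are conventionally filtered by a confidence threshold followed by per-class non-maximum suppression. All names and thresholds below are illustrative, not taken from the paper:

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def filter_detections(dets, conf_thresh=0.5, iou_thresh=0.45):
    """dets: list of (box, class_name, confidence) tuples.
    Returns detections surviving thresholding and per-class NMS."""
    dets = [d for d in dets if d[2] >= conf_thresh]
    dets.sort(key=lambda d: d[2], reverse=True)  # highest confidence first
    kept = []
    for box, cls, conf in dets:
        # suppress boxes that overlap an already-kept box of the same class
        if all(c != cls or iou(box, b) < iou_thresh for b, c, _ in kept):
            kept.append((box, cls, conf))
    return kept
```

For example, two heavily overlapping Aedes aegypti boxes collapse to the single higher-confidence detection, so one trapped mosquito is not counted twice.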
Keywords: computer vision; deep learning; mapping; mosquito surveillance; robot
Year: 2022 PMID: 35808427 PMCID: PMC9269550 DOI: 10.3390/s22134921
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.847
Figure 1. Overview diagram of proposed framework.
Figure 2. Dragonfly robot.
Figure 3. System architecture.
Figure 4. Mosquito trap unit.
Figure 5. YOLO V4 block diagram.
CSPDarknet53 backbone.
| Layer Details |
|---|
| Conv ×5 |
| Residual |
| Conv ×3 |
| Residual |
| Conv ×3 |
| Residual |
| Conv ×3 |
| Residual |
| Conv ×3 |
| Residual |
| Avgpool |
| Softmax |
Figure 6. Sample of augmented images. (a) Rotation. (b) Scale. (c) Translation. (d) Blur. (e) Enhance color. (f) Flip. (g) Brightness. (h) Cutout. (i) Shear.
Augmentation types and settings.
| Augmentation Type | Augmentation Setting |
|---|---|
| Scaling | 0.5× to 1.5× |
| Rotation | from −45° to +45° |
| Translation | |
| Horizontal Flip | flip the image horizontally |
| Color Enhancing | contrast (from 0.5× to 1.5×) |
| Blurring | Gaussian blur (sigma from 1.0 to 3.0) |
| Brightness | from 0.5× to 1.5× |
| Shear | |
| Cutout | 1 to 3 squares, each up to 35% of the image size |
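The augmentations above are typically applied with a library such as imgaug or albumentations; as a minimal self-contained sketch (not the authors' pipeline), three of the listed operations can be written directly in numpy. The probabilities and the helper name are assumptions for illustration:

```python
import random

import numpy as np

def augment(img, rng=random.Random(0)):
    """Apply brightness scaling, horizontal flip, and cutout to an
    HxWxC uint8 image, using the parameter ranges from the table above.
    Rotation, shear, and Gaussian blur are omitted for brevity."""
    out = img.astype(np.float32)

    # Brightness: scale intensities by a random factor in [0.5, 1.5]
    out = np.clip(out * rng.uniform(0.5, 1.5), 0, 255)

    # Horizontal flip: mirror the image left-right half of the time
    if rng.random() < 0.5:
        out = out[:, ::-1]

    # Cutout: blank out 1 to 3 squares, each up to 35% of the image side
    h, w = out.shape[:2]
    for _ in range(rng.randint(1, 3)):
        s = rng.randint(1, max(1, int(0.35 * min(h, w))))
        y, x = rng.randint(0, h - s), rng.randint(0, w - s)
        out[y:y + s, x:x + s] = 0

    return out.astype(np.uint8)
```

Augmentations like these expand a small mosquito image dataset so the detector sees varied poses and lighting during training.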
Figure 7. Mosquito surveillance framework’s offline test results. (a) Aedes Aegypti. (b) Aedes Aegypti. (c) Aedes Albopictus. (d) Culex. (e) Aedes Albopictus and Aedes Aegypti. (f) Aedes Aegypti. (g) Aedes Albopictus. (h) Culex.
Statistical measure results of mosquito surveillance framework (offline).
| Class | Precision (Before) | Recall (Before) | F1-Score (Before) | Accuracy (Before) | Precision (After) | Recall (After) | F1-Score (After) | Accuracy (After) |
|---|---|---|---|---|---|---|---|---|
| Aedes Aegypti | 73.08 | 73.44 | 75.57 | 76.67 | 92.25 | 94.24 | 93.24 | 92.67 |
| Aedes Albopictus | 73.44 | 81.03 | 77.05 | 77.33 | 89.51 | 94.81 | 92.09 | 90.00 |
| Culex | 75.57 | 83.90 | 79.52 | 78.67 | 93.66 | 94.33 | 94.00 | 94.00 |
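The precision, recall, F1-score, and accuracy figures in these tables follow their standard definitions. As a sketch (not the authors' evaluation code), all four can be computed from per-class true/false positive and negative counts:

```python
def classification_metrics(tp, fp, fn, tn):
    """Standard per-class metrics from confusion-matrix counts,
    returned as percentages."""
    precision = 100.0 * tp / (tp + fp)   # fraction of detections that are correct
    recall = 100.0 * tp / (tp + fn)      # fraction of true instances found
    f1 = 2 * precision * recall / (precision + recall)  # harmonic mean
    accuracy = 100.0 * (tp + tn) / (tp + fp + fn + tn)
    return precision, recall, f1, accuracy
```

As a consistency check against the table, the after-augmentation Aedes Aegypti row satisfies the F1 relation: 2 · 92.25 · 94.24 / (92.25 + 94.24) ≈ 93.23, matching the reported 93.24 up to rounding.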
Figure 8. Testing environment. (a) ‘Dragonfly’ robot in garden (morning). (b) ‘Dragonfly’ robot in SUTD campus (evening).
Figure 9. Real-time mosquito glue traps. (a) Glue Trap 1 (morning). (b) Glue Trap 2 (evening).
Figure 10. Mosquito surveillance framework’s online test results. (a) Culex, Aedes Aegypti, Aedes Albopictus. (b) Aedes Aegypti. (c) Aedes Aegypti. (d) Culex.
Statistical measure results of mosquito surveillance framework (online).
| Class | Precision | Recall | F1-Score | Accuracy | Average Accuracy |
|---|---|---|---|---|---|
| Aedes Aegypti | 86.30 | 88.59 | 87.43 | 87.67 | 87.99 |
| Aedes Albopictus | 84.90 | 86.54 | 85.71 | 86.68 | |
| Culex | 88.52 | 88.85 | 88.68 | 89.62 | |
Figure 11. Mosquito population mapping.
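The population map in Figure 11 fuses per-species detection results into the robot's 2D environment map. The paper does not detail the fusion step; one plausible sketch (the grid resolution, pose format, and all class/function names below are assumptions, not the authors' implementation) tags each detection with the robot's pose at trap-inspection time and accumulates counts per grid cell:

```python
from collections import defaultdict

def world_to_cell(x, y, resolution=0.5):
    """Map a world coordinate (meters) to a 2D grid cell index.
    resolution is an assumed 0.5 m cell size."""
    return (int(x // resolution), int(y // resolution))

class PopulationMap:
    """Accumulates per-species mosquito counts on the robot's 2D map."""

    def __init__(self, resolution=0.5):
        self.resolution = resolution
        # cell index -> species name -> detection count
        self.counts = defaultdict(lambda: defaultdict(int))

    def add_detections(self, robot_x, robot_y, species_list):
        """species_list: species names detected on the glue trap while the
        robot is at pose (robot_x, robot_y) in map coordinates."""
        cell = world_to_cell(robot_x, robot_y, self.resolution)
        for species in species_list:
            self.counts[cell][species] += 1

    def density(self, species):
        """Non-zero per-cell counts for one species, e.g. for rendering a
        heat-map overlay on the 2D map."""
        return {cell: c.get(species, 0)
                for cell, c in self.counts.items() if c.get(species)}
```

Rendering `density(...)` for each species over the occupancy grid yields a map like Figure 11, from which population dynamics and species distribution can be read off per area.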
Statistical results of mosquito glue trap.
| Class | Early Morning | Night |
|---|---|---|
| Aedes Aegypti | 54 | 37 |
| Aedes Albopictus | 61 | 47 |
| Culex | 39 | 64 |
Comparison with other models.
| Model | Class | Precision | Recall | F1-Score | Accuracy | Average Accuracy | Inference Speed (FPS) |
|---|---|---|---|---|---|---|---|
| YOLOv3 + CSPDarknet53 (ours) | Aedes Aegypti | 88.29 | 96.41 | 92.17 | 93.61 | 93.20 | 57 |
| | Aedes Albopictus | 92.83 | 97.53 | 95.12 | 90.70 | | |
| | Culex | 96.89 | 98.29 | 97.59 | 95.29 | | |
| YOLOv3 + ResNet101 | Aedes Aegypti | 92.28 | 91.13 | 93.45 | 91.67 | 91.00 | 24 |
| | Aedes Albopictus | 88.77 | 94.40 | 91.50 | 89.33 | | |
| | Culex | 91.52 | 93.84 | 92.67 | 92.00 | | |
| YOLOv3 + MobileNetv2 | Aedes Aegypti | 81.48 | 88.00 | 84.62 | 83.33 | 84.22 | 112 |
| | Aedes Albopictus | 82.59 | 88.14 | 85.28 | 84.33 | | |
| | Culex | 83.58 | 89.80 | 86.58 | 85.00 | | |
| SSD + MobileNetv2 | Aedes Aegypti | 82.90 | 87.79 | 85.28 | 84.67 | 83.78 | 98 |
| | Aedes Albopictus | 81.18 | 88.35 | 84.62 | 83.00 | | |
| | Culex | 81.78 | 87.65 | 84.61 | 83.67 | | |
Comparison analysis with existing object detection frameworks.
| Case Studies | Inspection Type | Algorithm | Classes | Accuracy (%) |
|---|---|---|---|---|
| Rustam et al. [ | Offline | ETC | 2 | 99.2 |
| Kittichai et al. [ | Offline | Two YOLO V3 | 5 | 99 |
| Yin et al. [ | Offline | 1D-CNN | 5 | 93 |
| Goodwin et al. [ | Offline | CNN | 67 | 97.04 |
| Proposed framework | Real-time with Dragonfly | YOLO V3 + CSPDarknet53 | 3 | 87.99 |