Pengbo Gao, Yan Zhang, Linhuan Zhang, Ryozo Noguchi, Tofael Ahamed.
Abstract
Unmanned aerial vehicle (UAV)-based spraying systems have recently become important for the precision application of pesticides using machine learning approaches. The objective of this research was therefore to develop a machine learning system with high computational speed and good accuracy for recognizing spray and non-spray areas for UAV-based sprayers. The system was developed using the mutual subspace method (MSM) on images collected from a UAV. Two target land types, agricultural croplands and orchard areas, were considered in building two classifiers for distinguishing spray and non-spray areas. Field experiments were conducted in the target areas to train and test the system using a commercial UAV (DJI Phantom 3 Pro) with an onboard 4K camera. Images were collected from low (5 m) and high (15 m) altitudes for croplands and orchards, respectively. The recognition system was divided into offline and online systems. In the offline recognition system, 74.4% accuracy was obtained for the classifiers in recognizing spray and non-spray areas for croplands. For orchards, the average classifier recognition accuracy of spray and non-spray areas was 77%. The online recognition system had an average accuracy of 65.1% for croplands and 75.1% for orchards. The computational time of the online recognition system was minimal, averaging 0.0031 s per classifier recognition. The developed machine learning system had an average recognition accuracy of 70% and can be implemented in an autonomous UAV spray system for recognizing spray and non-spray areas in real-time applications.
Keywords: image classifiers; machine learning system; mutual subspace method; precision agriculture; recognition system
Year: 2019 PMID: 30646586 PMCID: PMC6359728 DOI: 10.3390/s19020313
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Figure 1. Subspace method (SM).
Figure 2. Comparison between two sets of images using the mutual subspace method (MSM).
Figure 3. The research framework for establishing the classifiers and the MSM.
Training and testing with datasets classified into two categories for offline and online recognition systems.
| Targets | Spray Images | Nonspray Images | Training (Offline) | Training (Online) | Testing (Offline) | Testing (Online) |
|---|---|---|---|---|---|---|
| Carrot | 120 | 120 | First half (60 + 60) | All (120 + 120) | Last half (60 + 60) | New video |
| Cabbage | 198 | 198 | First half (99 + 99) | All (198 + 198) | Last half (99 + 99) | New video (298) |
| Onion | 107 | 107 | First half (53 + 53) | All (107 + 107) | Last half (54 + 54) | New video (204) |
| Chestnut | 97 | 97 | First half (48 + 48) | All | Last half (49 + 49) | New video (180) |
| Persimmon | 94 | 94 | First half (47 + 47) | All | Last half (47 + 47) | New video (210) |
| Trees and Structures | 118 | 118 | First half (59 + 59) | All (118 + 118) | Last half (59 + 59) | New video (141) |
Figure 4. (a–l) Training and testing datasets for building the classifiers for recognizing spray areas and non-spray areas.
Figure 5. Image sets in classifier recognition in the learning and recognition phases for MSM applications.
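The MSM compares an input image set against a learned reference set by measuring the canonical angles between their principal subspaces. The following is a minimal sketch of that similarity computation, assuming each image set is supplied as a matrix of column feature vectors; the function names and subspace dimension are illustrative, not the authors' implementation:

```python
import numpy as np

def subspace_basis(X, k):
    """Orthonormal basis of the k-dimensional principal subspace
    of the data matrix X (features x samples), via SVD."""
    U, _, _ = np.linalg.svd(X, full_matrices=False)
    return U[:, :k]

def msm_similarity(X1, X2, k=5):
    """MSM similarity between two image sets: the squared cosine of the
    smallest canonical angle between their k-dimensional subspaces.
    The singular values of U1^T U2 are the cosines of the canonical angles."""
    U1 = subspace_basis(X1, k)
    U2 = subspace_basis(X2, k)
    cosines = np.linalg.svd(U1.T @ U2, compute_uv=False)
    return cosines[0] ** 2
```

A classifier then assigns an input set to whichever class (spray or non-spray) yields the higher similarity; identical subspaces score 1, mutually orthogonal subspaces score 0.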
Accuracy analysis for the offline recognition system.
| | True: Spray | True: Nonspray | ∑ Total |
|---|---|---|---|
| Predicted: Spray | True Positive | False Positive | Total Positive |
| Predicted: Nonspray | False Negative | True Negative | Total Negative |

Accuracy = (True Positive + True Negative) / ∑ Total
Figure 6. Online recognition system for the classification of spraying, based on MSM classifiers.
Offline classifier recognition and accuracy analysis.
| Location | Predicted | Cropland: True Spray | Cropland: True Nonspray | Orchard: True Spray | Orchard: True Nonspray |
|---|---|---|---|---|---|
| L1 | Spray | 74 | 21 | 35 | 9 |
| L1 | Nonspray | 16 | 79 | 13 | 31 |
| L1 | Accuracy | 80.5% | | 75% | |
| L2 | Spray | 38 | 11 | 41 | 2 |
| L2 | Nonspray | 18 | 31 | 10 | 33 |
| L2 | Accuracy | 70.4% | | 86.1% | |
| L3 | Spray | 56 | 0 | 37 | 18 |
| L3 | Nonspray | 31 | 25 | 15 | 40 |
| L3 | Accuracy | 72.3% | | 70% | |

(L1: a farm with a combination of croplands and orchards; L2: a farm with different croplands and orchards; L3: a research farm with croplands and orchards)
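Accuracy in these tables is the standard confusion-matrix ratio (TP + TN) / total; checking it against the L1 counts above reproduces the reported percentages:

```python
def accuracy(tp, fp, fn, tn):
    """Fraction of correctly classified images from confusion-matrix counts."""
    return (tp + tn) / (tp + fp + fn + tn)

# L1 confusion-matrix counts from the offline table
print(round(accuracy(74, 21, 16, 79) * 100, 1))  # cropland: 80.5
print(round(accuracy(35, 9, 13, 31) * 100, 1))   # orchard: 75.0
```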
Extended datasets for the training and testing of classifiers, using an offline recognition system.
| Croplands and Orchards | Spray Images | Nonspray Images | Training (Offline) | Testing (Offline) | Accuracy |
|---|---|---|---|---|---|
| Carrot | 256 | 256 | First half (128 + 128) | Last half (128 + 128) | 73.79% |
| Cabbage | 440 | 440 | First half (220 + 220) | Last half (220 + 220) | 81.25% |
| Onion | 210 | 210 | First half (105 + 105) | Last half (105 + 105) | 66.32% |
| Chestnut | 224 | 224 | First half (112 + 112) | Last half (112 + 112) | 77.31% |
| Persimmon | 248 | 248 | First half (124 + 124) | Last half (124 + 124) | 70.94% |
| Trees and Structures | 216 | 216 | First half (108 + 108) | Last half (108 + 108) | 64.58% |
Figure 7. (a–c) Online recognition performance of a cropland classifier from a height of 5 m.
Figure 8. (a–c) Online recognition performance of an orchard classifier from a height of 15 m.
Online classifier recognition and accuracy analysis.
| Croplands and Orchards | Flying Height (m) | Accuracy (%) | Classifier Recognition Time (s) |
|---|---|---|---|
| Carrot | 5 | 65.51 | 0.0031 |
| Cabbage | 5 | 60.88 | 0.0048 |
| Onion | 5 | 69.00 | 0.0031 |
| Chestnut | 15 | 69.10 | 0.0031 |
| Persimmon | 15 | 82.21 | 0.0031 |
| Trees and Structures | 15 | 74.10 | 0.0031 |
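The per-classifier accuracies in this table average to the figures quoted in the abstract (65.1% for croplands, 75.1% for orchards); a quick check:

```python
cropland = [65.51, 60.88, 69.00]  # carrot, cabbage, onion (5 m)
orchard = [69.10, 82.21, 74.10]   # chestnut, persimmon, trees/structures (15 m)
print(round(sum(cropland) / len(cropland), 1))  # 65.1
print(round(sum(orchard) / len(orchard), 1))    # 75.1
```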