Michael Buzzy, Vaishnavi Thesma, Mohammadreza Davoodi, Javad Mohammadpour Velni.
Abstract
The use of deep neural networks (DNNs) in plant phenotyping has recently received considerable attention. By using DNNs, valuable insights into plant traits can be readily achieved. While these networks have made considerable advances in plant phenotyping, the results are processed too slowly to allow for real-time decision-making. Therefore, being able to perform plant phenotyping computations in real-time has become a critical part of precision agriculture and agricultural informatics. In this work, we utilize state-of-the-art object detection networks to accurately detect, count, and localize plant leaves in real-time. Our work includes the creation of an annotated dataset of Arabidopsis plants captured using a Canon Rebel XS camera. These images and annotations have been compiled and made publicly available. This dataset is then fed into a Tiny-YOLOv3 network for training. The trained Tiny-YOLOv3 network is able to converge and accurately perform real-time localization and counting of the leaves. We also create a simple robotics platform based on an Android phone and an iRobot Create 2 to demonstrate the real-time capabilities of the network in the greenhouse. Additionally, a performance comparison is conducted between Tiny-YOLOv3 and Faster R-CNN. Unlike Tiny-YOLOv3, which performs localization and identification in a single pass through one network, Faster R-CNN requires two steps to do localization and identification. Tiny-YOLOv3 improves inference time, F1 score, and false positive rate (FPR) compared to Faster R-CNN, while other measures such as difference in count (DiC) and average precision (AP) are worse. Specifically, for our implementation of Tiny-YOLOv3, the inference time is under 0.01 s, the F1 score is over 0.94, and the FPR is around 24%. Last, transfer learning is used to adapt a Tiny-YOLOv3 model trained only on smaller leaves to detect larger leaves.
The main contributions of the paper are in creating a dataset (shared with the research community), as well as training a Tiny-YOLOv3 network for leaf localization and counting.
Keywords: You Only Look Once (YOLO) network; deep learning; plant leaf counting; real-time decision-making
Year: 2020 PMID: 33287100 PMCID: PMC7730908 DOI: 10.3390/s20236896
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Figure 1. The You Only Look Once (YOLO) framework takes an image as input to a deep convolutional neural network and outputs the leaf detections; the bottom part of the figure shows a diagram of the YOLO network architecture.
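The single-pass detection in the YOLO framework of Figure 1 hinges on decoding each raw grid-cell prediction into an image-space bounding box. The sketch below is not the authors' code; it shows the standard YOLOv3 decoding equations, with the grid size, input resolution, and anchor dimensions chosen as illustrative assumptions.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def decode_box(tx, ty, tw, th, cx, cy, anchor_w, anchor_h, grid=13, img=416):
    """Map one raw YOLO grid-cell prediction (tx, ty, tw, th) from
    cell (cx, cy) to an image-space box (center x, center y, w, h) in pixels."""
    bx = (sigmoid(tx) + cx) / grid * img   # sigmoid keeps the center inside its cell
    by = (sigmoid(ty) + cy) / grid * img
    bw = anchor_w * math.exp(tw)           # anchor prior scaled by exp(tw)
    bh = anchor_h * math.exp(th)
    return bx, by, bw, bh

# zero raw outputs at the center cell -> box centered on the image,
# sized exactly like the anchor prior
box = decode_box(0.0, 0.0, 0.0, 0.0, cx=6, cy=6, anchor_w=81.0, anchor_h=82.0)
```

With all raw outputs at zero, the decoded box sits at the center of the (assumed) 416x416 input and takes the anchor's width and height, which is why YOLO-family networks regress offsets rather than absolute coordinates.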
Figure 2. Deployed platform for data acquisition and real-time data processing.
Figure 3. The general pipeline of our proposed architecture, from generating the dataset to training the model. Different blocks are explained in more detail in the ensuing section.
Figure 4. Data ingest station.
Figure 5. One sample plant and its annotation from our generated dataset.
Figure 6. The leaf counting output.
Figure 7. Scatter plot comparing the true leaf count vs. the leaf count estimated by the YOLO model.
Network evaluation metrics.
| Metric | Tiny-YOLOv3 | Faster R-CNN |
|---|---|---|
| DiC | 0.25 | 0.0556 |
| \|DiC\| | 0.8056 | 1.2778 |
| MSE | 2.0833 | 2.8889 |
| %Agreement | 56% | 27.78% |
| AP (@.5) | 0.583 | 0.600 |
| Accuracy | 0.88846 | 0.83088 |
| Precision | 0.97059 | 0.91129 |
| F1 Score | 0.94467 | 0.89866 |
| TPR | 91.304% | 90.4% |
| FPR | 24.138% | 47.826% |
| Inference time (s) | 0.009225 | 0.917535 |
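The count-based metrics in the tables (DiC, |DiC|, MSE, %Agreement) can all be derived from per-image true vs. predicted leaf counts. The following is a minimal sketch under the standard definitions used in leaf-counting benchmarks; the function name and toy counts are illustrative assumptions, not the paper's data.

```python
def count_metrics(true_counts, pred_counts):
    """Leaf-counting metrics from paired per-image counts."""
    diffs = [p - t for t, p in zip(true_counts, pred_counts)]
    n = len(diffs)
    return {
        "DiC": sum(diffs) / n,                                # mean signed difference
        "absDiC": sum(abs(d) for d in diffs) / n,             # mean absolute difference
        "MSE": sum(d * d for d in diffs) / n,                 # mean squared error
        "%Agreement": 100 * sum(d == 0 for d in diffs) / n,   # share of exact matches
    }

# toy example: four images, two counted exactly, one over, one under
m = count_metrics([5, 6, 4, 7], [5, 7, 4, 6])
```

Note that DiC can be near zero even when individual counts are wrong, since over- and under-counts cancel; that is why |DiC| and %Agreement are reported alongside it.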
Source domain network evaluation metrics.
| Metric | Tiny-YOLOv3 |
|---|---|
| \|DiC\| | 0.575 |
| MSE | 1.075 |
| TPR (%) | 93.4% |
| FPR (%) | 11.7% |
| F1 Score | 0.961 |
Network evaluation metrics for testing source and target differences.
| Metric | Tiny-YOLOv3 |
|---|---|
| \|DiC\| | 0.938 |
| MSE | 1.788 |
| TPR (%) | 91% |
| FPR (%) | 23% |
| F1 Score | 0.94 |
Target domain network evaluation metrics.
| Metric | Tiny-YOLOv3 |
|---|---|
| \|DiC\| | 1.15 |
| MSE | 1.15 |
| TPR (%) | 87% |
| FPR (%) | 5% |
| F1 Score | 0.93 |