Shichao Jin, Yanjun Su, Shang Gao, Fangfang Wu, Tianyu Hu, Jin Liu, Wenkai Li, Dingchang Wang, Shaojiang Chen, Yuanxi Jiang, Shuxin Pang, Qinghua Guo.
Abstract
The rapid development of light detection and ranging (Lidar) provides a promising way to obtain three-dimensional (3D) phenotypic traits, thanks to its ability to record accurate 3D laser points. Recently, Lidar has been widely used, along with other sensors, to obtain phenotype data in the greenhouse and field. Individual maize segmentation is the prerequisite for high-throughput phenotype extraction at the individual crop or leaf level, and it remains a major challenge. Deep learning, a state-of-the-art machine learning method, has shown high performance in object detection, classification, and segmentation. In this study, we proposed a method combining deep learning and regional growth algorithms to segment individual maize plants from terrestrial Lidar data. The scanned 3D points of the training site were sliced row by row with a fixed 3D window. Points within each window were compressed into deep images, which were used to train a Faster R-CNN (region-based convolutional neural network) model to detect maize stems. Three sites of different planting densities were used to test the method. Each site was likewise sliced into many 3D windows, and the testing deep images were generated. The stems detected in the testing images were mapped back to 3D points, which served as seed points for the regional growth algorithm to grow individual maize plants from bottom to top. The results showed that the method combining deep learning and regional growth algorithms was promising for individual maize segmentation: the R, P, and F values of the three testing sites with different planting densities were all over 0.9. Moreover, the heights of the correctly segmented maize plants were highly correlated with the manually measured heights (R² > 0.9). This work shows the possibility of using deep learning to solve the individual maize segmentation problem from Lidar data.
Keywords: Lidar (light detection and ranging); classification; deep learning; detection; phenotype; segmentation
Year: 2018 PMID: 29988466 PMCID: PMC6024748 DOI: 10.3389/fpls.2018.00866
Source DB: PubMed Journal: Front Plant Sci ISSN: 1664-462X Impact factor: 5.753
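The regional growth step described in the abstract (seed points from detected stems, then bottom-to-top growth) can be sketched as follows. This is a minimal, hypothetical simplification assuming a simple distance-threshold growth rule; the function `grow_plants`, its parameters, and the synthetic data are illustrative, not the authors' implementation.

```python
import numpy as np

def grow_plants(points, seeds, radius=0.05):
    """Grow individual plants bottom-up from detected stem seed points.

    points : (N, 3) array of Lidar returns (x, y, z)
    seeds  : (K, 3) array, one seed per detected stem
    radius : maximum gap (m) allowed when attaching a point to a plant

    Returns an (N,) label array; -1 marks points left unassigned.
    Simplified stand-in for the paper's regional growth algorithm.
    """
    points = np.asarray(points, dtype=float)
    labels = np.full(len(points), -1, dtype=int)
    # Each plant starts as the list containing its seed coordinates.
    plants = [[np.asarray(s, dtype=float)] for s in seeds]
    # Visit points from the ground upward ("bottom to top" growth).
    for idx in np.argsort(points[:, 2]):
        p = points[idx]
        best, best_d = -1, np.inf
        for k, grown in enumerate(plants):
            # Distance from this point to the nearest point already grown.
            d = min(np.linalg.norm(p - q) for q in grown)
            if d <= radius and d < best_d:
                best, best_d = k, d
        if best >= 0:
            labels[idx] = best
            plants[best].append(p)
    return labels
```

In the pipeline described above, each seed would be a stem detection from the Faster R-CNN step mapped back into 3D; points that never come within `radius` of any growing plant are left unlabeled.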
Specifications of the Lidar sensor used in this study.
| Sensor | FARO Focus3D X 330 HDR |
|---|---|
| Laser wavelength (nm) | 1550 |
| Laser beam divergence (mrad) | 0.19 |
| Field of view (°) | Horizontal: 360°; Vertical: 300° |
| Angular resolution (°) | Horizontal: 0.009°; Vertical: 0.009° |
| Detection range (m) | 0.6–130 m indoor or outdoor with upright incidence to a 90% reflective surface |
| Pulse rate (kHz) | 244 |
| Maximum scanning rate (Hz) | 97 |
| Scanning accuracy | 0.3 mm @ 10 m @ 90% reflectance |
| Scanner weight (kg) | 5.2 |
| Dimensions (mm) | 240 × 200 × 100 |
| Laser class | Laser class 1 |
| Beam diameter at exit (mm) | 2.25 |
Dataset information for the training site and the three testing sites with different planting densities.
| Dataset | Site | Number of maize | Area (m²) | Plant density (plants/m²) | Min height (m) | Mean height (m) | Max height (m) |
|---|---|---|---|---|---|---|---|
| Training | | 337 | 100.48 | 3.35 | 0.09 | 0.34 | 0.68 |
| Testing | Sparse | 62 | 23.05 | 2.69 | 0.13 | 0.32 | 0.49 |
| Testing | Moderate | 71 | 11.48 | 6.19 | 0.13 | 0.29 | 0.69 |
| Testing | Dense | 88 | 9.81 | 8.96 | 0.14 | 0.35 | 0.73 |
Accuracy assessment of individual maize segmentation on the three testing datasets with different planting densities (TP: true positive; FP: false positive; FN: false negative; R: recall; P: precision; F: F-score).
| Site | TP | FP | FN | R | P | F |
|---|---|---|---|---|---|---|
| Sparse | 59 | 4 | 3 | 0.95 | 0.93 | 0.94 |
| Moderate | 66 | 2 | 5 | 0.93 | 0.97 | 0.95 |
| Dense | 81 | 6 | 7 | 0.92 | 0.93 | 0.93 |
| Overall | 206 | 12 | 15 | 0.93 | 0.94 | 0.94 |