Dionisio Andújar, José Dorado, César Fernández-Quintanilla, Angela Ribeiro.
Abstract
The use of depth cameras in precision agriculture is increasing steadily. This type of sensor has been used to characterize the plant structure of several crops. However, discriminating small plants, such as weeds, within agricultural fields remains a challenge. Improvements in the new Microsoft Kinect v2 sensor make it possible to capture plant details. A dual methodology combining height selection and RGB (Red, Green, Blue) segmentation can separate crops, weeds, and soil. This paper explores the possibilities of this sensor, using Kinect Fusion algorithms to reconstruct 3D point clouds of weed-infested maize crops under real field conditions. The processed models showed good consistency between the 3D depth images and ground measurements of the actual structural parameters. Maize plants were identified in the samples by height selection of the connected faces and showed a correlation of 0.77 with maize biomass. The lower height of the weeds made RGB recognition necessary to separate them from the soil microrelief of the samples, achieving a good correlation of 0.83 with weed biomass. In addition, weed density showed good correlation with volumetric measurements. The canonical discriminant analysis showed promising results for classification into monocots and dicots. These results suggest that estimating volume using the Kinect methodology can be a highly accurate method for crop status determination and weed detection. It offers several possibilities for the automation of agricultural processes through the construction of a new system integrating these sensors and the development of algorithms to properly process the information they provide.
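The dual height/RGB methodology described in the abstract can be sketched roughly as follows. This is a minimal illustration only: the point-cloud layout (an N×6 XYZRGB array), the height threshold, the excess-green index, and both threshold values are assumptions for the sketch, not the paper's actual Kinect Fusion pipeline.

```python
import numpy as np

def segment_cloud(points, crop_height=0.25, exg_thresh=20.0):
    """Split an XYZRGB point cloud into maize, weed, and soil points.

    points: (N, 6) array of [x, y, z, r, g, b], with z the height
    above ground in meters. crop_height and exg_thresh are
    hypothetical thresholds chosen for illustration.
    """
    z = points[:, 2]
    r, g, b = points[:, 3], points[:, 4], points[:, 5]
    # Height selection: points taller than the threshold are taken as maize.
    maize = z > crop_height
    # Excess-green index (2G - R - B) separates low green vegetation
    # (weeds) from the soil microrelief among the remaining points.
    exg = 2 * g - r - b
    weed = ~maize & (exg > exg_thresh)
    soil = ~maize & ~weed
    return points[maize], points[weed], points[soil]
```

In this sketch the height rule runs first, so only the low points are passed to the color rule, mirroring the paper's observation that weeds are too short for height selection alone.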
Keywords: Kinect v2; maize; plant volume estimation; weed detection; weed/crop structure characterization
Year: 2016 PMID: 27347972 PMCID: PMC4970024 DOI: 10.3390/s16070972
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Figure 1. Schematic design of the system with the components integrated for maize-weed detection: (1) portable gasoline generator; (2) laptop; (3) ATV; (4) Kinect v2 sensor; (5) support structure.
Figure 2. (a) Some frames located in the experimental field; (b) example of a mesh obtained with the Kinect v2 sensor.
Figure 3. Data processing structure of the Kinect sensor information.
Figure 4. An example of maize isolation and removal by height selection, and an example of color selection for weed extraction.
Figure 5. Linear regression between the total volume estimated using the Kinect v2 sensor and the total biomass (a), and between the maize volume and the maize biomass (b). R² denotes the coefficient of determination of the simple regression.
Figure 6. Simple linear regression between the total weed volume calculated with the depth and color images and the weed biomass weight.
Confusion matrix of the canonical discriminant classification, showing the percentage of correct group classifications for the three predefined groups.

| Actual \ Predicted | Monocots | Dicots | Mixture |
|---|---|---|---|
| Monocots | 53.8 | 7.7 | 38.5 |
| Dicots | 0.0 | 100.0 | 0.0 |
| Mixture | 50.0 | 0.0 | 50.0 |
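As a quick check, the row percentages in the matrix above can be summarized into per-class and mean accuracy. This is a minimal sketch that assumes equal group sizes, which the record does not report, so the mean is illustrative only.

```python
import numpy as np

# Rows: actual Monocots, Dicots, Mixture; columns: predicted class (%).
cm = np.array([
    [53.8,   7.7, 38.5],
    [ 0.0, 100.0,  0.0],
    [50.0,   0.0, 50.0],
])

per_class = np.diag(cm)        # correct-classification rate per group
mean_acc = per_class.mean()    # unweighted mean, assuming equal group sizes
print(per_class, round(mean_acc, 1))
```

The diagonal shows the pattern reported in the matrix: dicots are classified perfectly, while monocots and mixtures are frequently confused with each other.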