Guy Coleman¹, William Salter², Michael Walsh².
Abstract
The use of a fallow phase is an important tool for maximizing crop yield potential in moisture-limited agricultural environments, with a focus on removing weeds to optimize fallow efficiency. Repeated whole-field herbicide treatments to control low-density weed populations are expensive and wasteful. Site-specific herbicide applications to low-density fallow weed populations are currently facilitated by proprietary, sensor-based spray booms. The use of image analysis for fallow weed detection is an opportunity to develop a system with potential for in-crop weed recognition. Here we present OpenWeedLocator (OWL), an open-source, low-cost, image-based device for fallow weed detection that improves the accessibility of this technology for the weed control community. A comprehensive GitHub repository was developed, promoting community engagement with site-specific weed control methods. Validation of OWL as a low-cost tool was performed using four existing colour-based algorithms over seven fallow fields in New South Wales, Australia. The four algorithms were similarly effective in detecting weeds, with an average precision of 79% and recall of 52%. In individual transects, up to 92% precision and 74% recall indicate the performance potential of OWL in fallow fields. OWL represents an opportunity to redefine the approach to weed detection by enabling community-driven technology development in agriculture.
Year: 2022 PMID: 34996963 PMCID: PMC8741824 DOI: 10.1038/s41598-021-03858-9
Source DB: PubMed Journal: Sci Rep ISSN: 2045-2322 Impact factor: 4.379
Figure 1 Overview of the OpenWeedLocator (OWL) (a) software and (b) hardware, which combines weed detection with an actionable output. Detection is performed on a Raspberry Pi 4 (8 GB) with an HQ camera; actuation uses 12 V relays on the relay control board. A real-time clock (RTC) module is used for accurate timekeeping. A 12 V DC source is required to power the system, with a voltage regulator providing 5 V power for the Raspberry Pi computer. A six-pin weatherproof connector is used to connect the OWL unit to the 12 V power supply and to connect the relays to four external devices. The buzzer and LEDs provide status information.
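As a rough illustration of how a detection could be turned into an actionable output on one of the four relay-connected devices, the sketch below maps a weed's horizontal position in the frame to a relay index. The 416-pixel frame width comes from the processing resolution noted in Figure 5; the even four-lane split, function name, and parameters are assumptions for illustration, not OWL's actual code.

```python
# Hypothetical sketch: map a detection's horizontal centre to one of the
# four relay outputs on the OWL relay control board. The even lane split
# and all names here are illustrative assumptions.

FRAME_WIDTH = 416  # processing resolution from the Figure 5 caption
NUM_RELAYS = 4     # four external devices on the relay control board

def relay_for_detection(centre_x: int,
                        frame_width: int = FRAME_WIDTH,
                        num_relays: int = NUM_RELAYS) -> int:
    """Return the relay index (0-3) whose lane contains the weed centre."""
    if not 0 <= centre_x < frame_width:
        raise ValueError("detection centre outside the frame")
    lane_width = frame_width / num_relays
    # Clamp to the last relay in case centre_x lands exactly on the edge.
    return min(int(centre_x // lane_width), num_relays - 1)
```

For example, a weed centred at x = 208 (just right of the frame midline) would trigger the third relay (index 2) under this assumed layout.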
Figure 2 Comparison of the weed detection performance metrics precision and recall across the ExG, NExG, HSV and ExHSV algorithms. Values presented are based on all seven field sites visited, to indicate variability. Boxplots show the median and interquartile range (boxes), with lines indicating the range and points marking outliers (values more than 1.5 times the interquartile range).
Summary of algorithm performance across seven field test sites during the day (n = 5) and night (n = 2) with artificial lighting using precision and recall.
| Field | ExG Precision (%) | ExG Recall (%) | NExG Precision (%) | NExG Recall (%) | HSV Precision (%) | HSV Recall (%) | ExHSV Precision (%) | ExHSV Recall (%) |
|---|---|---|---|---|---|---|---|---|
| | 18.0 | 27.7 | 75.8 | 22.6 | 76.3 | 71.6 | | |
| | 46.5 | 90.2 | 53.3 | 87.7 | 50.5 | 54.4 | | |
| | 91.8 | 95.1 | 47.8 | 96.6 | 39.9 | 47.7 | | |
| | 70.2 | 90.9 | 36.1 | 91.0 | 48.8 | 43.4 | | |
| | 98.0 | 39.3 | 98.0 | 20.2 | 23.1 | | | |
| | 64.5 | 47.7 | 42.5 | 23.8 | 80.6 | 34.7 | | |
| | 96.4 | 62.7 | – | – | 35.7 | 90.6 | | |
The highest result for each performance metric within each field is bolded.
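The precision and recall values in the table above follow the standard definitions from true-positive, false-positive and false-negative counts. A minimal sketch (the function name and the example counts are illustrative, not taken from the paper):

```python
# Precision = TP / (TP + FP): of everything flagged as a weed, how much was
# really a weed. Recall = TP / (TP + FN): of all real weeds, how many were
# flagged. Both expressed as percentages, matching the table.

def precision_recall(tp: int, fp: int, fn: int) -> tuple[float, float]:
    """Return (precision, recall) as percentages; 0.0 when undefined."""
    precision = 100 * tp / (tp + fp) if tp + fp else 0.0
    recall = 100 * tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

# Illustrative counts only: 46 weeds correctly flagged, 4 false alarms,
# 42 weeds missed.
p, r = precision_recall(46, 4, 42)  # → 92.0% precision, ~52.3% recall
```

High precision with lower recall, as in several transects above, means the detector rarely sprays bare ground but misses a share of the weeds present.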
Figure 3 Representative weeds that were either correctly detected (green) or missed (red) at each of the seven field sites, based on the ExHSV algorithm. Images of the weeds shown were taken directly from concurrent video collected with a Samsung S8 phone camera and have not been rescaled, so relative sizes are accurate.
Figure 4 Representative images of the variable background and lighting conditions for the seven image collection scenarios, (a) HEN1, (b) HEN2, (c) NIGHT1, (d) NIGHT2, (e) WAG1, (f) WAG2 and (g) COB1, used to evaluate the performance of colour-based weed detection.
Summary of field locations, weed species, background conditions, weed growth stage range and image collection speeds (n = 5) in fields used for video data collection and analysis.
| Field ID | Location | Coordinates | Light conditions | Background | Weeds present | Weed density (plants m−2) | Weed growth stages | Average speed (m s−1 ± SE) |
|---|---|---|---|---|---|---|---|---|
| HEN1 | Henty, NSW | − 35.517102, 147.034436 | Clear, morning full sun | Canola stubble, red–orange soil | Annual sowthistle | 3.1 | 2-leaf to flowering | 1.14 ± 0.02 |
| HEN2 | Henty, NSW | − 35.517102, 147.034436 | Clear, afternoon full sun | Heavy wheat stubble, red soil | Volunteer wheat | 9.3 | 2-leaf to late tillering | 1.16 ± 0.01 |
| WAG1 | Wagga Wagga, NSW | − 35.056986, 147.351146 | Clear, morning full sun | Lupin stubble, red–orange soil | Volunteer narrowleaf lupins | 18.7 | 2-leaf to flowering | 1.14 ± 0.01 |
| WAG2 | Wagga Wagga, NSW | − 35.056986, 147.351146 | Clear, morning full sun | Grazed barley stubble | Volunteer barley | 3.3 | 2-leaf to 8-leaf | 1.24 ± 0.03 |
| COB1 | Cobbitty, NSW | − 34.021914, 150.662655 | Overcast | Dark brown soil, freshly tilled, no soil cover | Wild radish | 9.8 | Cotyledon to 6-leaf | 1.07 ± 0.01 |
| NIGHT1 | Culcairn, NSW | − 35.667692, 147.036800 | Night | Canola stubble | Annual sowthistle, khaki weed | 9.6 | 2-leaf to flowering | 1.23 ± 0.01 |
| NIGHT2 | Cobbitty, NSW | − 34.021914, 150.662655 | Night | Dark brown soil, freshly tilled, no soil cover | Wild radish, fumitory, large crabgrass, billygoat weed, stagger weed | 7.8 | Cotyledon to 6-leaf | 0.83 ± 0.01 |
Figure 5 Overview of the frame-by-frame analysis process. Each 416 × 320 image is split into either red, green and blue (RGB) or hue, saturation and value (HSV) channels, and the ExG, NExG, ExHSV or HSV algorithm is applied. A defined threshold is applied to the processed image, followed by an adaptive threshold on the result (except for HSV, which is already binary), then contour detection and the generation of minimum enclosing rectangles for weed-centre calculation.
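The pipeline above can be sketched in numpy using the common excess green index (ExG = 2G − R − B). This is a minimal illustration only: it applies a single fixed threshold (omitting the adaptive threshold step), and a simple flood-fill stands in for OpenCV's contour detection; thresholds and all names are illustrative.

```python
import numpy as np

def exg(rgb: np.ndarray) -> np.ndarray:
    """Per-pixel excess green index ExG = 2G - R - B from an RGB uint8 image."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return 2 * g - r - b

def weed_centres(rgb, exg_min=13, exg_max=200, min_size=10):
    """Threshold ExG, group foreground pixels, return (x, y) box centres."""
    e = exg(rgb)
    mask = (e >= exg_min) & (e <= exg_max)
    seen = np.zeros(mask.shape, dtype=bool)
    centres = []
    for y0, x0 in zip(*np.nonzero(mask)):
        if seen[y0, x0]:
            continue
        stack, pixels = [(y0, x0)], []  # flood-fill one connected component
        seen[y0, x0] = True
        while stack:
            y, x = stack.pop()
            pixels.append((y, x))
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and not seen[ny, nx]):
                    seen[ny, nx] = True
                    stack.append((ny, nx))
        if len(pixels) >= min_size:  # minimum object size filter
            ys, xs = zip(*pixels)
            # centre of the minimum enclosing axis-aligned rectangle
            centres.append(((min(xs) + max(xs)) // 2,
                            (min(ys) + max(ys)) // 2))
    return centres
```

On a synthetic brown "soil" frame with a single green patch, the function returns one centre coordinate; on real imagery the adaptive threshold and proper contour detection in the actual pipeline would handle uneven lighting far better than this fixed cut-off.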
Threshold parameters used for each of the four algorithms, where relevant.
Day:

| Parameter | ExG/NExG Min | ExG/NExG Max | ExHSV Min | ExHSV Max | HSV Min | HSV Max |
|---|---|---|---|---|---|---|
| ExG | 13 | 200 | 13 | 200 | – | – |
| Hue | – | – | 30 | 92 | 35 | 84 |
| Saturation | – | – | 4 | 250 | 10 | 220 |
| Value | – | – | 15 | 250 | 50 | 200 |
| Object size (pixels) | 10 | – | 10 | – | 10 | – |

Night:

| Parameter | ExG/NExG Min | ExG/NExG Max | ExHSV Min | ExHSV Max | HSV Min | HSV Max |
|---|---|---|---|---|---|---|
| ExG | 29 | 200 | 29 | 200 | – | – |
| Hue | – | – | 30 | 92 | 45 | 80 |
| Saturation | – | – | 10 | 250 | 75 | 200 |
| Value | – | – | 60 | 250 | 46 | 240 |
| Object size (pixels) | 10 | – | 10 | – | 10 | – |
Values represent pixel intensities for zero-indexed 8-bit arrays with a range of 0–255. Pixel values outside these ranges were excluded, leaving only green pixels as the detected object. Separate thresholds were used for the day and night videos. A minimum object size, based on the area of each detected object, was applied to reduce noise. Values were selected manually to optimize algorithm performance.
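The HSV thresholds above amount to three per-channel range checks ANDed together. The sketch below shows that test, assuming OpenCV's 8-bit HSV convention (hue 0–179, saturation and value 0–255) and using the day HSV bounds from the table as defaults; the function name and array shapes are illustrative.

```python
import numpy as np

def hsv_mask(hsv, h=(35, 84), s=(10, 220), v=(50, 200)):
    """Boolean mask of pixels whose H, S and V all fall inside the given ranges.

    `hsv` is an (..., 3) array of 8-bit HSV pixel values; defaults are the
    day HSV thresholds from the table above.
    """
    bounds = np.array([h, s, v])       # shape (3, 2): per-channel min/max
    lo, hi = bounds[:, 0], bounds[:, 1]
    # Broadcasting compares each pixel's three channels against lo/hi at once.
    return np.all((hsv >= lo) & (hsv <= hi), axis=-1)
```

A pixel such as (H = 60, S = 100, V = 120) passes all three day ranges and is kept as green vegetation, while one with H = 10 (reddish soil tones) fails the hue check and is excluded, which is exactly the behaviour the footnote describes.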