Andrea Santangeli, Yuxuan Chen, Edward Kluen, Raviteja Chirumamilla, Juha Tiainen, John Loehr.
Abstract
In conservation, the use of unmanned aerial vehicles (drones) carrying various sensors and the use of deep learning are increasing, but they are typically used independently of each other. Unlocking their large potential requires integrating these tools. We combine drone-borne thermal imaging with artificial intelligence to locate ground-nests of birds on agricultural land. We show, for the first time, that this semi-automated system can identify nests with high performance. However, local weather, type of arable field and height of the drone can affect performance. The results' implications are particularly relevant to conservation practitioners working across sectors, such as biodiversity conservation and food production in farmland. In a rapidly changing world, studies like this can help uncover the potential of technology for conservation and embrace cross-sectoral transformations from the onset; for example, by integrating nest detection within precision agriculture systems, which rely heavily on drone-borne sensors.
Year: 2020 PMID: 32665596 PMCID: PMC7360548 DOI: 10.1038/s41598-020-67898-3
Source DB: PubMed Journal: Sci Rep ISSN: 2045-2322 Impact factor: 4.379
Figure 1. Training performance as measured by the proportion of images correctly identified by the neural network in relation to the number of epochs. Performance is shown based on four metrics: precision (A), recall (B), mAP (C) and F1 score (D). See “Methods” section for a detailed description of each of the performance metrics. Figure created in R software version 3.6.1 (www.r-project.org).
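For reference, the per-image metrics in panels A, B and D are standard functions of true-positive, false-positive and false-negative counts. A minimal sketch in Python (the counts below are illustrative, not the study's data; mAP additionally averages precision over recall thresholds and detection classes, which is omitted here):

```python
def precision(tp, fp):
    # Proportion of predicted nests that are real nests.
    return tp / (tp + fp)

def recall(tp, fn):
    # Proportion of real nests that the network detects.
    return tp / (tp + fn)

def f1_score(tp, fp, fn):
    # Harmonic mean of precision and recall.
    p, r = precision(tp, fp), recall(tp, fn)
    return 2 * p * r / (p + r)

# Illustrative counts only, not taken from the study:
tp, fp, fn = 80, 20, 10
print(round(precision(tp, fp), 3))    # 0.8
print(round(recall(tp, fn), 3))       # 0.889
print(round(f1_score(tp, fp, fn), 3)) # 0.842
```

The F1 score rewards a balance between the two error types: it stays low if either precision or recall is low, which is why it is often tracked alongside mAP during training.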
Table 1. The relationship between probability of occurrence of false presences (A) and false absences (B) and a set of uncorrelated environmental covariates.

| | β | SE | z | p |
|---|---|---|---|---|
| **(A) False presence** | | | | |
| Intercept | −1.767 | 0.133 | 13.26 | < 0.001 |
| Cloud cover | −0.005 | 0.001 | 3.63 | < 0.001 |
| Temperature | 0.028 | 0.011 | 2.50 | 0.013 |
| Wind speed | 0.020 | 0.018 | 1.09 | 0.276 |
| Substrate (un-ploughed) | −0.199 | 0.089 | 2.24 | 0.025 |
| Drone height (25 m) | −0.053 | 0.087 | 0.61 | 0.542 |
| **(B) False absence** | | | | |
| Intercept | −0.867 | 0.737 | 1.18 | 0.240 |
| Cloud cover | −0.002 | 0.007 | 0.31 | 0.759 |
| Temperature | −0.098 | 0.062 | 1.58 | 0.113 |
| Substrate (un-ploughed) | −0.453 | 0.392 | 1.15 | 0.250 |
| Drone height (25 m) | 0.974 | 0.460 | 2.12 | 0.035 |
Effect sizes (β), standard errors (SE), test statistics (z) and p values are derived from averaging across the set of best-supported models (reported in Table S1), run separately for false presence and false absence (see “Methods” section for more details). For the two categorical variables, each with two classes, results consider un-ploughed fields as the reference level for the substrate variable (compared to ploughed substrate), and 25 m as the reference level for the drone height variable (compared to 15 m).
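The coefficients in Table 1 come from binomial (logistic) models, so a predicted probability is obtained by passing the linear predictor through the inverse logit. A sketch for the false-presence part (A), assuming covariates enter on their raw scales (the study's exact scaling/standardization is not stated here, so the inputs and output are illustrative only):

```python
import math

def inv_logit(x):
    # Inverse logit: maps a linear predictor to a probability in (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def p_false_presence(cloud_cover, temperature, wind_speed,
                     unploughed=False, height_25m=False):
    # Model-averaged coefficients for false presence (Table 1, part A).
    # Categorical terms are 0/1 indicators relative to the reference levels.
    eta = (-1.767
           - 0.005 * cloud_cover
           + 0.028 * temperature
           + 0.020 * wind_speed
           - 0.199 * unploughed
           - 0.053 * height_25m)
    return inv_logit(eta)

# Illustrative scenario: 50% cloud cover, 15 °C, 3 m/s wind,
# ploughed field, 15 m flight height.
print(round(p_false_presence(50, 15, 3), 3))  # 0.177
```

The signs match the patterns in Figure 2: more cloud cover lowers the probability of a false presence, while warmer air raises it.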
Figure 2. The probability of occurrence of false presences, whereby the deep learning algorithm erroneously identified a nest in images without a nest, in relation to cloud cover (A) and air temperature (B). Lines depict the mean probability and associated 95% confidence interval (grey shading) as derived from the best supported model obtained from model selection (results shown in Table 1). Data are shown by the ticks on the horizontal axes. Figure created in R software version 3.6.1 (www.r-project.org).
Figure 3. A schematic representation of the integration of a system for nest location within the practice of precision agriculture, which makes extensive use of drones carrying sensors to map soil humidity, the need for fertilizer and other chemical inputs, and so on across fields. We envisage that thermal sensors can be added to, or complement, the existing set in order to acquire thermal images that are processed by artificial intelligence (AI) specifically trained to automatically locate nests from thermal images. The geo-location of identified nests will then be transferred to the tractor, which can be automatically programmed to avoid the nest during field operations, e.g. by changing direction or lifting the sower/harvester. The satellite image is taken from Google maps (www.google.it/maps, map data: Google). Figure created in Microsoft Office PowerPoint 2016 (www.microsoft.com).
Figure 4. Schematic representation of the key steps of this study, from an extensive nest search at selected fields (A), to flying the drone carrying the thermal sensor along a pre-programmed route over the field (B), preparing the thermal images by extracting the coordinates of the box drawn around the nest (C), and finally applying a neural network deep learning algorithm to classify images as having or not having a nest (D). The satellite image in (A) and (B) is taken from Google maps (www.google.it/maps, map data: Google). Figure created in Microsoft Office PowerPoint 2016 (www.microsoft.com).
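Step (C) above, extracting the coordinates of the box drawn around the nest, amounts to converting a pixel-space box into a machine-readable annotation. A minimal sketch, assuming the normalized center-based format commonly used by object-detection networks (the paper's exact annotation format and image dimensions are not stated here; the values below are hypothetical):

```python
def box_to_normalized(x_min, y_min, x_max, y_max, img_w, img_h):
    # Convert a pixel-space nest box to normalized (x_center, y_center,
    # width, height), each in [0, 1] relative to the image dimensions.
    x_c = (x_min + x_max) / 2.0 / img_w
    y_c = (y_min + y_max) / 2.0 / img_h
    w = (x_max - x_min) / img_w
    h = (y_max - y_min) / img_h
    return x_c, y_c, w, h

# Hypothetical nest box on a 640x512 thermal frame:
print(box_to_normalized(300, 200, 340, 240, 640, 512))
```

Normalizing by image size makes the annotation independent of sensor resolution, so the same training pipeline can handle frames captured at different drone heights without rescaling labels.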