Negin Katal, Michael Rzanny, Patrick Mäder, Jana Wäldchen.
Abstract
Climate change represents one of the most critical threats to biodiversity, with far-reaching consequences for species interactions, the functioning of ecosystems, and the assembly of biotic communities. Plant phenology research has gained increasing attention, as the timing of periodic events in plants is strongly affected by seasonal and interannual climate variation. Recent technological developments have made it possible to gather invaluable data at a variety of spatial and ecological scales. The feasibility of phenological monitoring today and in the future depends heavily on developing tools capable of efficiently analyzing these enormous amounts of data. Deep neural networks learn representations from data with impressive accuracy and have led to significant breakthroughs in, for example, image processing. This article is the first systematic literature review that thoroughly analyzes all primary studies on deep learning approaches in plant phenology research. In a multi-stage process, we selected 24 peer-reviewed studies published in the last five years (2016-2021). After carefully analyzing these studies, we describe the applied methods, categorized according to the studied phenological stages, vegetation type, spatial scale, data acquisition method, and deep learning method. Furthermore, we identify and discuss research trends and highlight promising future directions. We present a systematic overview of methods previously applied to different tasks that can guide this emerging and complex research field.
Keywords: PhenoCams; deep learning; drones; herbarium specimen; machine learning; phenology; phenology monitoring; remote sensing
Year: 2022 PMID: 35371160 PMCID: PMC8969581 DOI: 10.3389/fpls.2022.805738
Source DB: PubMed Journal: Front Plant Sci ISSN: 1664-462X Impact factor: 6.627
Figure 1. Overview of methods monitoring phenology.
Figure 2. Study selection process.
Figure 3. Number of studies per year of publication. In 2021, we only reviewed publications up to September 2021.
Figure 4. (A) Extent of the investigated vegetation types across primary studies. Studies that used herbarium material are not included. (B) Overview of main phenological stages and the number of studies that investigated them. Some studies investigated several phenological stages.
Figure 5. Utilization of different methods for acquiring training data across primary studies.
Overview of studies that used digital repeat photography for phenological studies with DL methodology.

| Study | Vegetation type | Phenological stage | Camera position | Scale | Annotation | Training images |
|---|---|---|---|---|---|---|
| Correia et al. | Forest | Bud | Under canopy | Individual | Automated | 47,607 |
| Kim et al. | Forest | Flower | Under canopy | Individual | Automated | 20,000 |
| Cao et al. | Forest | Leaf | Above canopy | Regional | Automated | 14,453 |
| Milicevic et al. | Plantation | Flower | Under canopy | Individual | Automated | 7,000 |
| Ganesh et al. | Plantation | Fruit | Under canopy | Individual | Manual | – |
| Wang et al. | Plantation | Flower | Under canopy | Individual | Automated | – |
| Wang et al. | Plantation | Flower | Under canopy | Individual | Automated | 1,126 |
| Pahalawatta et al. | Plantation | Flower | Off-site | Individual | Manual | 245 |
| Velumani et al. | Cropland | Wheat spike | Above crop | Individual | Automated | 40,500 |
| Yalcin | Cropland | 9 stages | Above crop | Individual | Automated | 2,400 |
| Han et al. | Cropland | 10 stages | Above crop | Individual | Manual | 610 |
| Nogueira et al. | Grass-/shrubland | SOS | Above canopy | Regional | Automated | 432 |
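Automated annotation of digital repeat photography often relies on greenness indices such as the green chromatic coordinate (GCC), which the review also lists as a comparison method (Han et al.). A minimal sketch, not taken from any primary study, of computing a mean GCC from an RGB frame:

```python
# Green chromatic coordinate (GCC): a standard greenness index for
# repeat-photography (PhenoCam-style) time series, GCC = G / (R + G + B).
import numpy as np

def gcc(image: np.ndarray) -> float:
    """Mean GCC over an RGB image of shape (H, W, 3)."""
    r = image[..., 0].astype(float)
    g = image[..., 1].astype(float)
    b = image[..., 2].astype(float)
    total = r + g + b
    # Guard against division by zero on pure-black pixels.
    safe = np.where(total > 0, total, 1.0)
    chromatic = np.where(total > 0, g / safe, 0.0)
    return float(chromatic.mean())

# A uniformly green image yields GCC = 1.0.
green = np.zeros((4, 4, 3), dtype=np.uint8)
green[..., 1] = 255
print(round(gcc(green), 2))  # 1.0
```

In practice, a GCC time series computed over a fixed region of interest is smoothed before phenological transition dates are extracted from it.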
Figure 6. Examples of image-capturing methods. (A) A wildlife camera was used to capture images for budburst classification in coniferous forests (Correia et al., 2020). (B) Images were taken manually against a plain background (Pahalawatta et al., 2020). (C) Images of apple flowers were collected by a mobile platform (Wang et al., 2020). (D) Digitized herbarium specimens (Lorieul et al., 2019). (E) Close-up shots of certain areas of agricultural fields were taken automatically (Yalcin, 2017). (F) Aerial images were taken by high-resolution UAV imagery (Pearse et al., 2021). (G) Sentinel-2 and WorldView-2 satellite images (Wagner, 2021). (H) Camera installed on an 18 m tower, with digital timestamps (Nogueira et al., 2019).
Overview of studies that used herbarium material for phenological studies with DL methodology.

| Study | Phenological stage | Scale | Images | Species |
|---|---|---|---|---|
| Lorieul et al. | Bud, flower, fruit, sporangia, cones | Individual | 163,233 | 7,782 |
| Davis et al. | Bud, flower, fruit | Individual | >3,000 | 6 |
| Goëau et al. | Bud, flower, fruit | Individual | 21 | 1 |
Overview of studies that used UAV imagery for phenological studies with DL methodology.

| Study | Vegetation type | Phenological stage | Scale | Sensor |
|---|---|---|---|---|
| Pearse et al. | Forest | Flower | Individual | RGB |
| Nogueira et al. | Grass-/shrubland | SOS | Regional | RGB |
| Yang et al. | Cropland | 6 growth stages | Regional | RGB, multispectral |
| Yang et al. | Cropland | 8 growth stages + harvest time | Regional | RGB, multispectral |
Overview of studies that used satellite imagery for phenological studies with DL methodology.

| Study | Vegetation type | Phenological stage | Scale | Satellite/sensor |
|---|---|---|---|---|
| Tian et al. | Mangrove | SOS, senescence | Regional | Landsat-5, 8 |
| Cai et al. | Cropland | SOS | Regional | Landsat-5, 7, 8 |
| Li et al. | Cropland | SOS, EOS | Regional | Sentinel-2, Landsat-8 |
| Xin et al. | Forest, Grassland | SOS, EOS | Regional | MODIS |
| Kim et al. | Forest | SOS | Regional | Sentinel-2 |
| Wagner | Forest | Flower | Regional | Sentinel-2 |
Figure 7. Overview of all DL methods used for different types of (A) land use, (B) image origin, and (C) phenology expression under study.
Overview of deep learning methods in classification tasks.

| Study | Classification task | DL method | Comparison method | Performance (%) |
|---|---|---|---|---|
| Han et al. | 10 stages | | GCC, color features + SVM | |
| Yalcin | 9 stages | AlexNet | Texture features + Naive Bayes | |
| Yang et al. | 8 stages + harvest time | VGG16, InceptionV3, ResNet50V2, InceptionResNetV2 | Vegetation indices | |
| Lorieul et al. | Fertile material | ResNet50 | – | 96.3 |
| Lorieul et al. | Flower | ResNet50 | – | 84.3 |
| Lorieul et al. | Fruit | ResNet50 | – | 80.5 |
| Lorieul et al. | 9 stages | ResNet50 | – | 43.4 |
| Wang et al. | 8 flower stages + distribution | YOLOv5 | – | |
| Kim et al. | Flower | VGG16, ResNet50, ResNet101, MobileNet | | |
| Milicevic et al. | Open/closed flower buds | VGG19, InceptionResNetV2, Xception, ResNet50 | | |
| Velumani et al. | Wheat spikes | ResNet50 | – | 98.5 |
| Wagner | Species presence | VGG16 | – | 99.6 |
| Pearse et al. | Species presence | | Texture features + XGBoost | |
The bold values indicate the best performing method and their respective performance.
Overview of deep learning methods in segmentation tasks.
| Study | Segmentation task | DL method | Performance (%) |
|---|---|---|---|
| Wang et al. | Flower segmentation | Custom FCN | 84.4 (F) |
| Davis et al. | Counting buds, flowers, fruits | Mask R-CNN | 92.0 (A) |
| Goëau et al. | Counting buds, flowers, fruits | Mask R-CNN | 77.9 (A) |
| Ganesh et al. | Fruit segmentation | Mask R-CNN | 88.7 (F) |
| Pahalawatta et al. | Open/closed flower | Mask R-CNN | 84.3 (A) |
| Kim et al. | Land use | U-Net | 75.0 (A) |
| Li et al. | Species presence | Temporal group attention network | 99.9 (A) |
| Tian et al. | Species presence | SAE | 96.1 (A) |
| Cai et al. | Species presence | DNN | 95.0 (A) |
| Nogueira et al. | Species presence | Custom CNN | 99.8 (A) |
Primary studies report performance as accuracy (A) and F-score (F).
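The accuracy (A) and F-score (F) values reported above can be computed pixel-wise from binary segmentation masks. A minimal sketch (the exact evaluation protocol varies by study and is an assumption here):

```python
# Pixel-wise accuracy and F-score for binary segmentation masks.
import numpy as np

def accuracy(pred: np.ndarray, truth: np.ndarray) -> float:
    """Fraction of pixels where prediction and ground truth agree."""
    return float((pred == truth).mean())

def f_score(pred: np.ndarray, truth: np.ndarray) -> float:
    """Harmonic mean of pixel-wise precision and recall."""
    tp = float(np.logical_and(pred, truth).sum())
    fp = float(np.logical_and(pred, ~truth).sum())
    fn = float(np.logical_and(~pred, truth).sum())
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

pred = np.array([[1, 1, 0, 0]], dtype=bool)
truth = np.array([[1, 0, 1, 0]], dtype=bool)
print(accuracy(pred, truth), round(f_score(pred, truth), 2))  # 0.5 0.5
```

For instance-level counting tasks (Mask R-CNN rows), accuracy is computed over detected instances rather than pixels.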
Overview of deep learning methods for regression problems.
| Study | Regression task | DL method | Comparison method | RMSE |
|---|---|---|---|---|
| Xin et al. | SOS, EOS | Four-layer NN | | 22.9, |
| Cao et al. | Leaf phenology | AlexNet, VGG | | |
| Yang et al. | Yield estimation | | Vegetation indices | |
Primary studies report performance as root mean square error (RMSE). The bold values indicate the best performing method and their respective performance.
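The RMSE reported by the regression studies measures, in days, how far predicted transition dates (e.g., SOS) lie from observed ones. A minimal sketch with hypothetical day-of-year values:

```python
# Root mean square error between predicted and observed transition dates.
import numpy as np

def rmse(predicted: np.ndarray, observed: np.ndarray) -> float:
    return float(np.sqrt(np.mean((predicted - observed) ** 2)))

# Hypothetical day-of-year predictions vs. ground observations.
pred = np.array([120.0, 135.0, 150.0])
obs = np.array([118.0, 140.0, 149.0])
print(round(rmse(pred, obs), 2))  # 3.16
```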
Figure 8. Diagram describing articles according to the defined categories.