Literature DB >> 35967193

UAV-based multi-sensor data fusion and machine learning algorithm for yield prediction in wheat.

Shuaipeng Fei1, Muhammad Adeel Hassan2,3, Yonggui Xiao2, Xin Su4, Zhen Chen1, Qian Cheng1, Fuyi Duan1, Riqiang Chen5, Yuntao Ma6.   

Abstract

Early prediction of grain yield helps scientists make better breeding decisions for wheat. Fusion of unmanned aerial vehicle (UAV)-based multi-sensor data with machine learning (ML) methods can improve the prediction accuracy of crop yield. Here, five ML algorithms, Cubist, support vector machine (SVM), deep neural network (DNN), ridge regression (RR) and random forest (RF), were used for multi-sensor data fusion and ensemble learning for grain yield prediction in wheat. A set of thirty wheat cultivars and breeding lines was grown under three irrigation treatments (light, moderate and high) to evaluate the yield prediction capabilities of a low-cost multi-sensor (RGB, multi-spectral and thermal infrared) UAV platform. Multi-sensor data fusion-based yield prediction showed higher accuracy than individual-sensor data in each ML model. The coefficient of determination (R²) values for the Cubist, SVM, DNN and RR models ranged from 0.527 to 0.670. Ensemble learning that integrated the above models further increased accuracy: its predictions reached R² values up to 0.692, higher than any individual ML model across the multi-sensor data. The corresponding root mean square error (RMSE), residual prediction deviation (RPD) and ratio of prediction performance to inter-quartile range (RPIQ) were 0.916 t ha⁻¹, 1.771 and 2.602, respectively. The results show that low-altitude UAV-based multi-sensor data can be used for early grain yield prediction with high accuracy through data fusion and an ensemble learning framework. This high-throughput phenotyping approach is valuable for improving the efficiency of selection in large breeding activities. Supplementary Information: The online version contains supplementary material available at 10.1007/s11119-022-09938-8.
© The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature 2022.


Keywords:  Data fusion; Machine learning; Phenotyping; Unmanned aerial vehicle; Wheat

Year:  2022        PMID: 35967193      PMCID: PMC9362526          DOI: 10.1007/s11119-022-09938-8

Source DB:  PubMed          Journal:  Precis Agric        ISSN: 1385-2256            Impact factor:   5.767


Introduction

Wheat provides approximately 20% of the calorie and protein requirements of 4.5 billion people (Nigro et al., 2014). The crop breeder's major goal is to produce diverse germplasm that is resilient to climate change and has high-yielding attributes (Shafiee et al., 2021). Present methods that enable breeders to estimate grain yield through secondary traits at early plant development stages are destructive, low-throughput, costly and time consuming (Lee et al., 2018; Montesinos-López et al., 2017; Shafiee et al., 2021). Recently, the use of unmanned aerial vehicle (UAV)-based optical sensors has been introduced as a low-cost, efficient, high-throughput phenotyping platform (HTPP) for estimating important crop attributes (Hassan et al., 2018; Yang et al., 2020). This can increase the selection intensity and accuracy needed to obtain high genetic gain for grain yield (Rutkoski et al., 2016). Despite successful phenotyping of several crop attributes through UAV-based information, there are significant limitations in using a single sensor to estimate particular traits. Recently, multi-sensor data fusion of RGB, multi-spectral (MS) and thermal information has been used to improve the prediction accuracy of important crop traits such as chlorophyll content, above-ground biomass, leaf area index (LAI), nitrogen level and yield (Feng et al., 2020a; Liu et al., 2021; Maimaitijiang et al., 2017, 2020). By integrating the complementary spectral, spatial, structural and temperature information from multiple sensors, multi-sensor data fusion can improve estimates of plant traits (Maimaitijiang et al., 2020). UAV-based multi-sensor platforms are able to collect large amounts of high-dimensional data in a short period of time, which poses a challenge to modeling methods (Montesinos-López et al., 2017).
The rapid development of computer science has promoted the advancement of machine learning (ML) algorithms, which have become a hot topic in quantitative remote sensing. ML methods such as ridge regression (RR), Gaussian process (GP), deep neural networks (DNN), random forest (RF) and support vector machine (SVM) are being used to build predictive models for plant traits using multiple types of remote sensing data as input (Fu et al., 2019; Jin et al., 2020; Matese & Di Gennaro, 2021). ML methods have increased the prediction accuracy and robustness of models built on UAV-based sensor data sets. However, the optimal ML model for estimating crop traits varies both spatially and temporally. For example, although RF was shown to be the best model for estimating wheat biomass (Han et al., 2019; Wang et al., 2016), it was found to be less accurate for evaluating alfalfa yield using hyperspectral data (Feng et al., 2020b). A similar limitation has also been reported in the use of the extreme learning machine (ELM) for estimating crop traits. Maimaitijiang et al. (2017) reported that ELM was the best model for estimating soybean traits, with significantly improved accuracy over partial least squares regression (PLSR) and SVM, yet in a recent study ELM accuracy was lower than PLSR and RF when estimating sorghum chlorophyll (Bhadra et al., 2020). This suggests that well-trained models can perform poorly when estimating plant parameters in other species and under different growth environments, most likely because all of these regression techniques rely on an individual prediction model and are susceptible to overfitting when trained on limited data (Pal, 2007). Ensemble learning models provide better predictive performance than individual models. Stacking regression is a powerful ensemble learning technique that employs diverse learners and exploits their differences to enhance the final accuracy (Feng et al., 2020b).
Diversity among base learners ensures that they provide complementary information, which is key to the final model achieving correct results. The stacking method has been used increasingly in the field of precision agriculture, achieving higher accuracy than the best-performing individual model for estimating winter wheat yield (Fei et al., 2021), tobacco photosynthetic capacity (Fu et al., 2019) and alfalfa yield (Feng et al., 2020b). To date, no studies have reported predicting winter wheat yield with multiple sensors combined with the stacking ensemble learning method. Taking all this into consideration, the main objectives of this study were (1) to evaluate UAV-based multi-sensor data fusion for wheat yield prediction at the grain filling stage and (2) to develop an ensemble learning framework to improve the yield prediction accuracy of ML models.

Materials and methods

Experiment location and design

The experiment was conducted at a research site of the Chinese Academy of Agricultural Sciences (113° 45′ 40′′ E, 35° 8′ 11′′ N) in Xinxiang, Henan province, China. Thirty cultivars released over the last few decades in China's Yellow and Huai Valleys Winter Wheat Zone (YHVWWZ) were used in this study. Cultivars were grown in plots of 11.2 m² (8 m long and 1.4 m wide) (Fig. 1). Each plot represented one cultivar with six rows spaced at 0.20 m. A randomized block design was implemented with three irrigation treatments (light, moderate and high irrigation), with two replications per treatment, giving a total of 180 plots. Irrigation was applied at different growth stages through a large movable sprinkler irrigation device. Detailed information on irrigation and environmental conditions (sunshine duration, precipitation and maximum temperature) for the 2019–2020 winter wheat growing season at the experimental location is given in Fig. 2. Field fertilization and pest and disease management were all maintained at optimal levels according to local agricultural standards.
Fig. 1

Experimental design

Fig. 2

The profile of meteorological variables during the wheat growing season in 2019–2020 and the volume of sprinkler irrigation. a Sunshine duration, b max temperature, c precipitation, and d irrigation volume and time point for the three irrigation treatments. Meteorological data was gathered from local weather stations


Grain yield measurement

All the cultivated plots were harvested using a plot combine harvester in early June. The wheat grains from each plot were collected in plastic mesh bags for drying and weighed at a moisture content of approximately 12.5%. The t-test results revealed that irrigation had a significant effect on grain yield (P < 0.001) (Fig. 3).
Fig. 3

Violin diagram and t-test of measured grain yield under three irrigation treatments. ***Indicates significant at the 0.001 level


UAV-based image acquisition with multi-sensor

During the trials, a DJI M210 (SZ DJI Technology Co., Shenzhen, China) equipped with a Red-Edge MX MS camera (MicaSense Inc., Seattle, USA) and a Zenmuse XT2 camera (SZ DJI Technology Co., Shenzhen, China) with RGB and thermal infrared (TIR) lenses was used for multi-sensor data acquisition (Fig. 4). Detailed information on the sensors is given in Table 1. The aerial surveys were carried out at the grain filling stage, given the proven high accuracy of yield predictions during this period (Guan et al., 2017; Hassan et al., 2019; Qader et al., 2018).
Fig. 4

UAV systems and integrated sensors. a DJI M210 platform, b cameras integrated in the UAV platform, c and d are the wheat growth status during multi-sensor data acquisition

Table 1

Detailed parameters for the sensors installed on the UAV

Camera name | Sensor type | Band | Wavelength | Image resolution
Red-Edge MX | Multi-spectral | Blue | 475 nm | 1280 × 960
Red-Edge MX | Multi-spectral | Green | 560 nm | 1280 × 960
Red-Edge MX | Multi-spectral | Red | 668 nm | 1280 × 960
Red-Edge MX | Multi-spectral | Red-edge | 717 nm | 1280 × 960
Red-Edge MX | Multi-spectral | Near infrared | 842 nm | 1280 × 960
Zenmuse XT2 | Thermal | Thermal infrared | 7.5–13.5 μm | 640 × 512
Zenmuse XT2 | RGB | R, G, B | / | 4000 × 3000
The Red-Edge MX has five spectral lenses: blue, green, red, red-edge and near-infrared. The sunlight sensor of the Red-Edge MX camera automatically compensates for ambient light to reduce inaccuracies in the MS images (Hassan et al., 2019). Each monochrome sensor captured images at the same resolution (1280 × 960), with a 10 nm bandwidth (full width at half maximum) for the red and red-edge bands, 20 nm for the blue and green bands and 40 nm for the near-infrared band. Radiometric calibration was performed before and after each flight using a calibration board in order to translate the DN values of the MS data into reflectance. The Zenmuse XT2 has two lenses: a TIR lens (640 × 512 pixels) and an RGB lens (4000 × 3000 pixels). Temperature observations in the 7.5–13.5 μm spectral region are recorded with a 5 °C thermal sensitivity using the TIR lens. During the flights, a portable thermometer was used to measure the surface temperature of boards for calibration of the thermal images. The DJI ground station software was utilized as an automated flight control system, allowing users to create custom missions and establish their own flight routes. Flight missions were carried out from 11:00 to 13:00 under clear skies. All flights were flown at 30 m height to obtain high-quality photos, with 85% front and 80% side image overlap for all cameras. Each sensor featured a built-in GNSS module that recorded the location of the images. Coordinates of ground control points (GCPs) were recorded with millimeter precision using a differential global navigation satellite system.
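As an illustration of the DN-to-reflectance step, the following is a minimal sketch of a zero-intercept empirical-line calibration; the panel reflectance (0.5) and DN values are hypothetical, and in practice Pix4D derives this correction from the calibration-board images:

```python
import numpy as np

def empirical_line_gain(panel_dn, panel_reflectance):
    """Per-band gain estimated from a calibration panel of known reflectance.
    Assumes a zero-intercept linear sensor response (a simplification)."""
    return panel_reflectance / panel_dn

def dn_to_reflectance(dn_band, gain):
    """Convert raw DN values to reflectance, clipped to the physical range [0, 1]."""
    return np.clip(np.asarray(dn_band, dtype=float) * gain, 0.0, 1.0)

# Hypothetical values: panel reflectance 0.5, observed panel DN 20000
gain = empirical_line_gain(20000.0, 0.5)
band = np.array([[10000.0, 20000.0], [30000.0, 45000.0]])
refl = dn_to_reflectance(band, gain)
```

In a real workflow the gain is estimated per band from the pre- and post-flight panel images, then applied to every pixel of that band's ortho-mosaic.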

Image processing

The post-imaging process for data extraction is shown in Fig. 5. The images from the three sensors were pre-processed in Pix4Dmapper (Pix4D, Lausanne, Switzerland) to generate ortho-mosaics and a digital surface model (DSM) and to implement radiometric calibration. The main steps of this procedure included geolocating the images, importing GCPs, aligning the images, building the dense point cloud, DSM and ortho-mosaic, and calibrating the radiometric information (Han et al., 2019). Dense point clouds were generated using the structure-from-motion (SfM) method in Pix4Dmapper along with the photogrammetric workflow. The ortho-mosaic image from each sensor was segmented into 180 polygon shapes with IDs identifying the various plots, so that the relevant information could be retrieved for each plot in the field. ArcMap 10.5 (Environmental Systems Research Institute, Inc., Redlands, USA) was used to create the polygon shapes. To minimize marginal effects, the edge of each plot was omitted while creating the shapefile. The corresponding images and a polygonal shapefile layer of plot borders with individual IDs were then loaded into ENVI 5.3 (Exelis Visual Information Solutions, Inc., Boulder, USA) for feature extraction. The average of all extracted pixel values in each plot was used as the corresponding feature.
Fig. 5

A workflow diagram of data acquisition, data processing, and feature extraction. MS multi-spectral images, TIR thermal infrared images, DSM digital surface model

For the MS images, radiometric calibration was used to enhance the radiometric quality of the data and to convert the DN values to reflectance by means of radiometric calibration images with known reflectance values. For thermal image calibration, the DN values of the blackboard were extracted using ArcMap 10.5, and a linear relationship (T = 0.2813 × DN − 295.98) was established between the actual temperature and the DN of the blackboard. The temperature distribution for the whole experiment was then calculated from this equation with the raster calculator in ArcMap 10.5. Figure 6a shows the scatter plot of the measured temperature of the blackboard vs. the temperature derived from its thermal image. The results of a t-test showed a significant difference in canopy temperature between the irrigation treatments (Fig. 6b).
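The thermal calibration step can be sketched by applying the reported linear relationship to a DN array, mirroring the raster-calculator computation (the DN values below are hypothetical):

```python
import numpy as np

# Calibration coefficients from the blackboard regression: T = 0.2813*DN - 295.98
A, B = 0.2813, -295.98

def dn_to_celsius(dn):
    """Linear conversion of thermal DN values to temperature in degrees C."""
    return A * np.asarray(dn, dtype=float) + B

dn = np.array([[1130, 1150], [1170, 1190]])   # hypothetical thermal DN values
temperature = dn_to_celsius(dn)
```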
Fig. 6

a Scatter plot of estimated and measured temperature of calibration board, b violin diagram and t-test of estimated canopy temperature, c scatter plot of estimated and measured crop height, and d violin diagram and t-test of estimated crop height. ***Indicates significant at the 0.001 level. LI light irrigation, MI moderate irrigation, HI high irrigation


Crop height, texture and vegetation index extraction

The features extracted from each sensor image are shown in Table 2. A crop height model (CHM), which has been extensively used to extract height information for various crops (Bendig et al., 2015), was used to calculate crop height. UAV-based RGB images were acquired prior to plant emergence on October 24, 2019 in order to generate a bare-soil digital elevation model (DEM) from photogrammetric 3D point clouds. A distinct 3D point cloud model depicting a digital surface model (DSM) of all objects on the ground was created using RGB images acquired on April 20, 2020. The CHM was obtained by subtracting the DEM from the DSM pixel by pixel. To assess the accuracy of the CHM, ten plants from each plot were measured for height and averaged to represent the plot. A high correlation (R² = 0.84) was observed between UAV- and ground-based crop height (Fig. 6c). The distribution of UAV-based crop height reflected the different irrigation treatments in the wheat field, with significant differences (P < 0.001) found under the different treatments (Fig. 6d).
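The pixel-by-pixel CHM computation can be sketched as follows, assuming the DSM and DEM are co-registered arrays (the elevation values below are hypothetical):

```python
import numpy as np

def crop_height_model(dsm, dem):
    """Per-pixel crop height: CHM = DSM - DEM. Negative values caused by
    interpolation noise over bare soil are clipped to zero."""
    chm = np.asarray(dsm, dtype=float) - np.asarray(dem, dtype=float)
    return np.maximum(chm, 0.0)

# Hypothetical co-registered elevation rasters (m above sea level)
dsm = np.array([[101.2, 101.5], [100.9, 101.0]])
dem = np.array([[100.4, 100.4], [100.5, 101.1]])
chm = crop_height_model(dsm, dem)
plot_height = chm.mean()  # plot-level feature: mean of the pixel heights
```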
Table 2

Definitions of the features derived from various sensors

Sensor | Feature | Formulation | References
RGB | Color intensity | INT = (R + G + B)/3 | Ahmad and Reid (1996)
RGB | Kawashima index | IKAW = (R − B)/(R + B) | Kawashima and Nakatani (1998)
RGB | Principal component analysis index | IPCA = 0.994|R − B| + 0.961|G − B| + 0.914|G − R| | Saberioon et al. (2014)
RGB | Excess red index | ExR = 1.4R − G | Meyer and Neto (2008)
RGB | Excess green index | ExG = 2G − R − B | Woebbecke et al. (1995)
RGB | Excess green minus excess red index | ExGR = 2G − R − B − (1.4R − G) | Meyer and Neto (2008)
RGB | Modified Green Red Vegetation Index | MGRVI = (G² − R²)/(G² + R²) | Bendig et al. (2015)
RGB | Red Green Blue Vegetation Index | RGBVI = (G² − B·R)/(G² + B·R) | Bendig et al. (2015)
RGB | Crop height | DSM − DEM | /
RGB | Gray-level co-occurrence matrix | ME, VA, HO, CO, DI, EN, SE, COR | Haralick and Shanmugam (1973)
MS | Normalized Difference Vegetation Index | NDVI = (NIR − R)/(NIR + R) | Rouse et al. (1974)
MS | Green-NDVI | GNDVI = (NIR − G)/(NIR + G) | Gitelson et al. (1996)
MS | Ratio Vegetation Index | RVI = NIR/R | Tucker (1979)
MS | Normalized difference red-edge index | NDREI = (NIR − RE)/(NIR + RE) | Barnes et al. (2000)
MS | Enhanced Vegetation Index | EVI = 2.5(NIR − R)/(1 + NIR − 2.4R) | Huete et al. (2002)
MS | Optimized Soil-Adjusted Vegetation Index | OSAVI = (NIR − R)/(NIR − R + 0.16) | Rondeaux et al. (1996)
MS | Modified chlorophyll absorption in reflectance index | MCARI = [(RE − R) − 0.2(RE − G)]·(RE/R) | Daughtry et al. (2000)
MS | Transformed chlorophyll absorption in reflectance index | TCARI = 3[(RE − R) − 0.2(RE − G)(RE/R)] | Haboudane et al. (2002)
MS | Nitrogen Reflectance Index | NRI = (G − R)/(G + R) | Schleicher et al. (2001)
MS | Transformational Vegetation Index | TVI = √(NDVI + 0.5) | Broge and Leblanc (2001)
MS | Modified Simple Ratio Index | MSR = (NIR/R − 1)/√(NIR/R + 1) | Chen (1996)
MS | Structure Insensitive Pigment Index | SIPI = (NIR − B)/(NIR + B) | Penuelas et al. (1995)
MS | Plant Senescence Reflectance Index | PSRI = (R − B)/NIR | Merzlyak et al. (1999)
MS | Chlorophyll Index Red-Edge | CIRE = NIR/RE − 1 | Gitelson et al. (2003)
MS | MCARI/OSAVI | MCARI/OSAVI | Daughtry et al. (2000)
MS | TCARI/OSAVI | TCARI/OSAVI | Haboudane et al. (2002)
MS | Gray-level co-occurrence matrix | ME, VA, HO, CO, DI, EN, SE, COR | Haralick and Shanmugam (1973)
TIR | Canopy temperature | / | /
TIR | Gray-level co-occurrence matrix | ME, VA, HO, CO, DI, EN, SE, COR | Haralick and Shanmugam (1973)

Multi-spectral vegetation indices were calculated from the reflectance of each band, and the RGB vegetation indices were calculated from the DN value of each band

MS multi-spectral, TIR thermal infrared, DSM digital surface model, DEM digital elevation model, ME mean, VA variance, HO homogeneity, CO contrast, DI dissimilarity, EN entropy, SE second moment, COR correlation

Texture features were extracted from the MS-based blue, green, red, red-edge and NIR bands, the RGB-based crop height and the thermal-based canopy temperature. The widely utilized gray-level co-occurrence matrix (GLCM) was chosen to investigate yield prediction with texture information. The ENVI 5.3 software was used to calculate the GLCM-based texture features, including mean (ME), variance (VA), homogeneity (HO), contrast (CO), dissimilarity (DI), entropy (EN), second moment (SE) and correlation (COR). Vegetation indices (VIs) are commonly employed to measure crop attributes in agricultural fields. In this study, 24 VIs that have been found to be highly linked with crop traits were used (Table 2). These VIs were calculated from the DN values of each band in the RGB images or the reflectance of each band in the MS images.
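A few of the Table 2 indices can be sketched directly from plot-mean band values (the reflectance and DN values below are hypothetical):

```python
import numpy as np

def ndvi(nir, r):
    """NDVI = (NIR - R)/(NIR + R), from MS reflectance."""
    return (nir - r) / (nir + r)

def gndvi(nir, g):
    """GNDVI = (NIR - G)/(NIR + G)."""
    return (nir - g) / (nir + g)

def osavi(nir, r):
    """OSAVI = (NIR - R)/(NIR - R + 0.16)."""
    return (nir - r) / (nir - r + 0.16)

def exg(r, g, b):
    """Excess green index ExG = 2G - R - B, computed from RGB DN values."""
    return 2 * g - r - b

# Hypothetical plot-mean values: reflectance for the MS bands, DN for RGB
nir, red, green = 0.45, 0.05, 0.10
r_dn, g_dn, b_dn = 90.0, 140.0, 70.0
```

These plot-level index values, together with crop height, canopy temperature and the GLCM texture features, form the feature matrix used as model input.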

Ensemble learning framework

Five widely used machine learning (ML) methods including Cubist (Quinlan, 1992), support vector machine (SVM) (Cortes & Vapnik, 1995), deep neural network (DNN) (Qiu et al., 2014), ridge regression (RR) (Hoerl & Kennard, 1970) and random forest (RF) (Breiman, 2001), were utilized as base learners in this study to explore yield prediction accuracy of ensemble learning and multi-sensor data fusion. Appropriate hyper-parameters play a crucial role in the successful application of ML algorithms; the candidate hyper-parameters of these five algorithms are shown in Table 3.
Table 3

Hyperparameters of machine learning methods

Machine learning method | Hyperparameters
Cubist | Committees: 1 to 25; neighbors: 1 to 9
SVM | Kernel function: radial basis function; sigma: 0.01 to 0.02 in increments of 0.0001; C: 0.5 to 0.7 in increments of 0.01
DNN | Units: 10 to 100 in increments of 10; epochs: 2 to 120 in increments of 2; hidden layers: 1, 2, 3 or 4; regularization method: dropout; activation function: rectified linear unit (ReLU)
RR | K: 0 to 0.02 in increments of 0.001
RF | ntree: 1000 to 10,000 in increments of 500; mtry: 1 to 100 in increments of 2

SVM support vector machine, DNN deep neural network, RR ridge regression, RF random forest

Stacking regression is an ensemble learning model proposed by Wolpert (1992) to blend predictors and improve prediction accuracy (Fei et al., 2021). To validate the stacking ensemble learning (Fig. 7a), all of the data were initially separated into training and test sets at a 4:1 ratio. Each of the five ML methods underwent a five-fold outer cross-validation (CV) on the training set: the training data were randomly and uniformly divided into five equal parts, of which four were used for outer CV training and one for outer CV validation, and the process was cycled five times so that all of the data served as both training and validation samples. At the same time, the hyper-parameters of each ML algorithm were screened. The outer CV training data were randomly divided into ten equal parts; nine of the ten were used for inner CV training and one for inner CV validation, and the process was repeated ten times (Fig. 7b). All hyper-parameter combinations of each ML method were submitted to a full ten-fold inner CV, and the combination with the best average accuracy was applied to the outer CV training data. The five ML models trained in the outer CV were tested for accuracy on the initial test set, and an out-of-sample prediction matrix was constructed on the outer CV validation sets. After completing the five-fold CV, five sets of prediction results for the initial test set had been generated for each model; these were averaged to form the test-set prediction matrix, which was later used as the secondary learner's test set. The RR model was employed as the secondary learner to combine the prediction capability of the multiple base learners. In level 2, five-fold CV was also applied to the out-of-sample prediction matrix to train the RR (StRR) model.
After running the StRR model on the test-set prediction matrix, five sets of predicted values were generated, and the final predicted values were obtained by averaging them. The division of the original data into training and test sets was carried out 80 times, and the same division method was used for the different data sources and data fusions. In addition, the same five-fold CV splits were used for the different ML models within each partitioning of the training and test sets, so that the prediction accuracy of the different methods could be compared fairly.
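The stacking procedure above can be sketched in a simplified, dependency-free form. As an assumption for illustration, the five base learners (Cubist, SVM, DNN, RR, RF) are replaced by ridge regressors with different regularization strengths, and the inner-CV hyper-parameter screening is omitted; the out-of-fold construction, the fold-averaged test predictions and the ridge secondary learner mirror the framework described:

```python
import numpy as np

def ridge_fit(X, y, alpha):
    """Closed-form ridge regression; the bias column is not penalized."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    P = np.eye(Xb.shape[1])
    P[-1, -1] = 0.0  # leave the intercept unregularized
    return np.linalg.solve(Xb.T @ Xb + alpha * P, Xb.T @ y)

def ridge_predict(X, w):
    return np.hstack([X, np.ones((len(X), 1))]) @ w

def stacking_predict(X_tr, y_tr, X_te, alphas=(0.1, 1.0, 10.0), k=5, seed=0):
    """Minimal stacking: each base learner's k-fold out-of-fold (OOF)
    predictions become meta-features for a ridge secondary learner; its
    test-set predictions are averaged over the k folds."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X_tr))
    folds = np.array_split(idx, k)
    oof = np.zeros((len(X_tr), len(alphas)))        # level-1 training matrix
    test_meta = np.zeros((len(X_te), len(alphas)))  # level-1 test matrix
    for j, a in enumerate(alphas):                  # one base learner per alpha
        for f in folds:
            tr = np.setdiff1d(idx, f)               # outer CV training part
            w = ridge_fit(X_tr[tr], y_tr[tr], a)
            oof[f, j] = ridge_predict(X_tr[f], w)   # OOF validation predictions
            test_meta[:, j] += ridge_predict(X_te, w) / k  # fold-averaged
    w_meta = ridge_fit(oof, y_tr, 1.0)              # secondary (RR) learner
    return ridge_predict(test_meta, w_meta)
```

A usage sketch on synthetic data: generate a linear response with noise, hold out one fifth of the samples, and call `stacking_predict(X_train, y_train, X_test)` to obtain the blended predictions.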
Fig. 7

A workflow of a multi-sensor data fusion and ensemble learning, and b outer and inner cross-validation. MS multi-spectral features, TIR thermal infrared features, SVM support vector machine, DNN deep neural network, RR ridge regression, RF random forest, CV cross-validation, P and p model predictions at different modeling stages


Accuracy evaluation parameters

For accuracy evaluation, the coefficient of determination (R2), root mean square error (RMSE), residual prediction deviation (RPD) and ratio of prediction performance to interquartile range (RPIQ) were used to quantitatively assess how well the ML models predicted grain yield. The lower the RMSE and the higher the R2, RPD and RPIQ, the better the model's predictive power. Equations 1–4 were used to compute R2, RMSE, RPD and RPIQ, respectively:

$$R^2 = 1 - \frac{\sum_{i=1}^{n}(y_i - \hat{y}_i)^2}{\sum_{i=1}^{n}(y_i - \bar{y})^2} \quad (1)$$

$$\mathrm{RMSE} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}(y_i - \hat{y}_i)^2} \quad (2)$$

$$\mathrm{RPD} = \frac{SD}{\mathrm{RMSE}} \quad (3)$$

$$\mathrm{RPIQ} = \frac{IQ}{\mathrm{RMSE}} \quad (4)$$

where $y_i$ and $\hat{y}_i$ are the measured and predicted grain yield, respectively, $\bar{y}$ is the mean of the measured grain yield and n is the total number of test samples. SD is the standard deviation of the measured grain yield and IQ is the interquartile range of the measured grain yield. To limit the effect of chance, the original data were divided into training and test sets 80 times. Each division produced five tests during the five-fold CV procedure, for a total of 400 tests after 80 divisions. The average of the accuracy parameters across these tests was used as the final prediction accuracy metric.
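A minimal NumPy implementation of the four metrics, assuming arrays of measured and predicted yields:

```python
import numpy as np

def evaluate(y_true, y_pred):
    """Return R2, RMSE, RPD and RPIQ as defined in Eqs. 1-4."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
    rpd = y_true.std(ddof=1) / rmse        # SD of measured yield / RMSE
    q1, q3 = np.percentile(y_true, [25, 75])
    rpiq = (q3 - q1) / rmse                # interquartile range / RMSE
    return r2, rmse, rpd, rpiq
```

Because RPD and RPIQ normalize the error by the spread of the measured yields, they allow comparison across data sets with different yield variability.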

Collinearity analysis

The variance inflation factor (VIF) is a widely used quantity for examining individual predictors for potentially strong contributions to multicollinearity (Marcoulides & Raykov, 2019). In this study, VIF was used to measure the strength of collinearity among the output predictions of the individual sensors and among those of the individual ML models. A multiple linear regression model assumes that a dependent variable (Draper, 1998), denoted y (grain yield in this study), is related to a given set of independent variables $x_1, \ldots, x_k$ as follows:

$$y = l + a_1 x_1 + a_2 x_2 + \cdots + a_k x_k$$

where $a_1, a_2, \ldots, a_k$ are regression coefficients and l is the intercept. When some of the predictors are involved in considerable linear relationships among themselves, the standard errors of one or more partial regression coefficients can be unduly inflated. This tends to make substantively important regressors appear to lack unique significance in the context of the other independent variables. The VIF is calculated as:

$$\mathrm{VIF}_i = \frac{1}{1 - R_i^2}$$

where $R_i^2$ denotes the R2 obtained when the ith explanatory variable is regressed on the remaining independent variables. In general, a multicollinearity problem is indicated when the VIF value is greater than 10 (Marcoulides & Raykov, 2019).
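The VIF computation above can be sketched by regressing each column of a prediction matrix on the remaining columns; the data here are synthetic, not the study's predictions:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def vif(X):
    """VIF_i = 1 / (1 - R_i^2), with column i regressed on the rest."""
    X = np.asarray(X, dtype=float)
    out = []
    for i in range(X.shape[1]):
        others = np.delete(X, i, axis=1)
        r2 = LinearRegression().fit(others, X[:, i]).score(others, X[:, i])
        out.append(1.0 / (1.0 - r2))
    return np.array(out)
```

Applied to the per-sensor or per-model prediction vectors, values near 1 indicate complementary information while values above 10 flag redundant predictors.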

Results

Performance of models for grain yield prediction

Wheat yield prediction was performed using the Cubist, SVM, DNN, RR and RF regression algorithms with features extracted from the three sensors (RGB, MS and TIR) and their combinations (Table 4, Figs. 8 and 9). For individual-sensor yield prediction, RF achieved the highest accuracy on the MS (R2 = 0.509) and TIR (R2 = 0.599) data sets, while DNN performed best on the RGB data (R2 = 0.606). The prediction accuracy of most ML models improved significantly after fusing multi-sensor features. For dual-sensor fusion, RF performed best on both RGB + TIR (R2 = 0.670) and MS + TIR (R2 = 0.629), and DNN was the best-performing model for RGB + MS (R2 = 0.648). For three-sensor fusion, DNN achieved the best predictive accuracy (R2 = 0.670). All ML algorithms except RF (i.e., Cubist, SVM, DNN and RR) achieved higher prediction accuracy with three-sensor fusion than with dual-sensor or individual-sensor data.
Table 4

Test accuracy statistics of different models for grain yield prediction (the accuracy parameters in this table are the average of 400 test results)

Cubist, SVM, DNN, RR and RF are base learners; StRR is the secondary learner.

| Sensor | Metric | Cubist | SVM | DNN | RR | RF | StRR |
|---|---|---|---|---|---|---|---|
| RGB | R2 | 0.514 | 0.597 | 0.606 | 0.556 | 0.605 | 0.624 |
| | RMSE (t ha−1) | 1.149 | 1.046 | 1.045 | 1.103 | 1.034 | 1.016 |
| | RPD | 1.427 | 1.559 | 1.562 | 1.481 | 1.580 | 1.606 |
| | RPIQ | 2.081 | 2.275 | 2.279 | 2.160 | 2.306 | 2.345 |
| MS | R2 | 0.498 | 0.502 | 0.489 | 0.477 | 0.509 | 0.532 |
| | RMSE (t ha−1) | 1.164 | 1.149 | 1.165 | 1.188 | 1.140 | 1.120 |
| | RPD | 1.400 | 1.413 | 1.393 | 1.368 | 1.424 | 1.449 |
| | RPIQ | 2.043 | 2.061 | 2.033 | 1.998 | 2.078 | 2.117 |
| TIR | R2 | 0.529 | 0.553 | 0.578 | 0.563 | 0.599 | 0.617 |
| | RMSE (t ha−1) | 1.133 | 1.102 | 1.079 | 1.146 | 1.038 | 1.026 |
| | RPD | 1.449 | 1.486 | 1.517 | 1.440 | 1.579 | 1.594 |
| | RPIQ | 2.112 | 2.168 | 2.213 | 2.100 | 2.303 | 2.325 |
| RGB + MS | R2 | 0.541 | 0.638 | 0.648 | 0.605 | 0.622 | 0.662 |
| | RMSE (t ha−1) | 1.114 | 0.982 | 0.990 | 1.043 | 1.009 | 0.960 |
| | RPD | 1.464 | 1.652 | 1.637 | 1.556 | 1.608 | 1.690 |
| | RPIQ | 2.145 | 2.422 | 2.401 | 2.284 | 2.360 | 2.479 |
| RGB + TIR | R2 | 0.527 | 0.615 | 0.631 | 0.603 | 0.670 | 0.671 |
| | RMSE (t ha−1) | 1.143 | 1.028 | 1.015 | 1.045 | 0.955 | 0.951 |
| | RPD | 1.435 | 1.587 | 1.608 | 1.563 | 1.711 | 1.718 |
| | RPIQ | 2.093 | 2.315 | 2.346 | 2.280 | 2.498 | 2.508 |
| MS + TIR | R2 | 0.543 | 0.591 | 0.596 | 0.592 | 0.629 | 0.640 |
| | RMSE (t ha−1) | 1.116 | 1.048 | 1.052 | 1.067 | 0.998 | 0.991 |
| | RPD | 1.463 | 1.553 | 1.546 | 1.524 | 1.633 | 1.643 |
| | RPIQ | 2.137 | 2.269 | 2.259 | 2.228 | 2.388 | 2.402 |
| RGB + MS + TIR | R2 | 0.563 | 0.666 | 0.670 | 0.630 | 0.665 | 0.692 |
| | RMSE (t ha−1) | 1.092 | 0.949 | 0.964 | 1.006 | 0.956 | 0.916 |
| | RPD | 1.494 | 1.709 | 1.681 | 1.612 | 1.698 | 1.771 |
| | RPIQ | 2.193 | 2.511 | 2.470 | 2.369 | 2.496 | 2.602 |

MS multi-spectral features, TIR thermal infrared features, SVM support vector machine, DNN deep neural network, RR ridge regression, RF random forest, StRR stacking regression using ridge regression as a secondary learner

Fig. 8

The statistical distribution of the prediction accuracy of individual machine learning and ensemble learning for grain yield prediction using individual sensor data in the modeling test phase. MS multi-spectral features, TIR thermal infrared features, SVM support vector machine, DNN deep neural network, RR ridge regression, RF random forest, StRR stacking regression using ridge regression as a secondary learner

Fig. 9

The statistical distributions of the prediction accuracy of individual machine learning and ensemble learning for grain yield prediction using multi-sensor data fusion in the modeling test phase. MS multi-spectral features, TIR thermal infrared features, SVM support vector machine, DNN deep neural network, RR ridge regression, RF random forest, StRR stacking regression using ridge regression as a secondary learner

To combine the predictive capacity of the individual ML models, RR was utilized as the secondary learner of stacking regression. The results (Table 4, Figs. 8 and 9) revealed that the prediction accuracy of the ensemble models was higher than that of the individual ML models. Compared to the best-performing individual ML model for each individual sensor, ensemble learning improved the R2 value from 0.605 to 0.624 for RGB, from 0.509 to 0.532 for MS and from 0.599 to 0.617 for TIR. In data fusion, the prediction accuracy of ensemble learning was also higher than that of each individual model (Table 4).
For the fusion of all three sensors, ensemble learning showed the highest prediction accuracy (R2 = 0.692, RMSE = 0.916 t ha−1, RPD = 1.771 and RPIQ = 2.602), a marked increase over the best-performing individual model, DNN (R2 = 0.670, RMSE = 0.964 t ha−1, RPD = 1.681 and RPIQ = 2.470). In Figs. 8 and 9, the boxplots of R2 and RMSE over the 400 tests show that parameter accuracy fluctuated over a wider range for the individual ML models than for ensemble learning. Similarly, data fusion reduced the fluctuation of the accuracy parameters for each ML model. Figure 10 shows the distribution of the regression coefficient of each ML model within the secondary learner (RR). A larger regression coefficient indicates a higher weight in the ensemble procedure. The secondary learner (RR) assigned greater regression coefficients to the individual models with high prediction accuracy, suggesting that the performance of the ensemble model was strongly associated with the performance of the individual models.
Fig. 10

The distribution of coefficients within the level-2 models (ridge regression). MS multi-spectral features, TIR thermal infrared features, SVM support vector machine, DNN deep neural network, RR ridge regression, RF random forest, StRR stacking regression using ridge regression as a secondary learner


Effects on accuracy improvement of multi-sensor data fusion and ensemble learning

Figure 11 compares the accuracy improvements achieved by ensemble learning and by data fusion, relative to the best-performing individual ML model. For individual sensors, ensemble learning improved prediction accuracy (R2) by up to 5%, whereas multi-sensor fusion improved it by up to 31%. The collinearity between the predictions from the three sensors was modest (VIF < 5) (Fig. 12a), showing that fusing multi-sensor data can effectively provide supplementary information. In contrast, the VIF of each individual ML model was high (VIF > 10) (Fig. 12b), indicating a high degree of collinearity among the predictions of the individual ML models. The predictions of the individual ML models thus carried more overlapping information than the data from the different sensors, which were able to complement each other.
Fig. 11

Comparison of accuracy improvement (R2) of ensemble learning and data fusion. MS multi-spectral features, TIR thermal infrared features, DNN deep neural network, RF random forest, StRR stacking regression using ridge regression as a secondary learner

Fig. 12

Variance inflation factor (VIF) for the output predictions of a each individual sensor b each machine learning model. MS multi-spectral features, TIR thermal infrared features, SVM support vector machine, DNN deep neural network, RR ridge regression, RF random forest, StRR stacking regression using ridge regression as a secondary learner


Mapping predicted grain yield

On the test sets, 2800 yield prediction values were generated from the 80 random divisions of the original data. The final prediction for each plot was obtained by averaging its multiple predicted values. Heat maps (Fig. 13) of the yield predictions obtained by StRR with full data fusion are shown for the three irrigation treatments. The changes in wheat yield under the different irrigation treatments are clearly visible, with grain yield increasing with the amount of irrigation water. The t-test results (Fig. S1) demonstrated that the predicted yields under the various irrigation treatments were significantly different, consistent with the measured grain yield (Fig. 3). These mapping results indicate that combining ensemble learning with multi-sensor data fusion produces reliable yield predictions.
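The per-plot averaging over repeated random splits can be sketched as follows; the plot count and the model predictions are placeholders, not the study's data:

```python
import numpy as np
from collections import defaultdict

# Each of the repeated divisions leaves ~1/5 of the plots in the test set,
# so every plot accumulates several predictions, which are then averaged
# to produce the final value used for mapping.
rng = np.random.default_rng(0)
n_plots, n_splits = 90, 80              # assumed layout: 30 cultivars x 3 treatments
sums = defaultdict(float)
counts = defaultdict(int)
for _ in range(n_splits):
    test_idx = rng.choice(n_plots, size=n_plots // 5, replace=False)
    preds = rng.normal(7.0, 1.0, size=len(test_idx))  # placeholder yields (t/ha)
    for i, p in zip(test_idx, preds):
        sums[i] += p
        counts[i] += 1
final_map = {i: sums[i] / counts[i] for i in counts}  # plot index -> mean prediction
```

Averaging over many splits smooths out the variance introduced by any single random partition of the plots.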
Fig. 13

Spatial distribution of predicted grain yield (t ha−1) at the plot scale using multi-sensor fusion and ensemble learning. ***Indicates significant at the 0.001 level. MS multi-spectral features, TIR thermal infrared features


Discussion

Comparison of prediction accuracy between individual sensors

MS sensors have been successfully applied for the assessment of biophysical and biochemical variables in crops (Hassan et al., 2018, 2019; Yang et al., 2020). However, the results of this study showed that predictions from MS features alone were not highly accurate (Table 4). This may be because MS VIs readily saturate at high greenness or canopy density in wheat and rice (Fu et al., 2014; Thenkabail et al., 2000). In a previous study, texture information extracted from MS images did not provide more accurate estimates than VIs for crop parameter evaluation (Liu et al., 2019), which is probably why the combination of MS texture and VIs did not deliver the desired yield prediction results here. Higher grain yield prediction accuracy was observed using RGB features than MS features in all cases (Table 4), probably for two reasons: first, the spatial resolution of RGB images is higher than that of MS images; second, MS images contain many mixed pixels of wheat and background (e.g., soil). In addition to the VIs, plant height was extracted from the RGB images, which had a positive effect on yield prediction. Several reports have shown that UAV-based crop height is strongly associated with crop biomass (Han et al., 2019; Li et al., 2020), an important indicator of grain yield. Combining UAV-based crop height with NDVI or the normalized difference yellowness index (NDYI) has effectively improved the yield prediction accuracy of rice (Wan et al., 2020). UAV-based thermal remote sensing has also been shown to be beneficial for assessing plant growth in precision agriculture (Ludovisi et al., 2017; Sheng et al., 2010). For some ML models, such as Cubist and RR, predictions based on TIR features were comparable to those based on RGB features. In general, plant water content affects crop yield, and crop canopy temperature is closely related to water content. This leads to similar changes in crop yield and canopy temperature across irrigation treatments, resulting in high yield prediction accuracy from TIR features; moreover, TIR features are not subject to the saturation that readily occurs with VIs.

Analysis of multi-sensor data fusion

Data fusion from various sensor combinations (MS + TIR, RGB + MS, RGB + TIR and RGB + MS + TIR) was used to predict yield and compared with predictions based on individual-sensor data (Table 4). In general, multi-sensor data fusion yielded higher prediction accuracy than individual-sensor data for most ML models. This is because canopy spectral, structural and temperature information all contribute to grain yield prediction in unique and complementary ways (Maimaitijiang et al., 2020). Among the five individual ML algorithms, RF performed best with MS + TIR data fusion. DNN performed best with RGB data, although RF did not significantly underperform it. RF also performed well with both RGB + TIR and MS + TIR data fusion. These results are consistent with previous studies reporting that RF has higher generalization ability than other algorithms when combining remote sensing data to evaluate plant parameters (Han et al., 2019; Wang et al., 2016). The DNN model performed best with RGB + MS fusion and with the fusion of all three sensors, possibly because deep learning tends to outperform the most popular ML methods on larger, complex, non-linear and redundant datasets (Kang & Kang, 2017). To further understand the contribution of each sensor's information to yield prediction, the relative contribution of each sensor to the full-fusion model was analyzed as the percent change in the accuracy parameters when the corresponding sensor's features were excluded from the modeling procedure (Table S1). In most cases, the accuracy parameters of each ML model changed the most when RGB features were excluded, followed by TIR features; the changes were weak when MS features were excluded, especially for RF and StRR. The desired prediction accuracy can therefore be obtained using only the RGB and TIR sensors, which would help to reduce sensor costs.
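The leave-one-sensor-out contribution analysis can be sketched like this, with synthetic data and an assumed feature-block layout standing in for the real sensor features:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score
from sklearn.ensemble import RandomForestRegressor

# Toy stand-in: 12 features pretending to be RGB, MS and TIR feature blocks
X, y = make_regression(n_samples=150, n_features=12, noise=5.0, random_state=1)
blocks = {"RGB": range(0, 5), "MS": range(5, 9), "TIR": range(9, 12)}  # assumed layout

def cv_r2(features):
    """Mean five-fold cross-validated R2 for one feature subset."""
    model = RandomForestRegressor(n_estimators=100, random_state=1)
    return cross_val_score(model, features, y, cv=5, scoring="r2").mean()

full = cv_r2(X)
contrib = {}
for name, cols in blocks.items():
    # Drop one sensor's feature block and record the percent change in R2
    reduced = cv_r2(np.delete(X, list(cols), axis=1))
    contrib[name] = 100.0 * (reduced - full) / abs(full)
```

A large negative percent change after removing a block indicates that the corresponding sensor carries information the others cannot replace.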

The potential of ensemble learning

Apart from using a variety of sensors to gain a better understanding of plant behavior and response, big data analytics and advances in computing power have opened up new possibilities for extracting more information through innovative methods (Shah et al., 2019). Previous studies have often used a single ML algorithm to estimate crop parameters (Matese & Di Gennaro, 2021; Shafiee et al., 2021), but an individual ML algorithm has limitations in parsing different types of data. In precision agriculture management, slight differences in estimation accuracy may lead to different decisions, so it is worthwhile to explore ways of obtaining higher prediction accuracy. The ensemble learning results in this study are consistent with previous studies in which ensemble models achieved higher prediction accuracy than individual models under various modeling conditions (Feng et al., 2020b; Fu et al., 2019), demonstrating the reliability of the ensemble approach. In most studies estimating plant traits with stacking regression, linear models are used as secondary learners. Other weight assignment methods, such as decision-level fusion, quadratic programming and Bayesian model averaging (BMA), can also serve as secondary learners for estimating crop parameters, and have been shown to improve prediction accuracy in ensemble learning in other fields (Wang et al., 2020, 2021; Yin et al., 2021). Ensemble learning requires fully training each base model to reach its potential, which inevitably adds considerable training time compared to the best-performing individual model. How to balance model complexity and accuracy still needs to be explored in future research.

Limitations and implications

In this study, wheat canopy MS, RGB and TIR images were collected rapidly and non-destructively using a UAV platform, and both ensemble learning and multi-sensor data fusion yielded higher prediction accuracy than the individual models and sensors. The more pronounced accuracy improvement from multi-sensor fusion makes it one of the main research directions for precision agriculture (Fig. 11). Abundant data from widely used sensors such as LiDAR and SAR, or crop parameters simulated by crop models (Araya et al., 2010; de Wit et al., 2019), could be fused with the data obtained here to further enhance the prediction accuracy and stability of crop yield models. Feature extraction was carried out separately for each sensor's images; using raw image information directly as input to advanced deep learning algorithms has the potential to uncover more information and should be explored in future multi-sensor image fusion work (Sagan et al., 2021). Thirty cultivars were used in this study, which enriches the yield data and helps to achieve higher prediction accuracy. However, most of these cultivars come from China's Yellow and Huai Valleys Winter Wheat Zone (YHVWWZ), which may impose some regional limitations. Whether the present method can achieve similar results in other wheat-growing regions requires further research.

Conclusion

The present study explored the potential of fusing data from UAV-based RGB, MS and TIR sensors, and of integrating the modeling capabilities of Cubist, SVM, DNN, RR and RF, for wheat yield prediction. The results showed that both multi-sensor data fusion and ensemble learning improved wheat yield prediction accuracy. The differences in yield under different irrigation treatments were successfully estimated, which helps breeding programs to evaluate the performance of each cultivar under different irrigation treatments. To further assess the stability of the proposed method in plant breeding, it should be tested across more developmental stages and growth environments.

Supplementary Information: Supplementary file 1 (DOCX 138 KB).