Chen Sun, Jing Zhou, Yuchi Ma, Yijia Xu, Bin Pan, Zhou Zhang.
Abstract
Potato is one of the most significant food crops globally due to its essential role in the human diet. The growing demand for potato, coupled with severe environmental losses caused by extensive farming activities, underscores the need for better crop protection and management practices. Precision agriculture is widely recognized as a solution, as it manages spatial and temporal variability to improve agricultural returns and reduce environmental impact. As the initial step in precision agriculture, traditional methods of crop and field characterization require large inputs of labor, time, and cost. Recent developments in remote sensing technologies have facilitated the process of monitoring crops and quantifying field variations, and successful applications have been demonstrated in precision potato farming. Thus, this review reports the current knowledge on the applications of remote sensing technologies in precision potato trait characterization. We review the commonly used imaging sensors and remote sensing platforms, comparing their strengths and limitations, and summarize the main applications of remote sensing technologies in potato. This review can thereby update potato agronomists and farmers on the latest approaches and research outcomes, and provide a selective reference for those intending to apply remote sensing technologies to characterize potato traits for precision agriculture.
Keywords: potato; precision agriculture; remote sensing; satellite imagery; sensors; unmanned aerial system
Year: 2022 · PMID: 35923874 · PMCID: PMC9339983 · DOI: 10.3389/fpls.2022.871859
Source: PubMed · Journal: Front Plant Sci · ISSN: 1664-462X · Impact factor: 6.627
Sensors and their applications in potato trait characterization.
| Sensor type | Manufacturer | Model | Image resolution | Spectral range (nm) | No. of bands | Applications |
|---|---|---|---|---|---|---|
| Visible red–green–blue (RGB) | DJI | Phantom 4 Pro | 5,472 × 3,648 | 400–700 | 3 | Plant height estimation |
| Visible RGB | Sony | NEX-5N | 1,920 × 1,080 | 400–700 | 3 | Disease detection |
| Visible RGB | Canon | A2200 | 4,320 × 3,240 | 400–700 | 3 | Disease detection |
| Visible RGB | Canon | ELPH 115 IS | 4,608 × 3,456 | 400–700 | 3 | Growth status prediction |
| Visible RGB | Panasonic | GX1 | 4,592 × 3,448 | 400–700 | 3 | Disease detection |
| Multispectral | Tetracam | Micro-MCA RGB + 3 | 1,280 × 1,024 | 350–950 | 6 | Yield prediction |
| Multispectral | Tetracam | ADC Lite | 2,048 × 1,536 | 520–920 | 3 | Crop scouting |
| Multispectral | Tetracam | Mini-MCA | 1,280 × 1,024 | 450–1,000 | 6 | Beetle damage assessment |
| Multispectral | Kodak | Megaplus 4.2i (customized) | 2,024 × 2,044 | 400–1,000 | 3 | Yield prediction |
| Multispectral | Canon | S95 (customized) | 3,648 × 2,736 | 400–1,000 | 4 | Chlorophyll content estimation |
| Multispectral | Canon | S110 (customized) | 4,000 × 3,000 | 400–1,000 | 3 | Disease detection |
| Multispectral | Canon | PowerShot ELPH 340 HS (customized) | 4,608 × 3,456 | 400–1,000 | 3 | Growth status estimation |
| Multispectral | Canon | ELPH 110 (customized) | 4,608 × 3,456 | 400–1,000 | 3 | Hail damage assessment |
| Multispectral | Redlake | MS4100 | 1,920 × 1,080 | 400–1,000 | 3 | Nitrogen stress prediction |
| Multispectral | MicaSense | RedEdge | 1,280 × 960 | 400–1,000 | 5 | Structural change |
| Hyperspectral | Headwall | Nano-Hyperspec | 640 × 1 (line scan) | 400–1,000 | 270 | Yield prediction |
| Hyperspectral | Specim | ImSpector V10 2/3″ | Depends on the combined imager | 400–950 | 61 | Chlorophyll content assessment; leaf area index (LAI) estimation; ground cover assessment |
| Hyperspectral | Cubert | UHD 185 | 1,000 × 1,000 | 450–950 | 125 | Chlorophyll content assessment |
| Hyperspectral | Rikola | Fabry–Pérot interferometer | 1,010 × 1,010 | 500–900 | 16 | Disease detection |
| Thermal infrared | Infrared cameras | Microbolometer | 640 × 480 | 7,000–14,000 | 1 | Chlorophyll content assessment |
| Thermal infrared | FLIR | TAU 640 | 640 × 512 | 7,500–13,500 | 1 | Crop scouting |
| Thermal infrared | FLIR | E60 | 320 × 240 | 7,500–13,000 | 1 | Water status |
| Thermal infrared | FLIR | SC2000 | 320 × 240 | 7,500–13,000 | 1 | Water status |
| Thermal infrared | FLIR | SC655 | 640 × 480 | 7,500–13,000 | 1 | Water status |
| Thermal infrared | JeanOptics | IDM200 | 640 × 480 | 7,500–13,000 | 1 | Water status |
| Thermal infrared | Fluke | Ti-32 | 320 × 240 | 7,500–14,000 | 1 | Water status |
| Thermal infrared | Fluke | TiR1 | 160 × 120 | 7,500–14,000 | 1 | Water status |
| Thermal infrared | Telops | HyperCamLW | 320 × 256 | 7,700–11,500 | 1 | Water status |
| Thermal infrared | Palmer Wahl Instruments | HSI3000 | 160 × 120 | 8,000–14,000 | 1 | Water status |
| Light detection and ranging (LiDAR) | RIEGL | VUX-1 | – | – | – | Crop height and biomass estimation |
Satellite-based remote sensing platforms and their applications in potato trait characterization.
| Satellite | Instrument | Spatial resolution (m/pixel) | Temporal resolution (days) | Spectral range (nm) | No. of bands | Applications |
|---|---|---|---|---|---|---|
| Terra | Moderate Resolution Imaging Spectroradiometer (MODIS) | 250–1,000 | 1–2 | 400–14,400 | 36 | Yield prediction |
| Terra | Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) | 15–90 | 4–16 | 500–12,000 | 14 | Herbicide rate assessment |
| PlanetScope | PS2 | 3 | 1 | 455–860 | 4 | Yield prediction |
| Resourcesat-1 | Advanced Wide Field Sensor (AWiFS) | 56 | 5 | 520–1,700 | 3 | Disease assessment |
| Sentinel-2 | Multispectral Instrument (MSI) | 10–60 | 5 | 400–2,190 | 13 | Leaf area index (LAI) and chlorophyll content estimation |
| Sentinel-3 | Ocean and Land Color Instrument (OLCI) | 300 | 27 | 400–1,020 | 21 | Nitrogen stress prediction |
| Landsat-5 | Thematic Mapper (TM) | 30–120 | 16 | 450–12,500 | 7 | Yield prediction |
| Landsat-7 | Enhanced Thematic Mapper Plus (ETM+) | 15–60 | 16 | 450–12,500 | 8 | Yield prediction |
| Landsat-8 | Operational Land Imager (OLI) | 30 | 16 | 433–2,290 | 9 | Yield prediction |
| FORMOSAT-2 | Remote Sensing Instrument | 8 | 1 | 450–900 | 4 | Spectral–temporal response surface (STRS) estimation |
| Proba-1 | Compact High Resolution Imaging Spectrometer (CHRIS) | 18 | 2 | 400–1,050 | 19 | LAI estimation |
| Polar Orbiting Environmental Satellites (POES) | Advanced Very High-Resolution Radiometer (AVHRR) | 1,100 | 7/14 | 580–12,500 | 5 | Yield prediction |
UAV-based remote sensing platforms and their applications in potato trait characterization.
| UAV type | Manufacturer | Model | Onboard sensors | Max. speed (km/h) | Duration (min) | Applications |
|---|---|---|---|---|---|---|
| Quadcopter | DJI | Phantom 4 Pro | Visible RGB | 50–72 | 30 | Plant height estimation |
| Quadcopter | DJI | Inspire 2 | RGB; multispectral | 94 | 25 | Yield prediction |
| Quadcopter | 3D Robotics | Solo | Visible RGB | 80 | 25 | Growth status prediction |
| Quadcopter | DJI | Matrice 100 | Multispectral | 61–79 | 20–40 | Radiation use efficiency assessment |
| Hexacopter | DJI | Matrice 600 Pro | Customizable | 65 | 25 | Yield prediction |
| Hexacopter | Tarot | 680 Pro | Customizable | – | – | Structural change |
| Octocopter | Aerialtronics | Altura AT8 | Customizable | – | – | Chlorophyll content assessment; leaf area index (LAI) estimation; ground cover assessment |
| Octocopter | RIEGL | RiCOPTER | Visible RGB; LiDAR | 30 | 30 | Crop height and biomass estimation |
| Octocopter | HiSystems GmbH | Unspecified | Visible RGB | – | – | Disease detection |
| Fixed-wing | Trimble | UX5 HP | Visible RGB | 80 | 35 | Plant height estimation |
| Fixed-wing | senseFly | eBee micro | Visible RGB; multispectral | 110 | 25–50 | Disease detection |
*Newer versions of some of these models are now available.
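The image resolutions in the sensor table translate into on-ground detail only once combined with the flight altitude of the platforms above. A minimal sketch of the standard photogrammetric ground sampling distance (GSD) relation; the camera parameters below are illustrative values resembling a Phantom 4 Pro, not figures taken from the review:

```python
def ground_sampling_distance(altitude_m: float, sensor_width_mm: float,
                             focal_length_mm: float, image_width_px: int) -> float:
    """Ground sampling distance (m/pixel) for a nadir-pointing frame camera.

    GSD = altitude * pixel_pitch / focal_length, with the pixel pitch derived
    from the physical sensor width and the image width in pixels.
    """
    pixel_pitch_mm = sensor_width_mm / image_width_px
    return altitude_m * pixel_pitch_mm / focal_length_mm

# Illustrative: a 13.2 mm-wide sensor with an 8.8 mm lens and a
# 5,472-pixel-wide image, flown at 100 m, gives roughly 2.7 cm/pixel.
gsd_m = ground_sampling_distance(100.0, 13.2, 8.8, 5472)
```

Halving the altitude halves the GSD, at the cost of more flight lines (and battery swaps) per hectare, which is the usual trade-off when planning UAV surveys with the durations listed above.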
Applications of remote sensing technologies in potato trait characterization.
| Application | Platform | Sensor | Model | Sensor-derived feature |
|---|---|---|---|---|
| Yield prediction | UAV | Hyperspectral camera | OLS; Ridge; PLSR; SVR; RF; AdaBoost | Full spectra |
| Yield prediction | UAV | RGB, multi-, and hyperspectral cameras | RF; PLSR | 13 narrow-band VIs |
| Yield prediction | UAV | Multispectral camera | ANN | NDVI |
| Yield prediction | UAV | Multispectral camera | RF; SVR | 15 wide-band VIs |
| Yield prediction | Satellite | Terra MODIS | LR | VIs |
| Yield prediction | Satellite | Terra MODIS | RF; SVM; LR | NDVI |
| Yield prediction | Satellite | Sentinel-2 | 9 machine learning models | 54 features including band reflectance and VIs |
| Yield prediction | Satellite | Landsat-8; Sentinel-2 | LR | NDVI; SAVI |
| Biomass estimation | UAV | RGB and hyperspectral cameras | RF; PLSR | 13 narrow-band VIs; image-based plant height |
| Biomass estimation | UAV | LiDAR | 3D profile index (3DPI) | LiDAR point cloud |
| Biomass estimation | Handheld | Hyperspectral spectrometer | PLSR; MLR; RF | NDSI; RSI; DSI |
| Biomass estimation | Handheld | Hyperspectral spectrometer | GA; ANFIS | 20 narrow-band VIs |
| Biomass estimation | Handheld | Hyperspectral spectrometer | RF | 12 narrow-band VIs |
| Biomass estimation | Handheld | TIR and RGB cameras | MLR; ANFIS | NRCT and 12 wide-band VIs |
| Biomass estimation | UAV; satellite; handheld | Multispectral camera; Sentinel-2; multispectral spectrometer | Carnegie–Ames–Stanford approach | RVI and raw bands |
| Water status assessment | UAV; handheld | TIR and RGB cameras | LR | CWSI |
| Water status assessment | Handheld | TIR camera | – | CWSI |
| Water status assessment | Handheld | TIR camera | MLR; ANFIS | NRCT |
| Water status assessment | Handheld | TIR camera | LR | CWSI |
| Nitrogen status estimation | UAV | Multispectral camera | LR | NDVI; GNDVI |
| Nitrogen status estimation | MAV | Multispectral camera | LR | GNDVI; GRVI; NDVI; NG |
| Nitrogen status estimation | MAV | Hyperspectral spectrometer | PLSR | NG and 6 narrow-band VIs |
| Nitrogen status estimation | Handheld | Multispectral radiometer | LR | 8 wide-band VIs |
| Nitrogen status estimation | Handheld | Hyperspectral spectrometer | LR | 4 SWIR-based indices |
| Nitrogen status estimation | Handheld | Multispectral camera | LR | NDVI; RVI; RRE; RRE/GC |
| Disease detection (PVY) | Ground vehicle | RGB-depth camera; hyperspectral line-scan camera | FCN | Full spectra |
| Disease detection (PVY) | Handheld | Hyperspectral spectrometer | SVM | 5 EM wavelength segments |
| Disease detection (PVY) | Handheld | Hyperspectral spectrometer | PLS-DA | NDSI for all combinations of wavelengths |
| Disease detection (late blight) | UAV | RGB camera | Color threshold | HSV color space |
| Disease detection (late blight) | UAV | Multispectral camera | – | NDVI |
| Disease detection (late blight) | UAV | Multispectral camera | CNN; SVR; RF | Differences between G and B and between NIR and G |
| Disease detection (late blight) | Satellite | Terra MODIS; AWiFS | Clustering | NDVI and LSWI |
| Disease detection (late blight) | Handheld | Hyperspectral spectrometer | ANOVA; stepwise discriminant analysis | NDVI; SR; SAVI; REI; full spectra |
| Disease detection (early blight) | Ground vehicle | Hyperspectral camera | PLS-DA; SVM | Full spectra; 5 wide-band VIs |
| Disease detection (early blight) | Handheld | Hyperspectral spectrometer | PCA; spectral change analysis; PLSR | Full spectra |
| Plant height estimation | UAV | RGB camera and hyperspectral sensor | SfM; RF and PLSR | RGB images and 13 narrow-band VIs |
| Plant height estimation | UAV | LiDAR scanner | 3DPI | 3D point cloud |
| Plant height estimation | UAV | RGB camera | SfM; "trace all triangles" and "do not track breaklines" methods | RGB images |
| LAI estimation | Handheld | Hyperspectral spectrometer | PLSR | Full spectra; NDVI; REIP |
| LAI estimation | Handheld; satellite | Hyperspectral spectrometer; Landsat TM/ETM+ | Linear, exponential, and logarithmic regression | NDVI; SAVI; WDVI |
| LAI estimation | UAV | Multispectral camera | PROSAIL model inversion | Multi-angular multispectral data |
| LAI estimation | Satellite | Sentinel-2 MSI | LR | WDVI |
| LAI estimation | Satellite | Landsat-8 OLI; Sentinel-2 MSI | LR; SNAP model | SAVI; NDVI; EVI2; SeLI |
| Chlorophyll content estimation | Handheld | RGB camera | ANN; LR | Mean brightness parameters and mean brightness ratio |
| Chlorophyll content estimation | Handheld | Hyperspectral spectrometer | PLSR | Full spectra; normalized spectra; MFs spectra |
| Chlorophyll content estimation | Handheld | Multispectral camera | LR; MLR | Mean and standard deviation of R and G |
| Chlorophyll content estimation | Handheld; satellite | Hyperspectral spectrometer; REIS | LR | TCARI/OSAVI; TCI/OSAVI; CVI |
| Chlorophyll content estimation | UAV; satellite | Hyperspectral spectrometer; FORMOSAT-2 | LR | Spectral–temporal response surfaces |
| Emergence assessment | UAV | RGB camera | Mask R-CNN | RGB images |
| Emergence assessment | UAV | RGB camera | RF | Morphological features |
| Emergence assessment | UAV | Multispectral camera | LR | NDVI |
Abbreviations for Table 3: OLS, ordinary least squares; PLSR, partial least squares regression; SVR, support vector regression; RF, random forest; VI, vegetation index; ANN, artificial neural network; NDVI, normalized difference vegetation index; LR, linear regression; SVM, support vector machine; SAVI, soil adjusted vegetation index; NDSI, normalized difference spectral index; RSI, ratio spectral index; DSI, difference spectral index; MLR, multiple linear regression; GA, genetic algorithm; ANFIS, adaptive neuro-fuzzy inference system; NRCT, normalized relative canopy temperature; RVI, ratio vegetation index; CWSI, crop water stress index; GNDVI, green normalized difference vegetation index; GRVI, green and red ratio vegetation index; NG, normalized green; RRE, ratio red edge vegetation index; GC, canopy cover; FCN, fully convolutional network; EM, electromagnetic spectrum; PLS-DA, partial least-squares discriminant analysis; CNN, convolutional neural network; G, green; B, blue; NIR, near infrared; LSWI, land surface water index; ANOVA, analysis of variance; SR, simple ratio; REI, red edge index; PCA, principal component analysis; SfM, structure from motion; 3DPI, 3D profile index; REIP, red-edge inflection point; WDVI, weighted difference vegetation index; PROSAIL, the coupling of the PROSPECT and SAIL models; SNAP, Sentinel Application Platform; EVI2, enhanced vegetation index 2; SeLI, Sentinel-2 LAI index; REIS, RapidEye earth-imaging system; TCARI, transformed chlorophyll absorption in reflectance index; OSAVI, optimized soil-adjusted vegetation index; TCI, triangular chlorophyll index; and CVI, chlorophyll vegetation index.
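Many of the sensor-derived features in the applications table are simple band or temperature ratios. As an illustrative sketch of the standard formulas (the code is not from the review; the soil-brightness factor of 0.5 is the conventional default), NDVI, GNDVI, SAVI, and CWSI can be computed from calibrated band reflectances and canopy temperatures:

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized difference vegetation index."""
    return (nir - red) / (nir + red)

def gndvi(nir: float, green: float) -> float:
    """Green normalized difference vegetation index."""
    return (nir - green) / (nir + green)

def savi(nir: float, red: float, soil_factor: float = 0.5) -> float:
    """Soil adjusted vegetation index with a soil-brightness correction factor."""
    return (1 + soil_factor) * (nir - red) / (nir + red + soil_factor)

def cwsi(t_canopy: float, t_wet: float, t_dry: float) -> float:
    """Crop water stress index: 0 for a well-watered canopy, 1 for a fully
    stressed one. t_wet and t_dry are the non-stressed and non-transpiring
    reference canopy temperatures under the current conditions."""
    return (t_canopy - t_wet) / (t_dry - t_wet)

# Illustrative reflectances for a dense, healthy canopy: high NIR, low red.
healthy = ndvi(0.45, 0.05)  # close to 1 for vigorous vegetation
```

The wide-band VIs in the table follow this same pattern from broadband multispectral channels, while the narrow-band VIs substitute specific hyperspectral wavelengths into analogous ratios.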