Rui Xu, Changying Li.
Abstract
Manual assessment of plant phenotypes in the field can be labor-intensive and inefficient. High-throughput field phenotyping systems, and in particular robotic systems, play an important role in automating data collection and in measuring novel, fine-scale phenotypic traits that were previously unattainable by humans. The main goal of this paper is to review the state of the art of high-throughput field phenotyping systems, with a focus on autonomous ground robotic systems. The paper first provides a brief review of nonautonomous ground phenotyping systems, including tractors, manually pushed or motorized carts, gantries, and cable-driven systems. It then reviews autonomous ground phenotyping robots in detail with regard to their main components, including mobile platforms, sensors, manipulators, computing units, and software. It also reviews the navigation algorithms and simulation tools developed for phenotyping robots, as well as the applications of phenotyping robots in measuring plant phenotypic traits and collecting phenotyping datasets. The review concludes with a discussion of current major challenges and future research directions.
Year: 2022 | PMID: 36059604 | PMCID: PMC9394113 | DOI: 10.34133/2022/9760269
Source DB: PubMed | Journal: Plant Phenomics | ISSN: 2643-6515
Figure 1. Diagram of a phenotyping robot.
Summary of ground high-throughput field phenotyping (HTFP) systems.

| System | Sensors | Crop | Phenotypic traits | Advantages | Disadvantages |
|---|---|---|---|---|---|
| Tractor-based systems | | | | High payload | Large vibration |
| [ | RGB camera | Wheat | Canopy height | | |
| BreedVision [ | 3D camera | Triticale | Plant height | | |
| [ | Infrared temperature sensor | Cotton | Canopy height | | |
| [ | Ultrasonic sensor | Cotton | Plant height | | |
| Phenoliner [ | RGB camera | Grape | Plant count | | |
| GPhenoVision [ | RGBD camera | Cotton | Plant height | | |
| ProTractor [ | RGB camera | Brassica | Seedling count | | |
| Pushcarts | | | | Light weight | Not practical for large fields |
| [ | Ultrasonic sensor | Soybean | Canopy height | | |
| Phenocart [ | Spectral reflectance sensor | Wheat | Canopy temperature | | |
| Proximal sensing cart [ | Ultrasonic sensor | Cotton | Canopy height | | |
| Phenocart [ | RGB camera | Wheat | Biomass | | |
| [ | Hyperspectral camera | Tobacco | | | |
| Motorized pushcarts | | | | | |
| Professor [ | Not specified | Wheat | Not specified | | |
| [ | LiDAR sensor | Wheat | NDVI | | |
| Gantry systems | | | | Fully autonomous | Fixed experimental site |
| Field Scanalyzer [ | Thermal camera | Wheat | Plant morphology | | |
| PhénoField [ | LiDAR sensor | Wheat | Green cover fraction | | |
| Cable systems | | | | | |
| Field phenotyping platform [ | RGB camera | Wheat | Enhanced NDVI | | |
| NU-Spidercam [ | Multispectral camera | Maize | Canopy cover | | |
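Several of the systems above report NDVI from multispectral or spectral reflectance sensors. As a minimal sketch, the standard per-pixel computation on a NIR and a red reflectance band looks like the following (the array values are illustrative reflectances, not data from any of the cited studies):

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """NDVI = (NIR - Red) / (NIR + Red), bounded in [-1, 1].

    Pixels where both bands are zero (no signal) are returned as 0.
    """
    nir = nir.astype(float)
    red = red.astype(float)
    denom = nir + red
    out = np.zeros_like(denom)
    np.divide(nir - red, denom, out=out, where=denom != 0)
    return out

# Illustrative 2x2 reflectance patches
red = np.array([[0.10, 0.08], [0.30, 0.05]])
nir = np.array([[0.50, 0.40], [0.32, 0.45]])
print(ndvi(nir, red))  # vegetated pixels give high values, soil-like pixels near 0
```

Higher values indicate denser, healthier canopy; bare soil and senescent tissue fall near zero, which is why NDVI is a common proxy trait on these platforms.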
Figure 2. Phenotyping robots. (a) Vinobot [24]. (b) Shrimp [25]. (c) Robotanist [26]. (d) TerraSentia [27]. (e) VinBot [28]. (f) MARIA [29]. (g) RobHortic [30]. (h) AgBotII [31]. (i) Phenobot 1.0 [32]. (j) AgriRover-01 [33]. (k) DairyBioBot [34]. (l) Phenobot 3.0 [35]. (m) Thorvald II [36]. (n) Ladybird [37]. (o) Phenomobile V1 [38]. (p) Flex-Ro [39]. (q) MARS X [40]. (r) Armadillo Scout [41]. (s) PHENObot [42]. (t) A robot based on LT2 [43]. (u) TERRA-MEPP [44]. (v) Phenomobile V2 [45]. (w) BoniRob [46].
Summary of phenotyping robots categorized by drive mechanism. 4WD4WS: four-wheel drive and four-wheel steering; 2WD2WS: two-wheel drive and two-wheel steering.

| Robot | Phenotyping sensor | Perception sensor | Crop | Advantages | Disadvantages | Applications |
|---|---|---|---|---|---|---|
| Wheeled robots (skid steering) | | | | Simple mechanical structure | Low power efficiency in turning | |
| VinBot [ | RGB-D camera | RTK-GNSS, IMU | Grape | | | Yield estimation [ |
| Shrimp [ | 3D LiDAR | RTK-GNSS, IMU | Horticultural crops | | | Mango fruit detection [ |
| Robotanist [ | Stereo camera | RTK-GNSS, IMU, 2D LiDAR | Sorghum | | | Corn stalk count and width estimation [ |
| Vinobot [ | Trinocular camera | GNSS, 2D LiDAR | Maize | | | Simulation of Vinobot [ |
| TerraSentia [ | Multispectral camera, 2D LiDAR | GNSS | Maize | | | Corn stem width estimation [ |
| MARIA [ | 2D LiDAR | RTK-GNSS, IMU | Not specified | | | Simulation of an agricultural robot [ |
| [ | RGB-D camera | RTK-GNSS, 2D LiDAR | Maize | | | Corn stalk diameter estimation [ |
| Wheeled robots (differential drive) | | | | Simple mechanical structure | Less precise steering control | |
| RobHortic [ | Thermal camera | RTK-GNSS | Horticultural crops | | | Carrot disease detection [ |
| Wheeled robots (2WD2WS and Ackerman steering) | | | | Precise steering control | Needs coordination between drive and steering wheels | |
| AgBotII [ | RGB camera | RTK-GNSS | Row crops | | | Weed detection [ |
| Phenobot 1.0 [ | Stereo camera | RTK-GNSS | Sorghum | | | Sorghum plant height and stalk diameter estimation [ |
| AgriRover-01 [ | 3D LiDAR | RTK-GNSS | Corn | | | Plant height and row spacing estimation [ |
| DairyBioBot [ | 2D LiDAR | RTK-GNSS | Ryegrass | | | Ryegrass biomass estimation [ |
| Wheeled robots (articulated steering) | | | | Good mobility on rough terrain | Increased mechanical complexity | |
| Phenobot 3.0 [ | Stereo camera | RTK-GNSS | Sorghum | | | |
| Wheeled robots (4WD4WS) | | | | High maneuverability | Complex mechanical structure | |
| Thorvald II [ | Application dependent | GPS, IMU, 2D LiDAR | Row crops | | | Development of a strawberry harvesting robot [ |
| Ladybird [ | RGB camera, 2D LiDAR | 2D LiDAR | Row crops | | | Weed detection [ |
| Phenomobile V1 [ | 2D LiDAR | RTK-GNSS | Row crops | | | Estimation of plant height from LiDAR measurements [ |
| AgRover [ | Not specified | RTK-GNSS | Row crops | | | |
| Flex-Ro [ | RGB camera | GNSS | Row crops | | | |
| MARS X [ | Application dependent | RTK-GNSS | Row crops | | | |
| Tracked robots (differential drive) | | | | Good mobility on rough terrain | Complex mechanical structure | |
| Armadillo Scout [ | Application dependent | GNSS, 2D LiDAR | Not specified | | | |
| PHENObot [ | RGB camera | RTK-GNSS | Grape | | | Grape bunch and berry detection [ |
| [ | RGB camera | RTK-GNSS | Soybean | | | |
| [ | RGB camera | | | | | |
| TERRA-MEPP [ | Stereo camera | RTK-GNSS | Sorghum | | | |
| Phenomobile V2 [ | 2D LiDAR | RTK-GNSS, IMU | Wheat | | | |
| Wheel-legged robots (4WD4WS) | | | | High maneuverability | Complex mechanical structure | |
| BoniRob [ | Application dependent | RTK-GNSS | Row crops | | | Soil compaction and moisture measurement [ |
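Nearly every robot in the table relies on RTK-GNSS (often fused with an IMU) for row navigation. A core step in any GNSS-based row follower is projecting fixes into a local metric frame and computing the cross-track error against the planted row. The sketch below illustrates that step only; the function names are ours, not from any cited system, and a real controller would fuse IMU heading and handle row turns:

```python
import math

EARTH_RADIUS_M = 6_371_000.0

def to_local_en(lat, lon, origin_lat, origin_lon):
    """Project a WGS-84 fix to east/north meters relative to a field origin
    (equirectangular approximation, adequate at field scale)."""
    east = math.radians(lon - origin_lon) * EARTH_RADIUS_M * math.cos(math.radians(origin_lat))
    north = math.radians(lat - origin_lat) * EARTH_RADIUS_M
    return east, north

def cross_track_error(pos, row_start, row_end):
    """Signed perpendicular distance (m) from pos to the line row_start -> row_end.
    Positive means the robot is left of the row direction of travel."""
    dx, dy = row_end[0] - row_start[0], row_end[1] - row_start[1]
    px, py = pos[0] - row_start[0], pos[1] - row_start[1]
    # 2D cross product divided by row length gives the perpendicular offset
    return (dx * py - dy * px) / math.hypot(dx, dy)

# Usage: robot sits 0.5 m east of a row running due north
row_a, row_b = (0.0, 0.0), (0.0, 100.0)
err = cross_track_error((0.5, 0.0), row_a, row_b)  # -0.5 m: right of the row
```

A steering controller (e.g., proportional or pure pursuit) would then drive this error toward zero, which is why centimeter-level RTK corrections matter for keeping sensors centered over narrow crop rows.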
Figure 3. Simulation of various phenotyping robots. (a) Thorvald II robots in the Gazebo simulator [102]. (b) Simulation of an omnidirectional mobile robot in a typical vineyard in the Gazebo simulator [103]. (c) Simulation of a row-following robot in polytunnels [60]. (d) Simulation of LiDAR-based navigation in a row crop using MARIA [54]. (e) Simulation of Vinobot in the Gazebo simulator and its visualization in Rviz [51]. (f) Simulation of an agricultural field using AgROS [110]. (g) Simulation of BoniRob in the Gazebo simulator [46].
Applications of phenotyping robots.

| Crop | Key issues | Robot | Phenotyping sensor | Data processing method | Reference |
|---|---|---|---|---|---|
| Organ identification and counting | | | | | |
| Almond | Almond fruit detection | Shrimp [ | RGB camera | Use Faster R-CNN to detect fruit in color images | [ |
| Not mentioned | Plant detection and leaf counting | BoniRob [ | RGB camera | A customized single-stage object detection network based on an FCN | [ |
| Kiwifruit | Kiwifruit detection | Customized tracked robot | RGB camera | Extract image features and classify them with machine learning | [ |
| Mango | Mango fruit detection, localization, and yield prediction | Shrimp [ | RGB camera | Use Faster R-CNN to detect fruit in color images | [ |
| Crop detection and classification | | | | | |
| Corn | Corn plant detection and mapping | Volksbot RT-3 | LiDAR sensor | Detect a plane as the ground | [ |
| Corn | Corn stalk count and stalk width estimation | Robotanist [ | Stereo camera | Use Faster R-CNN to detect stalks and an FCN to obtain stalk masks | [ |
| Corn | Corn stand count | TerraSentia [ | RGB camera | Use Faster R-CNN to detect corn stands | [ |
| Carrot | Weed detection | BoniRob [ | Multispectral camera | Classify weeds and crop plants with a Random Forest classifier | [ |
| Sugar beet | Dataset collection for plant classification, localization, and mapping | BoniRob [ | Multispectral camera | | [ |
| Crop growth monitoring | | | | | |
| Sorghum | Sorghum height and stem diameter estimation | Phenobot 1.0 [ | Stereo camera | Reconstruct a dense point cloud from stereo images | [ |
| Sorghum | Sorghum height, width, stem diameter, plant volume, and surface area estimation | Phenobot 1.0 [ | Stereo camera | Use a convex hull to estimate plant volume and surface area | [ |
| Corn | Corn stalk diameter estimation | Customized skid-steering robot | RGB-D camera | Use YOLOv4 to detect corn stalks | [ |
| Corn | Plant height and leaf area index estimation | Vinobot [ | RGB camera | Construct a 3D point cloud of the plant using structure from motion | [ |
| Almond | Canopy volume mapping and flower, fruit, and yield estimation | Shrimp [ | RGB camera | Use color images to detect flowers and fruit | [ |
| Ryegrass | Ryegrass biomass yield estimation | DairyBioBot [ | LiDAR sensor | Estimate plant volume from the LiDAR point cloud and correlate it with yield | [ |
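Two of the point-cloud traits in the table, plant height and convex-hull volume, reduce to short geometric computations once a plant segment has been isolated from the stereo or LiDAR cloud. The following sketch uses a synthetic point cloud as a stand-in for real sensor data; the percentile-based height and the hull-based volume are common choices in this literature, though the exact parameters vary by study:

```python
import numpy as np
from scipy.spatial import ConvexHull

# Synthetic plant segment: points filling a 0.4 x 0.4 x 1.2 m region above
# ground (z = 0), standing in for a segmented stereo/LiDAR point cloud.
rng = np.random.default_rng(0)
cloud = rng.uniform([0.0, 0.0, 0.0], [0.4, 0.4, 1.2], size=(2000, 3))

# Plant height: a high z-percentile minus a robust ground estimate, which is
# less sensitive to stray points than the raw min/max.
ground_z = np.percentile(cloud[:, 2], 1)
height = np.percentile(cloud[:, 2], 99) - ground_z  # ~1.2 m for this cloud

# Plant volume and surface area from the 3D convex hull of the segment; the
# hull is an upper bound on the true canopy volume.
hull = ConvexHull(cloud)
volume = hull.volume        # m^3
surface_area = hull.area    # m^2
```

Convex-hull volume overestimates sparse, open canopies, which is one reason voxel- or alpha-shape-based volumes are sometimes preferred; for compact canopies such as the sorghum and ryegrass examples above, the hull correlates well with biomass.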