Spyros Fountas, Nikos Mylonas, Ioannis Malounas, Efthymios Rodias, Christoph Hellmann Santos, Erik Pekkeriet.
Abstract
Modern agriculture is tied to a revolution in a broad group of technologies (e.g., informatics, sensors, navigation) that has taken place over the last decades. In crop production systems, some field operations are highly labour-intensive, whether because of their complexity, because they involve delicate interaction with sensitive plants or edible products, or because of the repetitiveness they demand throughout a crop production cycle. These are the key drivers for the development of agricultural robots. In this paper, a systematic review of the literature has been conducted on research and commercial agricultural robotics used in crop field operations. The study found that the most explored robotic systems were related to harvesting and weeding, while disease detection and seeding robots were the least studied. The optimisation and further development of agricultural robotics are vital, and should proceed through faster processing algorithms, better communication between robotic platforms and implements, and advanced sensing systems.
Keywords: autonomous vehicles; crops; execution; field operations; perception
Year: 2020 PMID: 32392872 PMCID: PMC7273211 DOI: 10.3390/s20092672
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Figure 1. Steps of review methodology.
Weeding agrirobotic systems’ characteristics for various crops.
| Crop | Perception Sensors | Weed Detection | Weed Control | Results | Cited Work |
|---|---|---|---|---|---|
| Maize | Cameras, optical and acoustic distance sensors | Yes | Chemical | No performance metrics provided | [ |
| Carrot | RGB infrared camera | Partly | Chemical | 100% effectiveness with the DoD system | [ |
| Potato, corn | Webcam, solid-state gyroscope | Partly | Chemical | 98% and 89% detection accuracy | [ |
| Sugar beet | Color camera | Yes | Mechanical | Row detection precision < 25 mm. | [ |
| N/A | Stereo vision system, laser | No | Mechanical | Precision < 3 cm. | [ |
| Rice | Laser range finder, IMU | No | Mechanical | Precision < 62 mm | [ |
| Beetroot | Color camera, artificial vision, compass | Yes | Chemical | >85% detection & destruction, precision < 2 cm | [ |
| Grapes | IMU, hall sensors, electromechanical sensor, a sonar sensor | No | Mechanical | Average performance: 65% (feeler) & 82% (sonar) | [ |
| N/A | Accelerometer, gyroscope, flex sensor | No | Mechanical | No performance metrics provided | [ |
| Tomato | Color camera, SensorWatch | Partly | Chemical | 24.2% were incorrectly identified and sprayed and 52.4% of the weeds were not sprayed. | [ |
Seeding agrirobotic systems’ characteristics for wheat and rice.
| Crop | Perception | Results | Cited Work |
|---|---|---|---|
| Wheat | Force sensor, displacement sensor, angle sensor | The path tracking errors are ±5 cm and the angle errors are approximately zero. | [ |
| Wheat | Signal sensor, angle, pressure & infrared sensors | Qualified rate: 93.3% | [ |
| Rice | Compass, wheel encoder | 92% accuracy & 5 cm error in the dropping position. | [ |
Disease and insect detection agrirobotic systems features for various crops.
| Crop | Perception | Detected Disease | Highest Accuracy | Cited Work |
|---|---|---|---|---|
| Bell pepper | RGB camera, multispectral camera, laser sensor | Powdery mildew & tomato spotted wilt virus | 95% & 90% | [ |
| Cotton, groundnut | RGB camera | Cotton (Bacterial blight, magnesium deficiency), groundnut (leaf spot & anthracnose) | ~90%, 83–96% | [ |
| Olive tree | Two DSLR cameras (one in BNDVI mode), a multispectral camera, a hyperspectral system in visible and NIR range, a thermal camera, LiDAR, an IMU sensor(*) | N/A | N/A | [ |
| Tomato, rice | RGB camera | N/A | 94.3% | [ |
| Strawberry | RGB camera | Powdery mildew | 72–95% | [ |
(*)DSLR: Digital Single-lens Reflex. BNDVI: Blue Normalised Difference Vegetation Index. NIR: near-infrared. LiDAR: Light detection and ranging.
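The footnote expands BNDVI; as background, indices of this family are simple normalised band ratios computed per pixel from reflectance values. A minimal sketch (the reflectance values below are illustrative, not taken from the paper):

```python
def ndvi(nir: float, red: float) -> float:
    """Normalised Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red)

def bndvi(nir: float, blue: float) -> float:
    """Blue NDVI, the variant used by the olive-tree DSLR setup:
    (NIR - Blue) / (NIR + Blue)."""
    return (nir - blue) / (nir + blue)

# Healthy vegetation reflects strongly in NIR and weakly in the visible bands,
# so both indices approach +1 over dense canopy and 0 or below over bare soil.
print(round(ndvi(0.50, 0.08), 3))   # 0.724
print(round(bndvi(0.50, 0.05), 3))  # 0.818
```

In practice the same formula is applied element-wise to whole camera frames (e.g., NumPy arrays) rather than to single scalars.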
Plant vigor monitoring agrirobotic systems for various crops.
| Crop | Perception | Results | Cited Work |
|---|---|---|---|
| No specific crop | High-resolution stereo-cameras, 3D LiDAR | Soil sampling. No performance metrics provided. | [ |
| No specific crop | CO2 gas sensor, anemoscope, IR distance measuring sensor | Gas source tracking. CO2 concentration levels up to 2500 ppm were recorded while the robot was moving at a speed of 2 m/min. | [ |
| Orchards | LiDAR, luxmeter | Canopy volume estimation. The system is independent of the light conditions, it is highly reliable and data processing is very fast. | [ |
| Grapes | RGB & IR camera, laser range finder, IMU, pressure sensor, etc. | Crop monitoring tasks. No performance metrics provided. | [ |
| Orchards and vineyards | LiDAR, OptRX sensor | Monitor health status and canopy thickness. Terrestrial Laser Scanning (TLS): 2 mm distance accuracy | [ |
| Canola | Ultrasonic sensors, NDVI sensors, IR thermometers, RGB camera | Gather phenotypic data. Maximum measurement error: 2.5% | [ |
Phenotyping agrirobotic systems for various crops.
| Crop | Perception | Autonomy Level | Results | Cited Work |
|---|---|---|---|---|
| Maize & wheat | Cameras, spectral imaging systems, laser sensors, 3D time-of-flight cameras | Autonomous | No performance metrics provided | [ |
| Cotton | Stereo RGB & IR thermal camera, temperature, humidity, light intensity sensors, pyranometer, quantum, LiDAR | Semi-autonomous | RMS error: | [ |
| Sorghum | Stereo camera, RGB camera with fish eye lenses, penetrometer | Autonomous | Stalk detection: 96% | [ |
| Rice, maize & wheat | RGB camera, chlorophyll fluorescence camera, NDVI camera, thermal infrared camera, hyperspectral camera, 3D laser scanner | Fixed site fully automated | Plant height RMS error: 1.88 cm | [ |
| Sugar Beet | Mobile robot: Webcam camera, gigaethernet camera | Autonomous | No performance metrics provided | [ |
| Sorghum | Stereo imaging system consisting of color cameras | Autonomous based on commercial tractor | The image-derived measurements were highly repeatable & | [ |
| Energy Sorghum | Stereo camera, time of flight depth sensor, IR camera | Semi-autonomous | Average absolute error for stem width and plant height: 13% and 15%. | [ |
Spraying agrirobotic systems’ features for various crops.
| Crop | Perception | Real-time Detection | Results | Cited Work |
|---|---|---|---|---|
| Cantaloupe | Robot controller | No | NSGA-II execution time was 1.5–7% better than NSGA-III for the same test cases. (*) | [ |
| N/A | Web camera | Yes | 27% off-target shots, 99.8% of the targets were sprayed by at least one shot. | [ |
| Cucumber | Bump sensors, infra-red sensors, induction sensors | No | Run success: 90% & 95%, topside leaf coverage: 95% & 90%, underside leaf coverage: 95% & 80%, over-spray: 20% & 10% for Tests 1 & 2 | [ |
| N/A | Ultrasonic sensors | No | Maximum error varied from 1.2–4.5 cm in self-contained mode on concrete, and from 2.2–4.9 cm in trailer mode. | [ |
| Grapevine | Ultrasonic sensor, color TV camera | No | No performance metrics. | [ |
| N/A | Middle-range sonar, short-range sonar, radar, compass | No | Longitudinal, lateral, and orientation errors are close to zero when slip of 10–30% is included. | [ |
| Grapevine | RGB camera, R-G-NIR multispectral camera | Yes | The sensitivity of the robotic selective spraying was 85%, the selectivity was 92% (8% of the total healthy area was sprayed unnecessarily). | [ |
| Vegetable crops | Hyperspectral camera, stereo vision, thermal IR camera, monocular color camera | Yes | The greedy sort and raster methods are substantially faster than back-to-front scheduling, taking only 68% to 77% of the time. | [ |
(*)NSGA: Non-dominated Sorting Genetic Algorithm.
Harvesting agrirobotic systems’ features for various crops.
| Crop | Perception | Fastest Picking Speed | Highest Picking Rate | Cited Work |
|---|---|---|---|---|
| Alfalfa, Sudan grass | Color camera, gyroscope | 2 ha/h (alfalfa) | N/A | [ |
| Apple tree | Color camera, time-of-flight three-dimensional camera | 7.5 sec/fruit | 84% | [ |
| Apple tree | Color CCD (Charge Coupled Device) camera, laser range sensor | 7.1 sec/fruit | 89% | [ |
| Apple tree | High-frequency light, camera | 9 sec/fruit | 80% | [ |
| Cherry | 3D vision sensor with red and IR laser diodes, pressure sensor | 14 sec/fruit | N/A | [ |
| Mushroom | Laser sensor, vision sensor | 6.7 sec/mushroom | 69% | [ |
| Asparagus | 3D vision sensor with two sets of slit laser projectors & a TV camera | 13.7 sec/asparagus | N/A | [ |
| Strawberry | Sonar camera sensor, binocular camera | 31.3 sec/fruit | 86% | [ |
| Strawberry | Color CCD cameras, reflection-type photoelectric sensor | 8.6 sec/fruit | 54.9% | [ |
| Strawberry | LED light source, three-color CCD cameras, photoelectric sensor, suction device | 11.5 sec/fruit | 41.3% with a suction device | [ |
| Strawberry | Color CCD camera, visual sensor | 10 sec/fruit | N/A | [ |
| Strawberry | Three VGA (Video Graphics Array) class CCD color cameras (stereo vision system and center camera) | N/A | 46% | [ |
| Strawberry | RGB-D camera, 3 IR sensors | 10.6 sec/fruit | 53.6% | [ |
| Tomato | Stereo camera, PlayStation camera | 23 sec/tomato | 60% | [ |
| Tomato | Binocular stereo vision system, laser sensor | 15 sec/tomato | 86% | [ |
| Cherry tomato | Camera, laser sensor | 8 sec/tomato bunch | 83% | [ |
| Cucumber | Two synchronized CCD cameras | 45 sec/cucumber | 80% | [ |
| Various fruits | Pressure sensor, 2 convergent IR sensors, telemeter, cameras | 2 sec/fruit (only grasp & detach) | N/A | [ |
| Melon | Two black-and-white CCD cameras, proximity sensor, far and near vision sensors | 15 sec/fruit | 85.67% | [ |
| Eggplant | Single CCD camera, photoelectric sensor | 64.1 sec/eggplant | 62.5% | [ |
| Watermelon | Two CCD cameras, vacuum sensor | N/A | 66.7% | [ |
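The two metric columns can be combined into a rough effective-throughput figure: cycle rate (fruits attempted per hour) times picking success rate. A back-of-the-envelope sketch (the conversion is ours, not a metric reported in the review):

```python
def fruits_per_hour(sec_per_fruit: float, success_rate: float) -> float:
    """Effective harvested fruits per hour.

    sec_per_fruit: full picking-cycle time per fruit, in seconds.
    success_rate:  fraction of attempts that yield a picked fruit (0-1).
    """
    return 3600.0 / sec_per_fruit * success_rate

# Example from the table: an apple robot at 7.5 sec/fruit with an 84% picking
# rate harvests roughly 400 apples per hour.
print(round(fruits_per_hour(7.5, 0.84)))  # 403
```

This ignores travel between trees, fruit localisation failures counted outside the cycle time, and damaged fruit, so it is an upper bound rather than a field estimate.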
Figure 2. Global illustration of the reviewed robots’ overall performance.
Plant management agrirobotic systems for various crops.
| Crop | Perception | Operation | Cited Work |
|---|---|---|---|
| Apple tree | Force sensor | pruning | [ |
| Hop | Camera, wire detecting bar | string twining | [ |
| Grape | Three cameras | pruning | [ |
Multi-purpose agrirobotic systems for various crops.
| Working Environment | Perception | Operation(s) | Cited Work |
|---|---|---|---|
| Arable crops | N/A | Ploughing, seeding | [ |
| Arable crops | Sonar sensor, temperature | Monitoring, spraying, fertilization, disease detection | [ |
| Arable crops | Three cameras with IR filter, humidity & ultrasonic sensors | Ploughing, seeding, harvesting, spraying | [ |
| Arable crops, polytunnels, greenhouse | 2D LiDAR, ultrasonic sensors, an RGB camera & a monochromatic IR camera | Harrowing, soil sampling, phenotyping, additional tasks by combining modules | [ |
| Arable crops | Soil sensors | Ploughing, irrigation, seeding | [ |
| Greenhouse | Two color cameras, machine vision sensor | Spraying, weeding, additional tasks by adding or removing components | [ |
| Arable crops | Humidity and temperature air sensor, IR sensors | Ploughing, seeding, irrigation, spraying, monitoring | [ |
| Urban crops | Color camera, temperature, humidity and luminosity sensors | Sowing, irrigation, fumigation, pruning | [ |
| Arable crops | Voltage sensors | Seeding, spraying, ploughing, mowing | [ |
| Arable crops | 2D LiDAR, ultrasonic sensor, IR camera | Seeding, weeding, ploughing | [ |
| Arable crops | N/A | Sowing, sprinkling, weeding, harvesting | [ |
| Arable crops | CloverCam, RoboWeedCamRT | Seeding, weeding | [ |
| Vineyards | N/A | Pruning, weeding, mowing | [ |
| Arable crops | N/A | Precision seeding, ridging discs & mechanical row crop cleaning | [ |
Figure 3. Representation of various types of agricultural robots ((1) weeding [23], (2) seeding [47], (3) disease and insect detection [55], (4) plant monitoring [69], (5) phenotyping [78], (6) spraying [86], (7) harvesting [100], (8) plant management [152], (9) multi-purpose [137]).
Figure 4. Number of reviewed robots per field operation.
Figure 5. The main crops in correlation with the number of robotic systems.
Figure 6. Allocation (%) of robotic systems in various agricultural environments.