Luis Emmi, Mariano Gonzalez-de-Soto, Gonzalo Pajares, Pablo Gonzalez-de-Santos.
Abstract
Computer-based sensors and actuators such as global positioning systems, machine vision, and laser-based sensors have progressively been incorporated into mobile robots with the aim of configuring autonomous systems capable of shifting operator activities in agricultural tasks. However, the incorporation of many electronic systems into a robot impairs its reliability and increases its cost. Hardware minimization, as well as software minimization and ease of integration, is essential to obtain feasible robotic systems. A step forward in the application of automatic equipment in agriculture is the use of fleets of robots, in which a number of specialized robots collaborate to accomplish one or several agricultural tasks. This paper strives to develop a system architecture for both individual robots and robots working in fleets to improve reliability, decrease complexity and costs, and permit the integration of software from different developers. Several solutions are studied, from a fully distributed to a whole integrated architecture in which a central computer runs all processes. This work also studies diverse topologies for controlling fleets of robots and advances other prospective topologies. The architecture presented in this paper is being successfully applied in the RHEA fleet, which comprises three ground mobile units based on a commercial tractor chassis.
Year: 2014 PMID: 25143976 PMCID: PMC3985338 DOI: 10.1155/2014/404059
Source DB: PubMed Journal: ScientificWorldJournal ISSN: 1537-744X
Figure 1. Main systems comprising a current autonomous agricultural application and some examples of sensor and actuation systems normally found in this type of application.
Examples of autonomous vehicles for agricultural applications developed around the world.
| Author/Centre | Application | Sensorial System | Results |
|---|---|---|---|
| Blackmore et al., 2004. Dept. of Agricultural Sciences, Frederiksberg, Denmark | Automatically steered tractor capable of following a predefined route plan | RTK-GPS for localization | The automatically steered tractor can follow a predetermined route to within a few centimeters |
| Cho and Lee, 2000. Department of Agricultural Engineering, Seoul National University, Korea | Autonomous operation of a speedsprayer (a power sprayer used in orchards) in an orchard | DGPS for localization; ultrasonic sensors for obstacle detection | Autonomous speedsprayer operation within a 50-cm deviation; the speedsprayer could avoid trees |
| Hague et al., 2000. Silsoe Research Institute, Wrest Park, UK | Ground-based sensing methods for vehicle position fixing | Sensor package: machine vision, odometers, accelerometers, and a compass | Sensor fusion reduced the noise level of the odometric data and eliminated drift |
| Subramanian et al., 2006. Department of Agricultural and Biological Engineering, University of Florida, USA | Autonomous guidance system for use in a citrus grove | Machine vision and laser radar (LADAR) | Machine vision guidance: average error of 2.8 cm; LADAR guidance: average error of 2.5 cm |
| Xue et al., 2012. Department of Agricultural and Biological Engineering, University of Illinois, USA | Variable field-of-view machine vision method for agricultural robot navigation between rows in cornfields | Machine vision with pitch and yaw motion control | Maximum guidance error of 15.8 mm and stable navigational behavior |
Examples of autonomous implements for agricultural applications developed around the world.
| Author/Centre | Application | Sensorial System | Results |
|---|---|---|---|
| Blasco et al., 2002. Instituto Valenciano de Investigaciones Agrarias (IVIA), Spain | Non-chemical weed controller for vegetable crops | Two machine vision systems: one in front of the robot for weed detection; the other for correcting inertial perturbations | The system eliminated 100% of small weeds and properly located 84% of weeds and 99% of lettuces |
| Lee et al., 1999. Biological and Agricultural Engineering, University of California, USA | Real-time intelligent robotic weed control system for selective herbicide application to in-row weeds | Two machine vision systems: one in front of the robot for guidance; the other for weed detection | 24.2% of the tomatoes were incorrectly identified and sprayed, and 52.4% of the weeds were not sprayed |
| Leemans and Destain, 2007. Gembloux Agricultural University, Belgium | Positioning seed drills relative to the previous lines while sowing | Machine vision for guidance | The standard deviation of the error was 23 mm, with a range of less than 100 mm |
| Pérez-Ruiz et al., 2012. University of California, Davis, Department of Biological and Agricultural Engineering, USA | Automatic mechanical intra-row weed control for transplanted row crops | RTK-GPS for controlling the path of a pair of intra-row weed knives | A mean error of 0.8 cm in centering the actual uncultivated close-to-crop zone about the tomato main stems, with standard deviations of 1.75 and 3.28 cm at speeds of 0.8 and 1.6 km/h, respectively |
Figure 2. General frameworks of a fully autonomous crop operation. (a) Basic elements of agricultural vehicle guidance systems. (b) Basic elements of autonomous implements.
Figure 3. The RHEA fleet (ground mobile units and implements).
Figure 4. (a) Initial commercial tractor, (b) final RHEA mobile unit, (c) external equipment onboard the mobile units, and (d) internal equipment distribution inside the mobile unit's cabin.
Figure 5. Implements controlled by the RHEA system: (a) boom sprayer, (b) flame hoe, and (c) canopy sprayer.
Figure 6. General schema of the fleet-of-robots topology for the RHEA project.
Figure 7. General scheme of the hardware architecture for the autonomous mobile robot in the RHEA project.
Figure 8. Comparison of the distributed approach (a) and the centralized approach (b) in the Weed Detection System regarding the use of resources, information availability, and communication time.
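One plausible reason the centralized approach of Figure 8(b) reduces communication time is that, when acquisition and weed detection run in the same process, an image can be handed over by pointer instead of being serialized and transmitted over a network link, as the distributed layout of Figure 8(a) would require; this is consistent with the image-sharing times reported in table (a) below. The following is a minimal sketch of such an in-process hand-over, assuming a POSIX threads environment; the function names are illustrative and this is not RHEA code.

```c
/*
 * Minimal sketch (not RHEA code): in a centralized architecture the image
 * acquisition and weed-detection routines run in the same process, so a
 * frame is "shared" by swapping a pointer under a mutex instead of
 * serializing the whole buffer over a network link, as a distributed node
 * layout would require. All names are illustrative.
 */
#include <pthread.h>
#include <stdint.h>
#include <stdlib.h>

static uint8_t *shared_frame = NULL;          /* latest acquired frame */
static pthread_mutex_t frame_lock = PTHREAD_MUTEX_INITIALIZER;

/* Acquisition thread: publish a newly captured frame (a pointer swap). */
void publish_frame(uint8_t *new_frame)
{
    pthread_mutex_lock(&frame_lock);
    uint8_t *old = shared_frame;              /* drop any unconsumed frame */
    shared_frame = new_frame;
    pthread_mutex_unlock(&frame_lock);
    free(old);
}

/* Weed-detection thread: take ownership of the latest frame, if any. */
uint8_t *take_frame(void)
{
    pthread_mutex_lock(&frame_lock);
    uint8_t *frame = shared_frame;
    shared_frame = NULL;
    pthread_mutex_unlock(&frame_lock);
    return frame;                             /* caller processes and frees it */
}
```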
Figure 9. Example of the procedure for calling external code in LabVIEW using DLLs.
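Figure 9 illustrates calling external code from LabVIEW through a DLL. As a hedged illustration, the snippet below shows what the C side of such an interface could look like: a function exported from a shared library that a LabVIEW Call Library Function Node could invoke, with the image passed as an array data pointer. The function name and the simple vegetation test are hypothetical and are not taken from the RHEA implementation.

```c
/*
 * Illustrative C side of external code packaged as a DLL / shared library
 * so that a LabVIEW VI can call it through a Call Library Function Node.
 * The function name and the vegetation test are hypothetical examples,
 * not the routine used in RHEA.
 */
#include <stdint.h>

#ifdef _WIN32
#define EXPORT __declspec(dllexport)
#else
#define EXPORT
#endif

/* Count pixels whose green channel dominates in an interleaved RGB image
 * passed from LabVIEW as an array data pointer. */
EXPORT int32_t count_green_pixels(const uint8_t *rgb,
                                  int32_t width, int32_t height)
{
    int32_t count = 0;
    for (int32_t i = 0; i < width * height; ++i) {
        const uint8_t *px = rgb + 3 * i;
        if (px[1] > px[0] && px[1] > px[2])   /* G > R and G > B */
            ++count;
    }
    return count;
}
```

On the LabVIEW side, the Call Library Function Node would be configured with matching parameter types: an array data pointer for the image and signed 32-bit integers for the dimensions and the return value.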
Figure 10. Prospective hardware configuration of the RHEA system.
Figure 11. General diagram of the high-level decision-making system indicating three levels of main subsystems, their outputs, and their interactions with other subsystems. (a) Principal outputs (green boxes) of the lower subsystems. (b) Flow between sensory systems, control systems, and navigation process execution.
Figure 12. Master-slave configuration.
Figure 13. Immerse configuration.
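Figures 12 and 13 depict the fleet-control topologies considered. As a rough illustration of the master-slave case, the sketch below shows a base station (master) sending a waypoint command to one ground mobile unit (slave) over UDP; the packet layout, port number, and POSIX socket usage are assumptions for illustration and do not describe the actual RHEA communication protocol.

```c
/*
 * Minimal master-slave sketch: the base station (master) sends a waypoint
 * command to one ground mobile unit (slave) over UDP. Packet layout, port,
 * and the POSIX socket API are illustrative assumptions, not the RHEA
 * protocol.
 */
#include <arpa/inet.h>
#include <stdint.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

#pragma pack(push, 1)
typedef struct {
    uint8_t unit_id;        /* which ground mobile unit (1..3)   */
    uint8_t command;        /* e.g. 0 = stop, 1 = go to waypoint */
    double  target_lat;     /* next waypoint, WGS84 degrees      */
    double  target_lon;
    float   speed_mps;      /* commanded ground speed            */
} unit_command_t;
#pragma pack(pop)

/* Returns 0 on success, -1 on failure. */
int send_command(const char *unit_ip, const unit_command_t *cmd)
{
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    if (sock < 0)
        return -1;

    struct sockaddr_in addr;
    memset(&addr, 0, sizeof(addr));
    addr.sin_family = AF_INET;
    addr.sin_port   = htons(5005);                 /* assumed port */
    inet_pton(AF_INET, unit_ip, &addr.sin_addr);

    ssize_t n = sendto(sock, cmd, sizeof(*cmd), 0,
                       (struct sockaddr *)&addr, sizeof(addr));
    close(sock);
    return n == (ssize_t)sizeof(*cmd) ? 0 : -1;
}
```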
Figure 14. GPS positions recorded for each unit during the mission execution. The black circles represent the origin point of each unit.
Figure 15. Snapshots of the occupancy grid mapping for collision detection. (a) Collision detection results at 10 seconds of mission execution time. (b) Collision detection results at 25 seconds of mission execution time.
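Figure 15 shows occupancy grid mapping used for collision detection among the fleet units. A minimal sketch of the underlying idea follows: the field is discretized into cells, each unit marks the cell it occupies, and a predicted pose is flagged as a collision if any occupied cell lies within its footprint. Grid size, cell resolution, and the circular footprint are illustrative assumptions, not RHEA code.

```c
/*
 * Minimal sketch of occupancy-grid collision checking in the spirit of
 * Figure 15. Grid size, resolution, and the footprint model are
 * illustrative assumptions, not RHEA code.
 */
#include <math.h>
#include <stdbool.h>

#define GRID_W 200
#define GRID_H 200
#define CELL_M 0.5f                  /* each cell covers 0.5 m x 0.5 m */

static unsigned char grid[GRID_H][GRID_W];   /* 0 = free, 1 = occupied */

/* Mark the cell containing a unit's (x, y) field position as occupied. */
void mark_unit(float x, float y)
{
    int cx = (int)(x / CELL_M), cy = (int)(y / CELL_M);
    if (cx >= 0 && cx < GRID_W && cy >= 0 && cy < GRID_H)
        grid[cy][cx] = 1;
}

/* A predicted pose collides if an occupied cell lies within radius r (m). */
bool collision(float x, float y, float r)
{
    int span = (int)ceilf(r / CELL_M);
    int cx = (int)(x / CELL_M), cy = (int)(y / CELL_M);
    for (int dy = -span; dy <= span; ++dy) {
        for (int dx = -span; dx <= span; ++dx) {
            int gx = cx + dx, gy = cy + dy;
            if (gx < 0 || gx >= GRID_W || gy < 0 || gy >= GRID_H)
                continue;
            if (grid[gy][gx] && hypotf(dx * CELL_M, dy * CELL_M) <= r)
                return true;
        }
    }
    return false;
}
```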
Figure 16. Accumulated distance traveled by each fleet unit as a function of mission time.
(a)
| Structure | Image acquisition time | FPS acquired | Image processing time | FPS processed | Image sharing time | Other processes running |
|---|---|---|---|---|---|---|
| Original structure | 75–150 ms | 5 | 150–250 ms | 4 | 150–200 ms | 0 |
| Proposed structure | 80–160 ms | 5 | 200–250 ms | 4 | 1 ms | 4 |
(b)
| Other processes running | Scheduled period |
|---|---|
| Path-following supervision routine | 100 ms |
| Steering and throttle control routine | 10 ms |
| Telemetry routine | 100 ms |
| Localization routine | 100 ms |
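Table (b) lists the routines that share the central computer with the vision pipeline and their scheduled periods. A minimal sketch of how such fixed periods might be honored is given below: a 10 ms base tick runs the steering and throttle control every cycle, and every tenth tick dispatches the 100 ms routines. The routine bodies are empty placeholders and the POSIX clock_nanosleep pacing is an assumption, not the RHEA scheduler.

```c
/*
 * Minimal sketch of fixed-period scheduling matching the periods in
 * table (b). Routine bodies are placeholders; the pacing mechanism is an
 * assumption, not the RHEA scheduler.
 */
#include <time.h>

static void steering_throttle_control(void)  { /* 10 ms control law   */ }
static void path_following_supervision(void) { /* 100 ms supervision  */ }
static void telemetry(void)                  { /* 100 ms telemetry    */ }
static void localization(void)               { /* 100 ms localization */ }

void scheduler_loop(void)
{
    const long tick_ns = 10L * 1000 * 1000;      /* 10 ms base period */
    struct timespec next;
    clock_gettime(CLOCK_MONOTONIC, &next);

    for (unsigned long cycle = 0; ; ++cycle) {
        steering_throttle_control();             /* every 10 ms  */
        if (cycle % 10 == 0) {                   /* every 100 ms */
            path_following_supervision();
            telemetry();
            localization();
        }

        /* Advance an absolute deadline and sleep until it is reached,
         * so timing jitter does not accumulate across cycles. */
        next.tv_nsec += tick_ns;
        if (next.tv_nsec >= 1000000000L) {
            next.tv_nsec -= 1000000000L;
            next.tv_sec  += 1;
        }
        clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, NULL);
    }
}
```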