Gabriel J. Garcia, Juan A. Corrales, Jorge Pomares, Fernando Torres.
Abstract
Sensors provide robotic systems with the information required to perceive the changes that happen in unstructured environments and to modify their actions accordingly. The robotic controllers which process and analyze this sensory information are usually based on three types of sensors (visual, force/torque and tactile), which identify the most widespread robotic control strategies: visual servoing control, force control and tactile control. This paper presents a detailed review of the sensor architectures, algorithmic techniques and applications developed by Spanish researchers to implement these mono-sensor controllers and multi-sensor controllers combining several sensors.
Keywords: force control; multi-sensor control; robotics; tactile sensors; visual servoing
Year: 2009 PMID: 22303146 PMCID: PMC3267194 DOI: 10.3390/s91209689
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Figure 1. “Look and move” scheme.
Figure 2. (a) Stereo eye-to-hand visual servoing configuration. (b) Eye-in-hand visual servoing configuration.
Figure 3. Direct visual servoing scheme.
Figure 4. Position-based visual servoing scheme.
Figure 5. (a) Image acquired from an external camera. (b) Image acquired from the camera mounted on the robot end-effector.
Figure 6. Image-based visual servoing scheme.
Summary of Spanish research on visual servoing.

| Ref. | Sensor configuration | Control technique | Application |
|---|---|---|---|
| [ ] | Eye-in-hand | Feedforward neural network | Industrial inspection |
| [ ] | Eye-in-hand | Reinforcement-learning-based neural network | Grasping of an object on a table |
| [ ] | Eye-in-hand | Reinforcement-learning-based neural network | Autonomous submarine for underwater cable inspection tasks |
| [ ] | Eye-in-hand | Discrete-time cellular neural networks | Test of the proposed visual servoing scheme |
| [ ] | Eye-in-hand and eye-to-hand | Image-based visual servoing | Open visual servoing architecture |
| [ ] | Simulated camera | Image-based visual servoing | Visual servoing simulator in Matlab/Simulink |
| [ ] | Eye-in-hand | Position-based visual servoing with change of the camera-object frame | Simple tests of the proposed approach |
| [ ] | Eye-in-hand mono and stereo rig | Image-based visual servoing | Test of the proposed algorithms |
| [ ] | Eye-in-hand stereo rig | Stereo image-based visual servoing with 2D point and major-axis orientation features | Testbed for a classic position-based scheme |
| [ ] | Eye-in-hand | Position-based visual servoing | Testbed for a classic position-based scheme |
| [ ] | Eye-in-hand | Position-based visual servoing | Test of the proposed controller |
| [ ] | Eye-to-hand | Position-based visual servoing | Internet Tele-Lab for learning visual servoing techniques |
| [ ] | Eye-in-hand | Position-based visual servoing | Testbed for an autonomous satellite repairer |
| [ ] | Eye-in-hand | Position-based direct visual servoing | Visual control of a 2-degree-of-freedom robot |
| [ ] | Eye-in-hand | Position-based direct visual servoing | RoboTenis: a parallel robot playing table tennis |
| [ ] | Eye-in-hand | Position-based visual servoing based on scene reconstruction using homographies | Mobile robot navigation |
| [ ] | Eye-to-hand stereo rig | Image-based visual servoing with the interaction matrix estimated online from the epipolar geometry | Test of the proposed online interaction matrix estimation |
| [ ] | Eye-in-hand | Image-based visual servoing with a complete interaction matrix, using head movements when a stability problem is detected | Aibo visual control to track the ball in a RoboCup soccer match |
| [ ] | Eye-in-hand | Image-based visual servoing with online camera calibration | Test of the proposed algorithm |
| [ ] | Eye-in-hand | Image-based visual servoing solving the visibility problem | Test of visual servoing tasks with outliers |
| [ ] | Eye-to-hand fisheye stereo rig | Image-based visual servoing with panoramic cameras | Safety for a robot arm moving in close proximity to human beings |
| [ ] | Eye-in-hand | Image-based visual servoing with structured-light external visual features | Plane-to-plane positioning tasks |
| [ ] | Eye-in-hand | Image-based visual servoing based on homography decomposition | Simulation of the proposed control scheme |
| [ ] | Eye-in-hand | Switched homography-based visual control | Mobile robot navigation |
| [ ] | Eye-in-hand | Image-based visual servoing with epipolar geometry as visual features | Mobile robot navigation |
| [ ] | Eye-in-hand | Image-based visual servoing with epipoles as visual features | Mobile robot navigation |
| [ ] | Eye-in-hand | Motion estimator from multiple planar homographies | Vision-based control of unmanned aerial vehicles (UAVs) |
| [ ] | Eye-in-hand | Image-based visual servoing with epipolar geometry as visual features | Mobile robot navigation |
| [ ] | Eye-in-hand | Tracking of a line with a Kalman filter predictor | Autonomous submarine for underwater cable inspection tasks |
| [ ] | Eye-in-hand | Image-based visual servoing | Tracking of predefined paths |
| [ ] | Eye-in-hand | Image-based visual servoing | Tracking of predefined paths when changing a faulty light bulb |
| [ ] | Eye-in-hand stereo rig | Stereo image-based visual servoing with stacked-mono and stereo-real interaction matrices | Test of the proposed algorithms |
| [ ] | Eye-in-hand stereo rig | Stereo image-based visual servoing with disparity features | Test of the proposed algorithms |
| [ ] | Eye-in-hand stereo rig | Stereo image-based visual servoing with grasping-point features | Grasping of different objects |
| [ ] | Eye-in-hand | Image-based visual servoing | Visual servoing of an autonomous helicopter |
| [ ] | Eye-to-hand | Image-based visual servoing | Automatic chaser car in a slot game |
| [ ] | Eye-in-hand | Image-based visual servoing | Peg-in-hole task in motion |
| [ ] | Eye-in-hand | Image-based visual servoing | Tests of the proposed motion estimator |
| [ ] | Eye-in-hand | Image-based visual servoing | Tracking of a desired path in the image |
| [ ] | Eye-in-hand | Image-based visual servoing | Tracking of a mobile object placed on a turntable |
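Most of the image-based schemes in the table above share the classic control law v = −λ L⁺(s − s*), where L is the interaction matrix of the point features and s* the desired feature vector. A minimal NumPy sketch of that law, assuming normalized image coordinates and known depth estimates (function names are illustrative, not from the review):

```python
import numpy as np

def interaction_matrix(points, depths):
    """Stack the classic 2x6 interaction matrix of each normalized
    image point (x, y) observed at estimated depth Z."""
    rows = []
    for (x, y), Z in zip(points, depths):
        rows.append([-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y])
        rows.append([0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x])
    return np.array(rows)

def ibvs_velocity(s, s_star, depths, gain=0.5):
    """Camera velocity screw v = -gain * pinv(L) @ (s - s_star)."""
    L = interaction_matrix(np.reshape(s, (-1, 2)), depths)
    error = np.ravel(np.asarray(s, float) - np.asarray(s_star, float))
    return -gain * np.linalg.pinv(L) @ error

# Four coplanar points at 1 m depth; a small feature error produces a
# 6-vector of translational and rotational camera velocities.
s      = np.array([[0.1, 0.1], [-0.1, 0.1], [-0.1, -0.1], [0.1, -0.1]])
s_star = s + 0.02
v = ibvs_velocity(s, s_star, depths=[1.0] * 4)
```

When s equals s*, the commanded velocity is zero; the stereo variants in the table stack per-camera matrices in the same way.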
Figure 7. (a) Six-axis force/torque sensor placed at the robot wrist. (b) Built-in strain gauge in a Barrett hand finger.
Figure 8. Impedance control.
Summary of Spanish research on force control.

| Ref. | Sensor | Control technique | Application |
|---|---|---|---|
| [ ] | Wrist six-axis force/torque sensor | Neural networks | Fine-motion assembly tasks |
| [ ] | Wrist DSP-based six-axis force/torque sensor | Impedance control | Test of a contact force estimator |
| [ ] | Wrist six-axis force/torque sensor | Impedance control | Test of a self-calibrated contact force estimator |
| [ ] | Wrist six-axis force/torque sensor | Hybrid force/motion control | Bone drilling in a surgical repair task |
| [ ] | Built-in strain gauges | Proportional pure force control | Control of a climbing and walking robot |
| [ ] | Built-in strain gauge placed at the beginning of the link | Switched motion/force control | Control of free and constrained motion of a flexible robot |
| [ ] | Wrist force/torque sensor | Geometric analytical models | Fine-motion assembly tasks |
| [ ] | Built-in strain gauges | Admittance control with the force controller in joint space | Control of a legged robot |
| [ ] | Wrist six-axis force/torque sensor | Impedance control | Open software architecture to test robot interaction tasks |
| [ ] | Wrist six-axis force/torque sensor | Different direct force control schemes | Test architecture for analyzing the mechanical response of car seats |
| [ ] | Wrist six-axis force/torque sensor | Proportional pure force control | Screwing in an assembly task |
| [ ] | Wrist six-axis force/torque sensor | Impedance control | Humanoid robot for common household furniture tasks |
| [ ] | Wrist six-axis force/torque sensor | Hybrid force/motion control | Service robot for shaving and feeding tasks |
| [ ] | Wrist-mounted strain gauges | Neural networks | Fine-motion assembly tasks |
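The impedance controllers that dominate the table above impose a virtual mass-spring-damper relation between pose error and contact force (the scheme of Figure 8). A minimal one-degree-of-freedom sketch; the gains and the Euler integration are illustrative choices, not values from any of the cited works:

```python
def impedance_step(x, dx, x_d, f_ext, M=1.0, B=20.0, K=100.0, dt=0.001):
    """One semi-implicit Euler step of the 1-DOF impedance
    M*ddx + B*dx + K*(x - x_d) = f_ext, with desired position x_d."""
    ddx = (f_ext - B * dx - K * (x - x_d)) / M
    dx = dx + dt * ddx
    x = x + dt * dx
    return x, dx

# Under a constant 5 N contact force, the virtual spring deflects by
# f_ext / K = 0.05 m around the desired position.
x, dx = 0.0, 0.0
for _ in range(10000):          # 10 s of simulated time
    x, dx = impedance_step(x, dx, x_d=0.0, f_ext=5.0)
```

With these gains the system is critically damped (B = 2·sqrt(M·K)), so it settles to the offset position without oscillating.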
Classification of tactile sensors.

| Sensor type | Transduction technology | Measurement | Sensing approach |
|---|---|---|---|
| Pressure sensing arrays | Capacitive | Static (normal pressure) | Extrinsic |
| Skin deflection sensors | Conductive rubbers | Static (deformation) | Extrinsic |
| Dynamic tactile sensors | Piezoelectric transducers | Dynamic (vibration, stress) | Extrinsic |
| Fingertip force/torque sensors | Piezoresistive strain gauges | Static (force/torque) | Intrinsic |
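Intrinsic tactile sensing, the last row of the classification, recovers the contact location from a fingertip force/torque reading rather than from a skin array: for a frictional point contact with zero contact torque, the measured torque is τ = r × f. A sketch under those standard assumptions plus a flat fingertip surface (the function name is illustrative):

```python
import numpy as np

def contact_point_on_plane(f, tau, plane_z):
    """Recover the contact point r of a point contact from a fingertip
    force/torque reading (f, tau), assuming zero contact torque so that
    tau = r x f. The line of action is r = (f x tau)/|f|^2 + t*f; t is
    chosen so the point lies on the flat fingertip surface z = plane_z
    (requires f[2] != 0)."""
    f = np.asarray(f, float)
    tau = np.asarray(tau, float)
    r0 = np.cross(f, tau) / np.dot(f, f)
    t = (plane_z - r0[2]) / f[2]
    return r0 + t * f

# Synthetic check: build tau from a known contact point and recover it.
r_true = np.array([0.01, -0.02, 0.005])
f      = np.array([1.0, 2.0, -3.0])
tau    = np.cross(r_true, f)
r_hat  = contact_point_on_plane(f, tau, plane_z=0.005)
```

The contact normal then follows from the known surface geometry at the recovered point, which is how the pipe-crawling entry below uses it.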
Figure 9. (a) Pressure sensing arrays installed on the fingers and the palm of a Barrett hand; (b) distribution of pressure values registered by the tactile arrays.
Summary of Spanish research on tactile control.

| Ref. | Sensor | Control technique | Application |
|---|---|---|---|
| [ ] | Built-in force/torque sensor based on strain gauges | Intrinsic tactile sensing for normal-vector computation | Pipe-crawling robot |
| [ ] | Fingertip 16×16 conductive-rubber contact layer | Three-phase method: noise cancellation, image processing and classification by an LVQ network | Classification of the local shapes of objects gripped by a robotic hand |
| [ ] | Artificial skin on a parallel robotic gripper | Neural network organized as a topographic map of joint positions and contact forces | Grasping objects of different stiffness with a predefined force |
| [ ] | Three fingertip 8×5 pressure-sensing arrays | Force-pressure control law that regulates the applied force while maximizing the contact surface | Robotic assistant that picks up books in a library |
| [ ] | Four FSRs | Control algorithm that detects grasping events from sensor data and generates feedback for the user | Clinical prosthesis that provides the user with feedback |
| [ ] | 16×16 array of piezoresistive tactile sensors | FPGA-implemented algorithm that detects slipping from the number and duration of the digital pulses obtained from the tactile sensors | Slipping-detection alarm for manipulation tasks |
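The slip-detection entry above classifies the thresholded tactile signal by the number and duration of its digital pulses: many brief pulses indicate micro-vibrations of an object sliding, while one sustained pulse indicates a stable grasp. A software sketch of that idea (the thresholds and pulse limits are illustrative, not the FPGA implementation from the cited work):

```python
def detect_slip(samples, threshold=1.0, max_pulse_len=3, min_pulses=3):
    """Flag slipping when the thresholded tactile signal shows at least
    min_pulses short pulses (each at most max_pulse_len samples long),
    i.e. repeated brief vibrations rather than one sustained contact."""
    pulses, run = 0, 0
    for s in samples:
        if s > threshold:
            run += 1
        else:
            if 0 < run <= max_pulse_len:
                pulses += 1
            run = 0
    if 0 < run <= max_pulse_len:   # close a pulse still open at the end
        pulses += 1
    return pulses >= min_pulses

steady_grasp = [5.0] * 50                    # one long contact: no slip
slipping     = [0, 5, 0, 5, 0, 5, 0, 5, 0]  # brief repeated pulses: slip
```

Counting pulses instead of filtering the raw signal is what makes the original scheme cheap enough for an FPGA alarm.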
Figure 10. Example of a robotic task, grasping a bottle of water, which requires a multi-sensor control strategy.
Figure 11. Visual/force impedance control.
Figure 12. Shared control with an external force loop.
Summary of Spanish research on multi-sensor control.

| Ref. | Sensor | Control technique | Application |
|---|---|---|---|
| [ ] | Wrist six-axis force/torque sensor | Shared visual-force control based on the force-image interaction matrix | Changing a faulty bulb in a streetlamp |
| [ ] | Wrist six-axis force/torque sensor | Impedance visual/force control | Peg-in-hole task in motion |
| [ ] | Wrist six-axis force/torque sensor | Impedance visual/force control | Different interaction tasks tracking a desired path |
| [ ] | Wrist six-axis force/torque sensor | Impedance visual/force control | Different interaction tasks tracking a desired path in contact with an object |
| [ ] | Wrist six-axis force/torque sensor | Shared visual-force control | Disassembly task |
| [ ] | Wrist six-axis force/torque sensor | Shared visual-force control | Service robot opening a wardrobe door |
| [ ] | Wrist six-axis force/torque sensor | Shared visual-force control | Different interaction tasks tracking a desired path |
| [ ] | Wrist six-axis force/torque sensor | Visual/force hybrid control | Library assistant robot |
| [ ] | Stereo head with 5 d.o.f. | Neural networks with a VAM structure relating visual and tactile data to joint positions | Reaching and grasping tasks for unknown objects |
| [ ] | Wrist six-axis force/torque sensor | Position-vision-tactile hybrid control modified by impedance force control | Service robot that opens a sliding door |
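The shared and hybrid schemes in the table above typically split the Cartesian directions between the two sensors with a diagonal selection matrix S, so the combined command is v = (I − S)·v_vision + S·v_force. A minimal sketch of that standard formulation (not the exact scheme of any single entry; axis indices are illustrative):

```python
import numpy as np

def shared_control(v_vision, v_force, force_axes):
    """Combine the visual and force velocity commands with a diagonal
    selection matrix S: force controls the axes listed in force_axes,
    vision controls the remaining ones."""
    S = np.zeros((6, 6))
    for i in force_axes:
        S[i, i] = 1.0
    return (np.eye(6) - S) @ v_vision + S @ v_force

# Vision guides the motion in the image while force regulates the
# approach axis z (index 2) to keep a light contact.
v_vision = np.array([0.02, 0.01, 0.0, 0.0, 0.0, 0.1])
v_force  = np.array([0.0, 0.0, -0.005, 0.0, 0.0, 0.0])
v = shared_control(v_vision, v_force, force_axes=[2])
```

Impedance visual/force variants instead let the measured force perturb the visual reference, as in the scheme of Figure 11.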