
Towards Robust Robot Control in Cartesian Space Using an Infrastructureless Head- and Eye-Gaze Interface.

Lukas Wöhle, Marion Gebhard.

Abstract

This paper presents a lightweight, infrastructureless head-worn interface for robust and real-time robot control in Cartesian space using head- and eye-gaze. The interface comes at a total weight of just 162 g. It combines a state-of-the-art visual simultaneous localization and mapping algorithm (ORB-SLAM2) for RGB-D cameras with a Magnetic, Angular Rate, and Gravity (MARG) sensor filter. The data fusion process is designed to dynamically switch between magnetic, inertial and visual heading sources to enable robust orientation estimation under various disturbances, e.g., magnetic disturbances or degraded visual sensor data. The interface furthermore delivers accurate eye- and head-gaze vectors to enable precise robot end effector (EFF) positioning and employs a head motion mapping technique to effectively control the robot's end effector orientation. An experimental proof of concept demonstrates that the proposed interface and its data fusion process generate reliable and robust pose estimation. The three-dimensional head- and eye-gaze position estimation pipeline delivers a mean Euclidean error of 19.0±15.7 mm for head-gaze and 27.4±21.8 mm for eye-gaze at a distance of 0.3-1.1 m to the user. This indicates that the proposed interface offers a precise control mechanism for hands-free and full six degrees of freedom (DoF) robot teleoperation in Cartesian space by head- or eye-gaze and head motion.
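The dynamic switching between magnetic, inertial and visual heading sources described in the abstract could be sketched as follows. This is a minimal illustrative assumption, not the paper's actual filter: the function names, thresholds and selection order are hypothetical, and the real fusion process operates on full orientation estimates rather than a single yaw value.

```python
# Illustrative sketch of a heading-source selector: prefer the visual SLAM
# heading while tracking is valid, fall back to the magnetometer heading
# when the measured field magnitude looks undisturbed, and otherwise use
# the gyro-integrated (inertial) heading. All names and thresholds are
# hypothetical assumptions for illustration only.

NOMINAL_FIELD_UT = 50.0    # assumed local geomagnetic field magnitude (uT)
MAG_TOLERANCE_UT = 5.0     # accept the magnetometer only near that magnitude

def select_heading(visual_ok, mag_norm_ut, yaw_visual, yaw_mag, yaw_gyro):
    """Return (heading, source) chosen from the three available estimates."""
    if visual_ok:
        # Visual SLAM tracking is valid: use its drift-free heading.
        return yaw_visual, "visual"
    if abs(mag_norm_ut - NOMINAL_FIELD_UT) <= MAG_TOLERANCE_UT:
        # Field magnitude is plausible: magnetometer heading is trusted.
        return yaw_mag, "magnetic"
    # Both absolute references are unavailable: integrate the gyroscope.
    return yaw_gyro, "inertial"
```

A disturbance detector of this kind is what lets the interface keep a usable heading when, e.g., a nearby motor distorts the magnetic field or the camera loses tracking; the inertial fallback drifts over time, so the filter would switch back to an absolute source as soon as one is valid again.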


Keywords:  MARG-sensors; data fusion; gaze control; hands-free interface; human robot collaboration; multisensory interface; pose estimation; robot control in Cartesian space


Year:  2021        PMID: 33807599      PMCID: PMC7962065          DOI: 10.3390/s21051798

Source DB:  PubMed          Journal:  Sensors (Basel)        ISSN: 1424-8220            Impact factor:   3.576


References: 7 in total

1.  Estimation of IMU and MARG orientation using a gradient descent algorithm.

Authors:  Sebastian O H Madgwick; Andrew J L Harrison; Andrew Vaidyanathan
Journal:  IEEE Int Conf Rehabil Robot       Date:  2011

2.  Effect of local magnetic field disturbances on inertial measurement units accuracy.

Authors:  Xavier Robert-Lachaine; Hakim Mecheri; Christian Larue; André Plamondon
Journal:  Appl Ergon       Date:  2017-04-26       Impact factor: 3.661

3.  3-D-Gaze-Based Robotic Grasping Through Mimicking Human Visuomotor Function for People With Motion Impairments.

Authors:  Songpo Li; Xiaoli Zhang; Jeremy D Webb
Journal:  IEEE Trans Biomed Eng       Date:  2017-03-03       Impact factor: 4.538

4.  Towards free 3D end-point control for robotic-assisted human reaching using binocular eye tracking.

Authors:  Roni O Maimon-Dror; Jorge Fernandez-Quesada; Giuseppe A Zito; Charalambos Konnaris; Sabine Dziemian; A Aldo Faisal
Journal:  IEEE Int Conf Rehabil Robot       Date:  2017-07

5.  Head Motion and Head Gesture-Based Robot Control: A Usability Study.

Authors:  Anja Jackowski; Marion Gebhard; Roland Thietje
Journal:  IEEE Trans Neural Syst Rehabil Eng       Date:  2018-01       Impact factor: 3.802

6.  AMiCUS-A Head Motion-Based Interface for Control of an Assistive Robot.

Authors:  Nina Rudigkeit; Marion Gebhard
Journal:  Sensors (Basel)       Date:  2019-06-25       Impact factor: 3.576

7.  SteadEye-Head-Improving MARG-Sensor Based Head Orientation Measurements Through Eye Tracking Data.

Authors:  Lukas Wöhle; Marion Gebhard
Journal:  Sensors (Basel)       Date:  2020-05-12       Impact factor: 3.576

Cited by: 1 in total

1.  Restoration of complex movement in the paralyzed upper limb.

Authors:  Brady A Hasse; Drew E G Sheets; Nicole L Holly; Katalin M Gothard; Andrew J Fuglevand
Journal:  J Neural Eng       Date:  2022-07-01       Impact factor: 5.043

