
3-D-Gaze-Based Robotic Grasping Through Mimicking Human Visuomotor Function for People With Motion Impairments.

Songpo Li, Xiaoli Zhang, Jeremy D Webb.   

Abstract

OBJECTIVE: The goal of this paper is to achieve a novel 3-D-gaze-based human-robot-interaction modality with which a user with motion impairment can intuitively express which tasks he or she wants the robot to perform by directly looking at the object of interest in the real world. Toward this goal, we investigate 1) the technology to accurately sense where a person is looking in real environments and 2) the method to interpret the human gaze and convert it into an effective interaction modality. Looking at a specific object reflects what a person is thinking about that object, and the gaze location contains essential information for object manipulation.
METHODS: A novel gaze vector method is developed to accurately estimate the 3-D coordinates of the object being looked at in real environments, and a novel interpretation framework that mimics human visuomotor functions is designed to increase the control capability of gaze in object grasping tasks.
RESULTS: High tracking accuracy was achieved using the gaze vector method. Participants successfully controlled a robotic arm for object grasping by directly looking at the target object.
CONCLUSION: Human 3-D gaze can be effectively employed as an intuitive interaction modality for robotic object manipulation.
SIGNIFICANCE: This is the first time that 3-D gaze has been used in a real environment to command a robot for a practical application. Three-dimensional gaze tracking is a promising, intuitive alternative for human-robot interaction, especially for disabled and elderly people who cannot handle conventional interaction modalities.
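The abstract does not detail the gaze vector method itself. As an illustration only of the general idea behind binocular 3-D gaze estimation (not the authors' actual algorithm), the sketch below triangulates a 3-D point of regard as the midpoint of the shortest segment between the left-eye and right-eye gaze rays; the function name and all parameters are hypothetical.

```python
import numpy as np

def gaze_point_3d(o_l, d_l, o_r, d_r):
    """Estimate a 3-D point of regard from two gaze rays.

    o_l, o_r: 3-D positions of the left/right eye (ray origins).
    d_l, d_r: gaze direction vectors (need not be unit length).

    Returns the midpoint of the shortest segment connecting the
    two rays, i.e. their approximate convergence point.
    """
    d_l = np.asarray(d_l, float) / np.linalg.norm(d_l)
    d_r = np.asarray(d_r, float) / np.linalg.norm(d_r)
    w = np.asarray(o_l, float) - np.asarray(o_r, float)

    b = d_l @ d_r                 # cosine of the angle between the rays
    denom = 1.0 - b * b
    if denom < 1e-9:              # rays (near) parallel: no convergence point
        raise ValueError("gaze rays are nearly parallel")

    # Parameters t, s minimizing ||(o_l + t*d_l) - (o_r + s*d_r)||
    t = (b * (d_r @ w) - (d_l @ w)) / denom
    s = ((d_r @ w) - b * (d_l @ w)) / denom

    p_l = o_l + t * d_l           # closest point on the left-eye ray
    p_r = o_r + s * d_r           # closest point on the right-eye ray
    return (p_l + p_r) / 2.0
```

In practice, noisy gaze directions mean the two rays rarely intersect exactly, which is why the midpoint of the shortest connecting segment is a common estimator for the fixated 3-D point.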


Year: 2017        PMID: 28278455        DOI: 10.1109/TBME.2017.2677902

Source DB: PubMed        Journal: IEEE Trans Biomed Eng        ISSN: 0018-9294        Impact factor: 4.538


Related articles: 8 in total

1.  High-Accuracy 3D Gaze Estimation with Efficient Recalibration for Head-Mounted Gaze Tracking Systems.

Authors:  Yang Xia; Jiejunyi Liang; Quanlin Li; Peiyang Xin; Ning Zhang
Journal:  Sensors (Basel)       Date:  2022-06-08       Impact factor: 3.847

2.  Semi-Autonomous Robotic Arm Reaching With Hybrid Gaze-Brain Machine Interface.

Authors:  Hong Zeng; Yitao Shen; Xuhui Hu; Aiguo Song; Baoguo Xu; Huijun Li; Yanxin Wang; Pengcheng Wen
Journal:  Front Neurorobot       Date:  2020-01-24       Impact factor: 2.650

3.  Toward Shared Autonomy Control Schemes for Human-Robot Systems: Action Primitive Recognition Using Eye Gaze Features.

Authors:  Xiaoyu Wang; Alireza Haji Fathaliyan; Veronica J Santos
Journal:  Front Neurorobot       Date:  2020-10-15       Impact factor: 2.650

4.  Exploiting Three-Dimensional Gaze Tracking for Action Recognition During Bimanual Manipulation to Enhance Human-Robot Collaboration.

Authors:  Alireza Haji Fathaliyan; Xiaoyu Wang; Veronica J Santos
Journal:  Front Robot AI       Date:  2018-04-04

5.  Gaze-Based Intention Estimation for Shared Autonomy in Pick-and-Place Tasks.

Authors:  Stefan Fuchs; Anna Belardinelli
Journal:  Front Neurorobot       Date:  2021-04-16       Impact factor: 2.650

6.  Face-Computer Interface (FCI): Intent Recognition Based on Facial Electromyography (fEMG) and Online Human-Computer Interface With Audiovisual Feedback.

Authors:  Bo Zhu; Daohui Zhang; Yaqi Chu; Xingang Zhao; Lixin Zhang; Lina Zhao
Journal:  Front Neurorobot       Date:  2021-07-16       Impact factor: 2.650

7.  A Novel Method for Estimating Free Space 3D Point-of-Regard Using Pupillary Reflex and Line-of-Sight Convergence Points.

Authors:  Zijing Wan; Xiangjun Wang; Kai Zhou; Xiaoyun Chen; Xiaoqing Wang
Journal:  Sensors (Basel)       Date:  2018-07-15       Impact factor: 3.576

8.  Towards Robust Robot Control in Cartesian Space Using an Infrastructureless Head- and Eye-Gaze Interface.

Authors:  Lukas Wöhle; Marion Gebhard
Journal:  Sensors (Basel)       Date:  2021-03-05       Impact factor: 3.576

