
Adaptive eye-gaze tracking using neural-network-based user profiles to assist people with motor disability.

Anaelis Sesin, Malek Adjouadi, Mercedes Cabrerizo, Melvin Ayala, Armando Barreto

Abstract

This study developed an adaptive real-time human-computer interface (HCI) that serves as an assistive technology tool for people with severe motor disability. The proposed HCI design uses eye gaze as the primary computer input device. Controlling the mouse cursor with raw eye coordinates results in sporadic motion of the pointer because of the saccadic nature of the eye. Even though these saccadic movements are subtle and largely imperceptible under normal circumstances, they considerably affect the accuracy of an eye-gaze-based HCI. The proposed HCI system is novel in that it adapts to each user's distinct, and potentially changing, jitter characteristics through the configuration and training of an artificial neural network (ANN) structured to minimize mouse jitter. The ANN is fed the user's eye-gaze behavior, recorded during a short training session, and learns the relationship between the gaze coordinates and the mouse cursor position based on the multilayer perceptron model. An embedded graphical interface is used during the training session to generate the user profiles that make up these unique ANN configurations. The results with 12 subjects in test 1, which involved following a moving target, showed an average jitter reduction of 35%; the results with 9 subjects in test 2, which involved following the contour of a square object, showed an average jitter reduction of 53%. In both tests, the outcomes were trajectories that were significantly smoother and adept at reaching fixed or moving targets with relative ease, within a 5% error margin or deviation from the desired trajectories. The positive effects of this jitter reduction are presented graphically for visual appreciation.
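The core idea in the abstract — train a small multilayer perceptron on a user's recorded gaze so that noisy gaze coordinates map to a steadier cursor position — can be sketched roughly as follows. This is a minimal illustration on synthetic data, not the authors' system: the window size, network shape, noise model, and training scheme are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "gaze" data: a smooth trajectory plus jitter standing in for saccadic noise
t = np.linspace(0, 2 * np.pi, 600)
true_path = np.stack([np.cos(t), np.sin(t)], axis=1)                  # smooth target path
raw_gaze = true_path + rng.normal(scale=0.08, size=true_path.shape)   # jittery samples

# Training pairs: a window of W recent raw samples -> the current true position
W = 5
X = np.stack([raw_gaze[i - W:i].ravel() for i in range(W, len(raw_gaze))])
Y = true_path[W:]

# Tiny multilayer perceptron (one hidden layer), trained with plain gradient descent
H = 16
W1 = rng.normal(scale=0.1, size=(X.shape[1], H)); b1 = np.zeros(H)
W2 = rng.normal(scale=0.1, size=(H, 2));          b2 = np.zeros(2)
lr = 0.05
for _ in range(2000):
    h = np.tanh(X @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - Y
    # Backpropagate the mean-squared error through both layers
    gW2 = h.T @ err / len(X);  gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)
    gW1 = X.T @ dh / len(X);   gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

smoothed = np.tanh(X @ W1 + b1) @ W2 + b2

def jitter(path):
    """Mean frame-to-frame displacement: a simple proxy for cursor jitter."""
    return np.linalg.norm(np.diff(path, axis=0), axis=1).mean()

reduction = 1 - jitter(smoothed) / jitter(raw_gaze[W:])
print(f"jitter reduction: {reduction:.0%}")
```

Feeding a short window of past samples lets the network act as a learned, user-specific smoothing filter, which is the role the paper's per-user ANN profiles play; the actual system would train on each user's recorded calibration session rather than synthetic noise.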


Year:  2008        PMID: 19009467     DOI: 10.1682/jrrd.2007.05.0075

Source DB:  PubMed          Journal:  J Rehabil Res Dev        ISSN: 0748-7711


Related articles: 5 in total

1.  Nonwearable gaze tracking system for controlling home appliance.

Authors:  Hwan Heo; Jong Man Lee; Dongwook Jung; Ji Woo Lee; Kang Ryoung Park
Journal:  ScientificWorldJournal       Date:  2014-09-14

2. [Review] When I Look into Your Eyes: A Survey on Computer Vision Contributions for Human Gaze Estimation and Tracking.

Authors:  Dario Cazzato; Marco Leo; Cosimo Distante; Holger Voos
Journal:  Sensors (Basel)       Date:  2020-07-03       Impact factor: 3.576

3.  Enhancing User Experience of Eye-Controlled Systems: Design Recommendations on the Optimal Size, Distance and Shape of Interactive Components from the Perspective of Peripheral Vision.

Authors:  Yafeng Niu; Jingze Tian; Zijian Han; Mengyuan Qu; Mu Tong; Wenjun Yang; Chengqi Xue
Journal:  Int J Environ Res Public Health       Date:  2022-08-29       Impact factor: 4.614

4.  A free geometry model-independent neural eye-gaze tracking system.

Authors:  Massimo Gneo; Maurizio Schmid; Silvia Conforto; Tommaso D'Alessio
Journal:  J Neuroeng Rehabil       Date:  2012-11-16       Impact factor: 4.262

5.  A Novel Method for Estimating Free Space 3D Point-of-Regard Using Pupillary Reflex and Line-of-Sight Convergence Points.

Authors:  Zijing Wan; Xiangjun Wang; Kai Zhou; Xiaoyun Chen; Xiaoqing Wang
Journal:  Sensors (Basel)       Date:  2018-07-15       Impact factor: 3.576

