
A novel EOG/EEG hybrid human-machine interface adopting eye movements and ERPs: application to robot control.

Jiaxin Ma, Yu Zhang, Andrzej Cichocki, Fumitoshi Matsuno.   

Abstract

This study presents a novel human-machine interface (HMI) based on both electrooculography (EOG) and electroencephalography (EEG). The hybrid interface works in two modes: an EOG mode that recognizes eye movements such as blinks, and an EEG mode that detects event-related potentials (ERPs) such as the P300. Although eye movements and ERPs have each been used separately to implement assistive interfaces that help patients with motor disabilities perform daily tasks, the proposed hybrid interface integrates the two so that they complement each other, providing better efficiency and a wider scope of application. In this study, we design a threshold algorithm that recognizes four kinds of eye movements: blink, wink, gaze, and frown. In addition, an oddball paradigm with inverted-face stimuli is used to evoke multiple ERP components, including the P300, N170, and VPP. To verify the effectiveness of the proposed system, two online experiments are carried out: one controlling a multifunctional humanoid robot, and the other controlling four mobile robots. In both experiments, the subjects complete the tasks effectively using the proposed interface, and the best completion time is relatively short, close to that of manual operation.
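The abstract does not specify the details of the threshold algorithm for eye-movement recognition. A minimal sketch of generic amplitude-threshold blink detection on a vertical-EOG trace, with assumed sampling rate and threshold values (not taken from the paper), might look like:

```python
import numpy as np

def detect_blinks(eog, fs=250.0, threshold=200.0, min_gap=0.2):
    """Detect blink events in a vertical-EOG trace by amplitude thresholding.

    eog       : 1-D array of EOG samples (microvolts)
    fs        : sampling rate in Hz (assumed value)
    threshold : amplitude threshold in microvolts (assumed value)
    min_gap   : minimum spacing between blinks in seconds (debounce)
    Returns a list of sample indices where a blink onset was detected.
    """
    above = eog > threshold
    # Rising edges: samples where the signal first crosses the threshold.
    onsets = np.flatnonzero(above[1:] & ~above[:-1]) + 1
    # Debounce: drop re-crossings closer than min_gap to the previous blink.
    blinks = []
    last = -np.inf
    for i in onsets:
        if (i - last) / fs >= min_gap:
            blinks.append(int(i))
            last = i
    return blinks

# Synthetic trace: baseline noise plus two blink-like Gaussian pulses.
fs = 250.0
t = np.arange(int(2 * fs))
sig = np.random.default_rng(0).normal(0, 10, t.size)
for c in (125, 375):  # pulse centers at 0.5 s and 1.5 s
    sig += 300 * np.exp(-0.5 * ((t - c) / 10.0) ** 2)
print(detect_blinks(sig, fs))  # onset indices shortly before each pulse center
```

Distinguishing winks from blinks would additionally require comparing the horizontal (left/right) EOG channels, and gaze and frown detection would need further channel-specific thresholds; this sketch covers only the single-channel blink case.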


Year:  2014        PMID: 25398172     DOI: 10.1109/TBME.2014.2369483

Source DB:  PubMed          Journal:  IEEE Trans Biomed Eng        ISSN: 0018-9294            Impact factor:   4.538


Cited by:  22 in total

Review 1.  EOG-Based Human-Computer Interface: 2000-2020 Review.

Authors:  Chama Belkhiria; Atlal Boudir; Christophe Hurter; Vsevolod Peysakhovich
Journal:  Sensors (Basel)       Date:  2022-06-29       Impact factor: 3.847

Review 2.  EEG-EOG based Virtual Keyboard: Toward Hybrid Brain Computer Interface.

Authors:  Sarah M Hosni; Howida A Shedeed; Mai S Mabrouk; Mohamed F Tolba
Journal:  Neuroinformatics       Date:  2019-07

3.  Closed-Loop Hybrid Gaze Brain-Machine Interface Based Robotic Arm Control with Augmented Reality Feedback.

Authors:  Hong Zeng; Yanxin Wang; Changcheng Wu; Aiguo Song; Jia Liu; Peng Ji; Baoguo Xu; Lifeng Zhu; Huijun Li; Pengcheng Wen
Journal:  Front Neurorobot       Date:  2017-10-31       Impact factor: 2.650

4.  The Role of Visual Noise in Influencing Mental Load and Fatigue in a Steady-State Motion Visual Evoked Potential-Based Brain-Computer Interface.

Authors:  Jun Xie; Guanghua Xu; Ailing Luo; Min Li; Sicong Zhang; Chengcheng Han; Wenqiang Yan
Journal:  Sensors (Basel)       Date:  2017-08-14       Impact factor: 3.576

Review 5.  Hybrid Brain-Computer Interface Techniques for Improved Classification Accuracy and Increased Number of Commands: A Review.

Authors:  Keum-Shik Hong; Muhammad Jawad Khan
Journal:  Front Neurorobot       Date:  2017-07-24       Impact factor: 2.650

Review 6.  A systematic review of hybrid brain-computer interfaces: Taxonomy and usability perspectives.

Authors:  Inchul Choi; Ilsun Rhiu; Yushin Lee; Myung Hwan Yun; Chang S Nam
Journal:  PLoS One       Date:  2017-04-28       Impact factor: 3.240

7.  Hybrid EEG-fNIRS-Based Eight-Command Decoding for BCI: Application to Quadcopter Control.

Authors:  Muhammad Jawad Khan; Keum-Shik Hong
Journal:  Front Neurorobot       Date:  2017-02-17       Impact factor: 2.650

Review 8.  The Human Factors and Ergonomics of P300-Based Brain-Computer Interfaces.

Authors:  J Clark Powers; Kateryna Bieliaieva; Shuohao Wu; Chang S Nam
Journal:  Brain Sci       Date:  2015-08-10

Review 9.  Brain-Computer Interface Spellers: A Review.

Authors:  Aya Rezeika; Mihaly Benda; Piotr Stawicki; Felix Gembler; Abdul Saboor; Ivan Volosyak
Journal:  Brain Sci       Date:  2018-03-30

10.  Development of an electrooculogram-based human-computer interface using involuntary eye movement by spatially rotating sound for communication of locked-in patients.

Authors:  Do Yeon Kim; Chang-Hee Han; Chang-Hwan Im
Journal:  Sci Rep       Date:  2018-06-22       Impact factor: 4.379

