
A brain-computer interface method combined with eye tracking for 3D interaction.

Eui Chul Lee, Jin Cheol Woo, Jong Hwa Kim, Mincheol Whang, Kang Ryoung Park.

Abstract

With the recent increase in the number of three-dimensional (3D) applications, the need for interfaces to these applications has also grown. Although eye tracking has been widely used as an interaction interface for hand-disabled persons, it cannot by itself support depth-directional navigation. To solve this problem, we propose a new brain-computer interface (BCI) method in which the BCI and eye tracking are combined: eye tracking provides the two-dimensional (2D) gaze direction, while the BCI handles depth navigation, including selection. The proposed method is novel in the following five ways compared to previous works. First, a device to measure both the gaze direction and an electroencephalogram (EEG) pattern is proposed, with the sensors needed to measure the EEG attached to a head-mounted eye tracking device. Second, the reliability of the BCI interface is verified by demonstrating that, in terms of the EEG power spectrum, there is no difference between real and imaginary movements for the same task. Third, depth control for the 3D interaction interface is implemented by an imaginary arm-reaching movement. Fourth, a selection method is implemented by an imaginary hand-grabbing movement. Finally, for the independent operation of gazing and the BCI, a mode selection method is proposed that measures the user's concentration by analyzing the pupil accommodation speed, a measure unaffected by either gazing or BCI operation. Experimental results confirmed the feasibility of the proposed 3D interaction method using eye tracking and a BCI. Copyright 2010 Elsevier B.V. All rights reserved.
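The abstract's second novelty claim rests on comparing EEG power spectra between real and imaginary movements. A minimal sketch of that kind of band-power comparison is given below, using synthetic data; the sampling rate, the mu band (8-13 Hz), and all function names are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch: comparing EEG band power between two epochs, in the spirit
# of the paper's real-vs-imaginary movement power-spectrum check.
# All parameters (sampling rate, band edges) are assumed, not from the paper.
import numpy as np

def band_power(signal, fs, band):
    """Mean FFT power of `signal` within the frequency `band` (lo_hz, hi_hz)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

fs = 256                      # assumed EEG sampling rate (Hz)
t = np.arange(0, 2.0, 1.0 / fs)
rng = np.random.default_rng(0)

# Two synthetic epochs with a 10 Hz (mu-band) component plus noise,
# standing in for a "real" and an "imaginary" movement trial.
real_trial = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
imag_trial = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

mu_real = band_power(real_trial, fs, (8, 13))
mu_imag = band_power(imag_trial, fs, (8, 13))
ratio = mu_real / mu_imag     # near 1.0 when the spectra agree in the mu band
```

In practice one would average such band powers over many trials per condition and test the difference statistically; the ratio here only illustrates the single-epoch comparison.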


Year: 2010    PMID: 20580646    DOI: 10.1016/j.jneumeth.2010.05.008

Source DB: PubMed    Journal: J Neurosci Methods    ISSN: 0165-0270    Impact factor: 2.390


  5 in total

1.  Demonstration of a semi-autonomous hybrid brain-machine interface using human intracranial EEG, eye tracking, and computer vision to control a robotic upper limb prosthetic.

Authors:  David P McMullen; Guy Hotson; Kapil D Katyal; Brock A Wester; Matthew S Fifer; Timothy G McGee; Andrew Harris; Matthew S Johannes; R Jacob Vogelstein; Alan D Ravitz; William S Anderson; Nitish V Thakor; Nathan E Crone
Journal:  IEEE Trans Neural Syst Rehabil Eng       Date:  2013-12-12       Impact factor: 3.802

2.  MUNDUS project: MUltimodal neuroprosthesis for daily upper limb support.

Authors:  Alessandra Pedrocchi; Simona Ferrante; Emilia Ambrosini; Marta Gandolla; Claudia Casellato; Thomas Schauer; Christian Klauer; Javier Pascual; Carmen Vidaurre; Margit Gföhler; Werner Reichenfelser; Jakob Karner; Silvestro Micera; Andrea Crema; Franco Molteni; Mauro Rossini; Giovanna Palumbo; Eleonora Guanziroli; Andreas Jedlitschka; Marco Hack; Maria Bulgheroni; Enrico d'Amico; Peter Schenk; Sven Zwicker; Alexander Duschau-Wicke; Justinas Miseikis; Lina Graber; Giancarlo Ferrigno
Journal:  J Neuroeng Rehabil       Date:  2013-07-03       Impact factor: 4.262

3.  A systematic review of hybrid brain-computer interfaces: Taxonomy and usability perspectives. (Review)

Authors:  Inchul Choi; Ilsun Rhiu; Yushin Lee; Myung Hwan Yun; Chang S Nam
Journal:  PLoS One       Date:  2017-04-28       Impact factor: 3.240

4.  Visual-spatial dimension integration in digital pathology education enhances anatomical pathology learning.

Authors:  Ken Lee Wan; Arkendu Sen; Lakshmi Selvaratnam; Mohd Imran Mohd Naing; Joon Joon Khoo; Pathmanathan Rajadurai
Journal:  BMC Med Educ       Date:  2022-07-30       Impact factor: 3.263

5.  A conceptual space for EEG-based brain-computer interfaces.

Authors:  Nataliya Kosmyna; Anatole Lécuyer
Journal:  PLoS One       Date:  2019-01-03       Impact factor: 3.240

