
Real-time estimation of eye gaze by in-ear electrodes.

A Favre-Felix, C Graversen, T Dau, T Lunner.   

Abstract

Cognitive control of a hearing aid is the topic of several ongoing studies, motivated by the inadequate steering of current hearing aids. While most studies are concerned with tracking auditory attention from the electroencephalogram (EEG), a complementary approach may be to use visual attention tracking to steer the devices. Visual attention may be characterized by gaze direction, which can be obtained by electrooculography (EOG). EOG may be recorded from electrodes placed in the ear canal, termed EarEOG. To compare conventional EOG and EarEOG recordings, we conducted two experiments with six subjects. In the first experiment, the subjects were instructed to follow a dot on a screen moving in large saccades. In the second experiment, there were five large targets, and within each target the dot made minor movements. When comparing conventional EOG and EarEOG, correlations of 0.90 and 0.91, each with a standard deviation of 0.02, were obtained for the two experiments, respectively. To assess the feasibility of using EarEOG in real time, the EarEOG signal was correlated with the time course of the dot position. When both signals were filtered with the same real-time applicable filter, correlations of 0.83 and 0.85, with standard deviations of 0.09 and 0.05, were found for the two experiments, respectively. In conclusion, this study provides motivation for using EarEOG to estimate eye gaze and identifies important future challenges for real-time applications that steer external devices such as a hearing aid.
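The core of the real-time feasibility analysis described above is to pass both signals through the same causal (real-time applicable) filter and then compute their correlation. A minimal sketch of that pipeline, assuming a simple one-pole baseline-removal filter and Pearson correlation (the paper does not specify its filter design, so the filter here is illustrative only):

```python
import numpy as np

def causal_highpass(x, alpha=0.99):
    """Remove slow drift with a one-pole low-pass baseline estimate.

    The filter uses only past samples (no look-ahead), so it is
    real-time applicable, unlike zero-phase offline filtering.
    """
    x = np.asarray(x, dtype=float)
    baseline = np.empty_like(x)
    acc = x[0]
    for i, v in enumerate(x):
        acc = alpha * acc + (1.0 - alpha) * v  # exponential moving average
        baseline[i] = acc
    return x - baseline

def gaze_correlation(ear_eog, dot_position, alpha=0.99):
    """Pearson correlation between identically filtered signals,
    analogous to comparing EarEOG with the dot-position time course."""
    e = causal_highpass(ear_eog, alpha)
    d = causal_highpass(dot_position, alpha)
    return np.corrcoef(e, d)[0, 1]
```

For example, a step-wise dot trajectory (large saccades) and a noisy, scaled copy of it standing in for the EarEOG trace should yield a correlation close to 1 with this pipeline.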


Year:  2017        PMID: 29060795     DOI: 10.1109/EMBC.2017.8037754

Source DB:  PubMed          Journal:  Conf Proc IEEE Eng Med Biol Soc        ISSN: 1557-170X


Related articles:  5 in total

1.  Comparing In-ear EOG for Eye-Movement Estimation With Eye-Tracking: Accuracy, Calibration, and Speech Comprehension.

Authors:  Martin A Skoglund; Martin Andersen; Martha M Shiell; Gitte Keidser; Mike Lind Rank; Sergi Rotger-Griful
Journal:  Front Neurosci       Date:  2022-06-30       Impact factor: 5.152

2.  Towards mobile gaze-directed beamforming: a novel neuro-technology for hearing loss.

Authors:  Markham H Anderson; Britt W Yazel; Matthew P F Stickle; Fernando D Espinosa Inguez; Nathaniel-Georg S Gutierrez; Malcolm Slaney; Sanjay S Joshi; Lee M Miller
Journal:  Annu Int Conf IEEE Eng Med Biol Soc       Date:  2018-07

3.  Evaluation of the Influence of Head Movement on Hearing Aid Algorithm Performance Using Acoustic Simulations.

Authors:  Maartje M E Hendrikse; Giso Grimm; Volker Hohmann
Journal:  Trends Hear       Date:  2020 Jan-Dec       Impact factor: 3.293

4.  Movement and Gaze Behavior in Virtual Audiovisual Listening Environments Resembling Everyday Life.

Authors:  Maartje M E Hendrikse; Gerard Llorach; Volker Hohmann; Giso Grimm
Journal:  Trends Hear       Date:  2019 Jan-Dec       Impact factor: 3.293

5.  The Dynamics of Attention Shifts Among Concurrent Speech in a Naturalistic Multi-speaker Virtual Environment.

Authors:  Keren Shavit-Cohen; Elana Zion Golumbic
Journal:  Front Hum Neurosci       Date:  2019-11-08       Impact factor: 3.169

