
Head movement compensation and multi-modal event detection in eye-tracking data for unconstrained head movements.

Linnéa Larsson, Andrea Schwaller, Marcus Nyström, Martin Stridh.

Abstract

BACKGROUND: The complexity of analyzing eye-tracking signals increases as eye-trackers become more mobile. The signals from a mobile eye-tracker are recorded relative to the head coordinate system; when the head and body move, the recorded eye-tracking signal is influenced by these movements, which renders subsequent event detection difficult.
NEW METHOD: The purpose of the present paper is to develop a method that performs robust event detection in signals recorded using a mobile eye-tracker. The proposed method compensates for head movements recorded using an inertial measurement unit (IMU) and employs a multi-modal event detection algorithm. The event detection algorithm combines the head-compensated eye-tracking signal with information about objects detected in the scene camera of the mobile eye-tracker.
RESULTS: The method is evaluated with participants seated 2.6 m in front of a large screen, and is therefore only validated for distant targets. The proposed head-movement compensation decreases the standard deviation during fixation intervals from 8° to 3.3° for eye-tracking signals recorded during large head movements.
COMPARISON WITH EXISTING METHODS: The multi-modal event detection algorithm outperforms both an existing algorithm (I-VDT) and the built-in algorithm of the mobile eye-tracker, with an average balanced accuracy, calculated over all types of eye movements, of 0.90, compared to 0.85 and 0.75, respectively.
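For reference, the balanced accuracy used above is the mean of per-class recall, which keeps the frequent fixation samples from dominating the score when saccades and pursuit are rarer. A minimal sketch (the event labels in the example are hypothetical, not taken from the paper's data):

```python
def balanced_accuracy(y_true, y_pred):
    """Mean of per-class recall: each class contributes equally,
    regardless of how many samples it has."""
    classes = sorted(set(y_true))
    recalls = []
    for c in classes:
        idx = [i for i, t in enumerate(y_true) if t == c]
        hits = sum(1 for i in idx if y_pred[i] == c)
        recalls.append(hits / len(idx))
    return sum(recalls) / len(recalls)

# Three fixation samples (two correct) and one saccade sample (correct):
# recall = 2/3 for fixations, 1/1 for saccades, balanced accuracy = 5/6.
score = balanced_accuracy(
    ["fix", "fix", "fix", "sac"],
    ["fix", "fix", "sac", "sac"],
)
```

Plain accuracy on the same example would be 3/4; the balanced version weights the minority class (saccades) equally, which is why it is the fairer summary across eye-movement types.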
CONCLUSIONS: The proposed event detector, which combines head movement compensation with information about detected objects in the scene video, enables improved classification of events in mobile eye-tracking data.
Copyright © 2016 Elsevier B.V. All rights reserved.
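The core head-compensation idea described in the abstract can be sketched roughly as follows: for distant targets, head orientation angles integrated from the IMU gyroscope can be added to the head-referenced gaze angles to approximate gaze direction in world coordinates. This is a simplified one-axis sketch under that distant-target assumption, not the paper's actual implementation; the function names and the drift-free integration are assumptions:

```python
def integrate_gyro(omega_deg_s, fs):
    """Integrate IMU angular velocity (deg/s) into a head orientation
    angle (deg) by cumulative summation; real IMU data would need
    drift correction, omitted here for clarity."""
    angle, angles = 0.0, []
    for w in omega_deg_s:
        angle += w / fs
        angles.append(angle)
    return angles

def compensate_head_movement(gaze_head_deg, gyro_deg_s, fs):
    """Approximate gaze-in-world by adding the integrated head angle to
    the head-referenced gaze angle (valid only for distant targets,
    where the angles add approximately)."""
    head = integrate_gyro(gyro_deg_s, fs)
    return [g + h for g, h in zip(gaze_head_deg, head)]

# During a vestibulo-ocular-reflex-like episode the eye counter-rotates
# against the head, so the compensated signal should be nearly constant:
gaze = [0.0, -1.0, -2.0]   # gaze in head coordinates (deg)
gyro = [0.0, 60.0, 60.0]   # head yaw velocity (deg/s), sampled at 60 Hz
print(compensate_head_movement(gaze, gyro, 60.0))  # → [0.0, 0.0, 0.0]
```

A constant compensated signal during such an episode is exactly what lets a downstream event detector label it as a fixation rather than a spurious pursuit or saccade, which is the variance reduction the RESULTS section quantifies.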

Keywords:  Head-tracking; IMU; Mobile eye-tracking; Signal processing; Smooth pursuit

Year:  2016        PMID: 27693470     DOI: 10.1016/j.jneumeth.2016.09.005

Source DB:  PubMed          Journal:  J Neurosci Methods        ISSN: 0165-0270            Impact factor:   2.390


Related articles: 6 in total

1.  Evaluating Eye Movement Event Detection: A Review of the State of the Art.

Authors:  Mikhail Startsev; Raimondas Zemblys
Journal:  Behav Res Methods       Date:  2022-06-17

2.  Comparison of visual SLAM and IMU in tracking head movement outdoors.

Authors:  Ayush Kumar; Shrinivas Pundlik; Eli Peli; Gang Luo
Journal:  Behav Res Methods       Date:  2022-08-11

3.  Gaze-in-wild: A dataset for studying eye and head coordination in everyday activities.

Authors:  Christopher Kanan; Reynold Bailey; Jeff B Pelz; Gabriel J Diaz; Rakshit Kothari; Zhizhuo Yang
Journal:  Sci Rep       Date:  2020-02-13       Impact factor: 4.379

4.  GlassesViewer: Open-source software for viewing and analyzing data from the Tobii Pro Glasses 2 eye tracker.

Authors:  Diederick C Niehorster; Roy S Hessels; Jeroen S Benjamins
Journal:  Behav Res Methods       Date:  2020-06

5.  Is the eye-movement field confused about fixations and saccades? A survey among 124 researchers.

Authors:  Roy S Hessels; Diederick C Niehorster; Marcus Nyström; Richard Andersson; Ignace T C Hooge
Journal:  R Soc Open Sci       Date:  2018-08-29       Impact factor: 2.963

6.  What are the visuo-motor tendencies of omnidirectional scene free-viewing in virtual reality?

Authors:  Erwan Joël David; Pierre Lebranchu; Matthieu Perreira Da Silva; Patrick Le Callet
Journal:  J Vis       Date:  2022-03-02       Impact factor: 2.240
