Linnéa Larsson1, Andrea Schwaller2, Marcus Nyström3, Martin Stridh2. 1. Department of Biomedical Engineering, Lund University, Lund, Box 118, 221 00 Lund, Sweden. Electronic address: linnea.larsson@bme.lth.se. 2. Department of Biomedical Engineering, Lund University, Lund, Box 118, 221 00 Lund, Sweden. 3. Lund University Humanities Laboratory, Lund, Helgonabacken 12, 223 62 Lund, Sweden.
Abstract
BACKGROUND: The complexity of analyzing eye-tracking signals increases as eye-trackers become more mobile. The signals from a mobile eye-tracker are recorded in the head coordinate system, so when the head and body move, the recorded eye-tracking signal is influenced by these movements, which renders subsequent event detection difficult. NEW METHOD: The purpose of the present paper is to develop a method that performs robust event detection in signals recorded using a mobile eye-tracker. The proposed method compensates for head movements recorded with an inertial measurement unit and employs a multi-modal event detection algorithm. The event detection algorithm combines the head-compensated eye-tracking signal with information about objects detected in the scene camera of the mobile eye-tracker. RESULTS: The method is evaluated with participants seated 2.6 m in front of a large screen and is therefore only valid for distant targets. The proposed head compensation method decreases the standard deviation during fixation intervals from 8° to 3.3° for eye-tracking signals recorded during large head movements. COMPARISON WITH EXISTING METHODS: The multi-modal event detection algorithm outperforms both an existing algorithm (I-VDT) and the built-in algorithm of the mobile eye-tracker, with an average balanced accuracy over all types of eye movements of 0.90, compared to 0.85 and 0.75, respectively. CONCLUSIONS: The proposed event detector, which combines head movement compensation and information about detected objects in the scene video, enables improved classification of events in mobile eye-tracking data.
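The head-compensation idea summarized above can be sketched in a minimal form. This is not the authors' implementation: it assumes gaze and head orientation are both expressed as yaw/pitch angles in degrees, and that for distant targets (the only setting the paper evaluates) the gaze direction in the world frame can be approximated by adding the head orientation to the gaze-in-head angles. During a fixation with head movement, the eye counter-rotates (vestibulo-ocular reflex), so the raw gaze-in-head signal varies while the compensated signal stays nearly constant:

```python
import numpy as np

def compensate_head_movement(gaze_deg, head_deg):
    """Approximate world-frame gaze for distant targets by adding the
    IMU-derived head orientation (yaw, pitch in degrees) to the
    gaze-in-head angles. A small-angle sketch, not the paper's method."""
    return gaze_deg + head_deg

# Hypothetical fixation interval: the head turns while the eye
# counter-rotates, so gaze-in-head drifts but world gaze is stable.
head = np.array([[0.0, 0.0], [2.0, 0.5], [4.0, 1.0], [6.0, 1.5]])       # head yaw/pitch
gaze_in_head = np.array([[10.0, 5.0], [8.1, 4.6], [6.0, 4.0], [3.9, 3.5]])
world = compensate_head_movement(gaze_in_head, head)

# Dispersion (standard deviation) drops markedly after compensation,
# mirroring the 8° -> 3.3° reduction reported in the abstract.
print("raw std:", np.std(gaze_in_head, axis=0))
print("compensated std:", np.std(world, axis=0))
```

In practice the compensation would use full 3-D rotations from the IMU rather than additive angles, but the additive form illustrates why fixation dispersion shrinks once head motion is removed.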
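The comparison above scores algorithms by balanced accuracy averaged over eye-movement types, i.e. the mean of the per-class recalls, so that each event type counts equally regardless of how many samples it has. A small self-contained sketch with hypothetical labels (F = fixation, S = saccade, P = smooth pursuit; the data are illustrative, not from the paper):

```python
def balanced_accuracy(y_true, y_pred, classes):
    """Mean per-class recall: each class contributes equally,
    independent of its sample count."""
    recalls = []
    for c in classes:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        n = sum(1 for t in y_true if t == c)
        recalls.append(tp / n)
    return sum(recalls) / len(recalls)

# Hypothetical per-sample labels for three eye-movement types.
truth = list("FFFFSSPP")
pred  = list("FFFSSSPF")
# Per-class recall: F = 3/4, S = 2/2, P = 1/2 -> mean = 0.75
print(balanced_accuracy(truth, pred, "FSP"))  # 0.75
```

This is the same quantity computed by `sklearn.metrics.balanced_accuracy_score` for multi-class labels.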