
An adaptive algorithm for fixation, saccade, and glissade detection in eyetracking data.

Marcus Nyström, Kenneth Holmqvist

Abstract

Event detection is used to classify recorded gaze points into periods of fixation, saccade, smooth pursuit, blink, and noise. Although there is an overall consensus that current algorithms for event detection have serious flaws and that a de facto standard for event detection does not exist, surprisingly little work has been done to remedy this problem. We suggest a new velocity-based algorithm that takes several of the previously known limitations into account. Most importantly, the new algorithm identifies so-called glissades, a wobbling movement at the end of many saccades, as a separate class of eye movements. Part of the solution involves designing an adaptive velocity threshold that makes event detection less sensitive to variations in noise level and leaves the algorithm settings-free for the user. We demonstrate the performance of the new algorithm on eye movements recorded during reading and scene perception and compare it with two of the most commonly used algorithms today. Results show that, unlike the currently used algorithms, the new algorithm robustly identifies fixations, saccades, and glissades. Using this algorithm, we found that glissades occur in about half of all saccades, during both reading and scene perception, and that they have an average duration close to 24 msec. Given the high prevalence and long duration of glissades, we argue that researchers must actively choose whether to assign glissades to saccades or to fixations; the choice significantly affects dependent variables such as fixation and saccade duration. Current algorithms do not offer this choice, and their assignment of glissades is largely arbitrary.
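The central technical idea in the abstract is the adaptive, data-driven velocity threshold. Below is a minimal Python sketch of that iteration, assuming gaze velocities have already been computed in deg/s and blink samples removed; the 100 deg/s seed, the mean-plus-6-SD update, and the 1 deg/s convergence criterion follow the procedure described in the full paper, while the function name and defaults here are illustrative, not the authors' reference implementation.

    import numpy as np

    def adaptive_peak_threshold(velocity, init_threshold=100.0, tol=1.0, max_iter=100):
        """Iteratively estimate a peak-velocity threshold (deg/s) from the data.

        Samples below the current threshold are treated as fixation/noise;
        the threshold is re-estimated as mean + 6*SD of those samples, and
        the procedure repeats until it changes by less than `tol` deg/s.
        """
        pt = init_threshold
        for _ in range(max_iter):
            below = velocity[velocity < pt]   # putative fixation/noise samples
            if below.size == 0:               # threshold fell below all data; give up
                return pt
            new_pt = below.mean() + 6.0 * below.std()
            if abs(new_pt - pt) < tol:        # converged: threshold tracks the noise level
                return new_pt
            pt = new_pt
        return pt

    # Toy usage: low-velocity fixation noise plus two saccadic peaks (deg/s)
    rng = np.random.default_rng(0)
    v = np.concatenate([np.abs(rng.normal(10.0, 5.0, 1000)), [300.0, 450.0]])
    print(adaptive_peak_threshold(v))

Because the threshold is re-estimated from the recording itself, noisier data yield a higher threshold, which is what makes the detection settings-free for the user.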


Year: 2010    PMID: 20160299    DOI: 10.3758/BRM.42.1.188

Source DB: PubMed    Journal: Behav Res Methods    ISSN: 1554-351X


Related articles (106 in total)

1.  Manual tracking enhances smooth pursuit eye movements.

Authors:  Diederick C Niehorster; Wilfred W F Siu; Li Li
Journal:  J Vis       Date:  2015       Impact factor: 2.240

2.  REMoDNaV: robust eye-movement classification for dynamic stimulation.

Authors:  Asim H Dar; Adina S Wagner; Michael Hanke
Journal:  Behav Res Methods       Date:  2021-02

3.  Defining eye-fixation sequences across individuals and tasks: the Binocular-Individual Threshold (BIT) algorithm.

Authors:  Ralf van der Lans; Michel Wedel; Rik Pieters
Journal:  Behav Res Methods       Date:  2011-03

4.  Real-time recording and classification of eye movements in an immersive virtual environment.

Authors:  Gabriel Diaz; Joseph Cooper; Dmitry Kit; Mary Hayhoe
Journal:  J Vis       Date:  2013-10-10       Impact factor: 2.240

5.  Attending to What and Where: Background Connectivity Integrates Categorical and Spatial Attention.

Authors:  Alexa Tompary; Naseem Al-Aidroos; Nicholas B Turk-Browne
Journal:  J Cogn Neurosci       Date:  2018-05-23       Impact factor: 3.225

6.  Learning What Is Irrelevant or Relevant: Expectations Facilitate Distractor Inhibition and Target Facilitation through Distinct Neural Mechanisms.

Authors:  Dirk van Moorselaar; Heleen A Slagter
Journal:  J Neurosci       Date:  2019-07-03       Impact factor: 6.167

7.  Neural Representations of Faces Are Tuned to Eye Movements.

Authors:  Lisa Stacchi; Meike Ramon; Junpeng Lao; Roberto Caldara
Journal:  J Neurosci       Date:  2019-03-13       Impact factor: 6.167

8.  Parietal neurons encode expected gains in instrumental information.

Authors:  Nicholas C Foley; Simon P Kelly; Himanshu Mhatre; Manuel Lopes; Jacqueline Gottlieb
Journal:  Proc Natl Acad Sci U S A       Date:  2017-04-03       Impact factor: 11.205

9.  Functional consequences of oculomotor disorders in hereditary cerebellar ataxias.

Authors:  M F Alexandre; S Rivaud-Péchoux; G Challe; A Durr; B Gaymard
Journal:  Cerebellum       Date:  2013-06       Impact factor: 3.847

10.  Contributions of head-mounted cameras to studying the visual environments of infants and young children.

Authors:  Linda Smith; Chen Yu; Hanako Yoshida; Caitlin M Fausey
Journal:  J Cogn Dev       Date:  2015
