
Cleaning up systematic error in eye-tracking data by using required fixation locations.

Anthony J Hornof, Tim Halverson.

Abstract

In the course of running an eye-tracking experiment, one computer system or subsystem typically presents the stimuli to the participant and records manual responses, and another collects the eye movement data, with little interaction between the two during the course of the experiment. This article demonstrates how the two systems can interact with each other to facilitate a richer set of experimental designs and applications and to produce more accurate eye tracking data. In an eye-tracking study, a participant is periodically instructed to look at specific screen locations, or explicit required fixation locations (RFLs), in order to calibrate the eye tracker to the participant. The design of an experimental procedure will also often produce a number of implicit RFLs--screen locations that the participant must look at within a certain window of time or at a certain moment in order to successfully and correctly accomplish a task, but without explicit instructions to fixate those locations. In these windows of time or at these moments, the disparity between the fixations recorded by the eye tracker and the screen locations corresponding to implicit RFLs can be examined, and the results of the comparison can be used for a variety of purposes. This article shows how the disparity can be used to monitor the deterioration in the accuracy of the eye tracker calibration and to automatically invoke a recalibration procedure when necessary. This article also demonstrates how the disparity will vary across screen regions and participants and how each participant's unique error signature can be used to reduce the systematic error in the eye movement data collected for that participant.
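The correction the abstract describes can be sketched in a few lines. This is a minimal illustration, not the article's exact algorithm: the article derives region- and participant-specific error signatures, while the sketch below assumes a single global offset per participant, estimated from moments when the task guarantees the participant is fixating a known implicit RFL. All function names here are hypothetical.

```python
def error_signature(recorded, required):
    """Mean (dx, dy) disparity between recorded fixations and the
    required fixation locations (RFLs) the participant must have been
    looking at. This mean offset approximates the participant's
    systematic error signature."""
    n = len(recorded)
    dx = sum(r[0] - q[0] for r, q in zip(recorded, required)) / n
    dy = sum(r[1] - q[1] for r, q in zip(recorded, required)) / n
    return dx, dy

def correct(gaze, signature):
    """Subtract the error signature from gaze samples, shifting the
    recorded data back toward the true fixation locations."""
    dx, dy = signature
    return [(x - dx, y - dy) for x, y in gaze]

# Example: the tracker consistently reports gaze 5 px right of and
# 3 px below the locations the task guarantees were fixated.
recorded = [(105, 103), (205, 203), (305, 303)]
required = [(100, 100), (200, 200), (300, 300)]
sig = error_signature(recorded, required)   # (5.0, 3.0)
corrected = correct(recorded, sig)          # samples land back on the RFLs
```

The same disparity measure can also drive the recalibration trigger the abstract mentions: if the magnitude of the running disparity exceeds a chosen threshold, the experiment software can pause and invoke the tracker's calibration routine.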


Year:  2002        PMID: 12564562     DOI: 10.3758/bf03195487

Source DB:  PubMed          Journal:  Behav Res Methods Instrum Comput        ISSN: 0743-3808


Related articles: 12 in total

1.  Where people look when watching movies: do all viewers look at the same place?

Authors:  Robert B Goldstein; Russell L Woods; Eli Peli
Journal:  Comput Biol Med       Date:  2006-09-29       Impact factor: 4.589

2.  Typical predictive eye movements during action observation without effector-specific motor simulation.

Authors:  Gilles Vannuscorps; Alfonso Caramazza
Journal:  Psychon Bull Rev       Date:  2017-08

3.  Eye tracking research to answer questions about augmentative and alternative communication assessment and intervention.

Authors:  Krista M Wilkinson; Teresa Mitchell
Journal:  Augment Altern Commun       Date:  2014-04-23       Impact factor: 2.214

4.  SMART-T: a system for novel fully automated anticipatory eye-tracking paradigms.

Authors:  Mohinish Shukla; Johnny Wen; Katherine S White; Richard N Aslin
Journal:  Behav Res Methods       Date:  2011-06

5.  A simple algorithm for the offline recalibration of eye-tracking data through best-fitting linear transformation.

Authors:  Miguel A Vadillo; Chris N H Street; Tom Beesley; David R Shanks
Journal:  Behav Res Methods       Date:  2015-12

6.  Saccades to Explicit and Virtual Features in the Poggendorff Figure Show Perceptual Biases.

Authors:  Barbara Dillenburger; Michael Morgan
Journal:  Iperception       Date:  2017-04-21

7.  Enhancing the usability of low-cost eye trackers for rehabilitation applications.

Authors:  Rahul Dasharath Gavas; Sangheeta Roy; Debatri Chatterjee; Soumya Ranjan Tripathy; Kingshuk Chakravarty; Aniruddha Sinha
Journal:  PLoS One       Date:  2018-06-01       Impact factor: 3.240

8.  The Effect of Real-time Headbox Adjustments on Data Quality.

Authors:  Pieter Blignaut
Journal:  J Eye Mov Res       Date:  2018-03-21       Impact factor: 0.957

9.  Study of an Extensive Set of Eye Movement Features: Extraction Methods and Statistical Analysis.

Authors:  Ioannis Rigas; Lee Friedman; Oleg Komogortsev
Journal:  J Eye Mov Res       Date:  2018-03-20       Impact factor: 0.957

10.  Obstacle avoidance, visual detection performance, and eye-scanning behavior of glaucoma patients in a driving simulator: a preliminary study.

Authors:  Rocío Prado Vega; Peter M van Leeuwen; Elizabeth Rendón Vélez; Hans G Lemij; Joost C F de Winter
Journal:  PLoS One       Date:  2013-10-16       Impact factor: 3.240

