
gazeNet: End-to-end eye-movement event detection with deep neural networks.

Raimondas Zemblys; Diederick C Niehorster; Kenneth Holmqvist

Abstract

Existing event detection algorithms for eye-movement data almost exclusively rely on thresholding one or more hand-crafted signal features, each computed from the stream of raw gaze data. Moreover, the setting of these thresholds is largely left to the end user. Here we present gazeNet, a new framework for creating event detectors that do not require hand-crafted signal features or signal thresholding. It employs an end-to-end deep learning approach, which takes raw eye-tracking data as input and classifies it into fixations, saccades, and post-saccadic oscillations. Our method thereby challenges an established tacit assumption that hand-crafted features are necessary in the design of event detection algorithms. The downside of the deep learning approach is that a large amount of training data is required. We therefore first develop a method to augment hand-coded data, so that we can greatly enlarge the data set used for training while minimizing the time spent on manual coding. Using this extended hand-coded data, we train a neural network that produces eye-movement event classifications from raw eye-movement data without requiring any predefined feature extraction or post-processing steps. The resulting classification performance is at the level of expert human coders. Moreover, an evaluation of gazeNet on two other datasets showed that gazeNet generalized to data from different eye trackers and consistently outperformed several other event detection algorithms that we tested.
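To make the input/output contract described in the abstract concrete, the sketch below shows per-sample sequence classification of raw gaze coordinates into the three event classes (fixation, saccade, PSO). This is an illustrative toy only, not the gazeNet architecture: the layer sizes, kernel widths, and random weights are all assumptions made for demonstration, and a real detector would be trained on labeled data.

```python
# Illustrative sketch only: a toy end-to-end per-sample classifier for gaze
# data. NOT the gazeNet model; it merely demonstrates the contract the
# abstract describes (raw x/y samples in, one event label per sample out).
import numpy as np

rng = np.random.default_rng(0)

EVENTS = ["fixation", "saccade", "PSO"]  # the three classes the paper predicts


def conv1d(x, w, b):
    """'Same'-padded 1-D convolution over a (T, C_in) signal.
    w has shape (k, C_in, C_out); b has shape (C_out,)."""
    k, c_in, c_out = w.shape
    pad = k // 2
    xp = np.pad(x, ((pad, pad), (0, 0)))
    T = x.shape[0]
    out = np.empty((T, c_out))
    for t in range(T):
        window = xp[t:t + k]  # (k, C_in) slice centred on sample t
        out[t] = np.tensordot(window, w, axes=([0, 1], [0, 1])) + b
    return out


def classify_gaze(xy):
    """Forward pass: raw (T, 2) gaze coordinates -> (T,) event labels."""
    # Hypothetical, untrained weights -- for shape illustration only.
    w1 = rng.normal(scale=0.1, size=(5, 2, 8)); b1 = np.zeros(8)
    w2 = rng.normal(scale=0.1, size=(5, 8, 3)); b2 = np.zeros(3)
    h = np.maximum(conv1d(xy, w1, b1), 0.0)   # ReLU hidden layer
    logits = conv1d(h, w2, b2)                # (T, 3) per-sample class scores
    return logits.argmax(axis=1)              # one label per raw sample


# 200 fake gaze samples; since the weights are random, labels are arbitrary,
# but the shapes match the end-to-end setup: no feature extraction, no
# thresholds, one class decision per raw sample.
gaze = rng.normal(size=(200, 2))
labels = classify_gaze(gaze)
print(labels.shape)
```

In a trained detector the predicted label indices would map onto `EVENTS`; here the point is only that classification happens directly on the raw signal, with no hand-crafted features or user-set thresholds in between.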

Keywords:  Deep learning; Event detection; Eye movements; Fixation; PSO; Saccade

Year:  2019        PMID: 30334148     DOI: 10.3758/s13428-018-1133-5

Source DB:  PubMed          Journal:  Behav Res Methods        ISSN: 1554-351X


  4 in total

1.  Evaluating Eye Movement Event Detection: A Review of the State of the Art.

Authors:  Mikhail Startsev; Raimondas Zemblys
Journal:  Behav Res Methods       Date:  2022-06-17

2.  Eye tracking: empirical foundations for a minimal reporting guideline.

Authors:  Kenneth Holmqvist; Saga Lee Örbom; Ignace T C Hooge; Diederick C Niehorster; Robert G Alexander; Richard Andersson; Jeroen S Benjamins; Pieter Blignaut; Anne-Marie Brouwer; Lewis L Chuang; Kirsten A Dalrymple; Denis Drieghe; Matt J Dunn; Ulrich Ettinger; Susann Fiedler; Tom Foulsham; Jos N van der Geest; Dan Witzner Hansen; Samuel B Hutton; Enkelejda Kasneci; Alan Kingstone; Paul C Knox; Ellen M Kok; Helena Lee; Joy Yeonjoo Lee; Jukka M Leppänen; Stephen Macknik; Päivi Majaranta; Susana Martinez-Conde; Antje Nuthmann; Marcus Nyström; Jacob L Orquin; Jorge Otero-Millan; Soon Young Park; Stanislav Popelka; Frank Proudlock; Frank Renkewitz; Austin Roorda; Michael Schulte-Mecklenbeck; Bonita Sharif; Frederick Shic; Mark Shovman; Mervyn G Thomas; Ward Venrooij; Raimondas Zemblys; Roy S Hessels
Journal:  Behav Res Methods       Date:  2022-04-06

3.  Characterizing gaze position signals and synthesizing noise during fixations in eye-tracking data.

Authors:  Diederick C Niehorster; Raimondas Zemblys; Tanya Beelders; Kenneth Holmqvist
Journal:  Behav Res Methods       Date:  2020-12

4.  Small eye movements cannot be reliably measured by video-based P-CR eye-trackers.

Authors:  Kenneth Holmqvist; Pieter Blignaut
Journal:  Behav Res Methods       Date:  2020-10

Coyote Bioscience (Beijing) Co., Ltd. © 2022-2023.