
Automating Areas of Interest Analysis in Mobile Eye Tracking Experiments based on Machine Learning.

Julian Wolf, Stephan Hess, David Bachmann, Quentin Lohmeyer, Mirko Meboldt

Abstract

For an in-depth, AOI-based analysis of mobile eye tracking data, a preceding gaze assignment step is inevitable. Current solutions such as manual gaze mapping or marker-based approaches are tedious and not suitable for applications manipulating tangible objects. This makes mobile eye tracking studies with several hours of recording difficult to analyse quantitatively. We introduce a new machine learning-based algorithm, computational Gaze-Object Mapping (cGOM), that automatically maps gaze data onto respective AOIs. cGOM extends state-of-the-art object detection and segmentation by Mask R-CNN with a gaze mapping feature. The new algorithm's performance is validated against a manual fixation-by-fixation mapping, which is considered the ground truth, in terms of true positive rate (TPR), true negative rate (TNR) and efficiency. Using only 72 training images with 264 labelled object representations, cGOM reaches a TPR of approx. 80% and a TNR of 85% compared to the manual mapping. The break-even point is reached at 2 hours of eye tracking recording for the total procedure, or 1 hour considering human working time only. Together with the real-time capability of the mapping process after completed training, even hours of eye tracking recording can be evaluated efficiently. (Code and video examples have been made available at: https://gitlab.ethz.ch/pdz/cgom.git.)
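The abstract's core idea, assigning each fixation to the AOI whose instance segmentation mask contains it and then scoring the automatic mapping against a manual ground truth via TPR/TNR, can be illustrated with a minimal sketch. All names below (`map_fixation`, `rates`, the boolean-mask representation) are illustrative assumptions, not the authors' cGOM code; their actual implementation builds on Mask R-CNN and is available at the GitLab link above.

```python
# Illustrative sketch only (hypothetical helper names), not the cGOM implementation.
# A fixation is assigned to the AOI whose segmentation mask contains it;
# fixations hitting no mask are treated as background (None).

def map_fixation(fixation, masks):
    """Return the label of the first mask containing the fixation point.

    fixation: (x, y) pixel coordinates.
    masks: dict mapping AOI label -> 2D boolean grid, mask[y][x] True inside the object.
    """
    x, y = fixation
    for label, mask in masks.items():
        if 0 <= y < len(mask) and 0 <= x < len(mask[0]) and mask[y][x]:
            return label
    return None

def rates(predicted, ground_truth, target):
    """TPR and TNR for one AOI label, comparing automatic vs. manual mapping."""
    tp = sum(p == target and g == target for p, g in zip(predicted, ground_truth))
    tn = sum(p != target and g != target for p, g in zip(predicted, ground_truth))
    pos = sum(g == target for g in ground_truth)
    neg = len(ground_truth) - pos
    return (tp / pos if pos else 0.0, tn / neg if neg else 0.0)
```

In the paper's setting, the masks would come from a Mask R-CNN model run on each video frame, and the ground-truth labels from the manual fixation-by-fixation mapping.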

Keywords:  areas of interest; cGOM; gaze mapping; machine learning; mask R-CNN; mobile eye tracking; object detection; tangible objects; usability

Year:  2018        PMID: 33828716      PMCID: PMC7909988          DOI: 10.16910/jemr.11.6.6

Source DB:  PubMed          Journal:  J Eye Mov Res        ISSN: 1995-8692            Impact factor:   0.957


References:  2 in total

1.  Measuring dwell time percentage from head-mounted eye-tracking data--comparison of a frame-by-frame and a fixation-by-fixation analysis.

Authors:  Pieter Vansteenkiste; Greet Cardon; Renaat Philippaerts; Matthieu Lenoir
Journal:  Ergonomics       Date:  2014-12-20       Impact factor: 2.778

2.  Combining user logging with eye tracking for interactive and dynamic applications.

Authors:  Kristien Ooms; Arzu Coltekin; Philippe De Maeyer; Lien Dupont; Sara Fabrikant; Annelies Incoul; Matthias Kuhn; Hendrik Slabbinck; Pieter Vansteenkiste; Lise Van der Haegen
Journal:  Behav Res Methods       Date:  2015-12
Cited by:  2 in total

1.  Eye tracking applied to tobacco smoking: current directions and future perspectives.

Authors:  Matteo Valsecchi; Maurizio Codispoti
Journal:  J Eye Mov Res       Date:  2022-01-21       Impact factor: 1.349

2.  An algorithmic approach to determine expertise development using object-related gaze pattern sequences.

Authors:  Felix S Wang; Céline Gianduzzo; Mirko Meboldt; Quentin Lohmeyer
Journal:  Behav Res Methods       Date:  2021-07-13
