
Variability of eye movements when viewing dynamic natural scenes.

Michael Dorr, Thomas Martinetz, Karl R Gegenfurtner, Erhardt Barth.

Abstract

How similar are the eye movement patterns of different subjects when free viewing dynamic natural scenes? We collected a large database of eye movements from 54 subjects on 18 high-resolution videos of outdoor scenes and measured their variability using the Normalized Scanpath Saliency, which we extended to the temporal domain. Even though up to about 80% of subjects looked at the same image region in some video parts, variability usually was much greater. Eye movements on natural movies were then compared with eye movements in several control conditions. "Stop-motion" movies had semantic content almost identical to that of the original videos but lacked continuous motion. Hollywood action movie trailers were used to probe the upper limit of eye movement coherence that can be achieved by deliberate camera work, scene cuts, etc. In a "repetitive" condition, subjects viewed the same movies ten times each over the course of two days. Results show several systematic differences between conditions, both for general eye movement parameters such as saccade amplitude and fixation duration and for eye movement variability. Most importantly, eye movements on static images are initially driven by stimulus onset effects and later, more so than on continuous videos, by subject-specific idiosyncrasies; eye movements on Hollywood movies are significantly more coherent than those on natural movies. We conclude that the stimulus types often used in laboratory experiments, static images and professionally cut material, are not very representative of natural viewing behavior. All stimuli and gaze data are publicly available at http://www.inb.uni-luebeck.de/tools-demos/gaze.
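The Normalized Scanpath Saliency mentioned in the abstract scores how well one group's gaze distribution predicts another observer's fixations: the prediction map is z-normalized (zero mean, unit variance), and the normalized values at the test fixations are averaged, so chance performance is 0 and higher values mean greater coherence. A minimal sketch of the spatial case, assuming a precomputed 2D map (e.g., a Gaussian-blurred fixation map from one subject group) and fixations given as (row, col) pixel pairs; the paper's temporal extension additionally indexes video frames:

```python
import numpy as np

def nss(saliency_map, fixations):
    """Normalized Scanpath Saliency: z-score the map, then average
    the normalized values at the given (row, col) fixation points."""
    s = np.asarray(saliency_map, dtype=float)
    s = (s - s.mean()) / s.std()  # zero mean, unit variance (assumes non-constant map)
    rows, cols = zip(*fixations)
    return float(s[list(rows), list(cols)].mean())
```

With a map peaked where the test subject actually looked, the score is well above 0; fixations falling on empty regions of the map score near or below 0.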


Year: 2010    PMID: 20884493    DOI: 10.1167/10.10.28

Source DB: PubMed    Journal: J Vis    ISSN: 1534-7362    Impact factor: 2.240


Related articles: 85 in total

1.  Semantic congruency but not temporal synchrony enhances long-term memory performance for audio-visual scenes.

Authors:  Hauke S Meyerhoff; Markus Huff
Journal:  Mem Cognit       Date:  2016-04

2.  REMoDNaV: robust eye-movement classification for dynamic stimulation.

Authors:  Asim H Dar; Adina S Wagner; Michael Hanke
Journal:  Behav Res Methods       Date:  2021-02

3.  Age differences in online processing of video: an eye movement study.

Authors:  Heather L Kirkorian; Daniel R Anderson; Rachel Keen
Journal:  Child Dev       Date:  2012-01-30

4.  Sensitivity to gaze-contingent contrast increments in naturalistic movies: An exploratory report and model comparison.

Authors:  Thomas S A Wallis; Michael Dorr; Peter J Bex
Journal:  J Vis       Date:  2015       Impact factor: 2.240

5.  Synchronized eye movements predict test scores in online video education.

Authors:  Jens Madsen; Sara U Júlio; Pawel J Gucik; Richard Steinberg; Lucas C Parra
Journal:  Proc Natl Acad Sci U S A       Date:  2021-02-02       Impact factor: 11.205

6.  Effect of sequential video shot comprehensibility on attentional synchrony: A comparison of children and adults.

Authors:  Heather L Kirkorian; Daniel R Anderson
Journal:  Proc Natl Acad Sci U S A       Date:  2018-10-02       Impact factor: 11.205

7.  A high-resolution 7-Tesla fMRI dataset from complex natural stimulation with an audio movie.

Authors:  Michael Hanke; Florian J Baumgartner; Pierre Ibe; Falko R Kaule; Stefan Pollmann; Oliver Speck; Wolf Zinke; Jörg Stadler
Journal:  Sci Data       Date:  2014-05-27       Impact factor: 6.444

8.  Eye movements while viewing narrated, captioned, and silent videos.

Authors:  Nicholas M Ross; Eileen Kowler
Journal:  J Vis       Date:  2013-03-01       Impact factor: 2.240

9.  Peri-saccadic natural vision.

Authors:  Michael Dorr; Peter J Bex
Journal:  J Neurosci       Date:  2013-01-16       Impact factor: 6.167

10.  Eye movement prediction and variability on natural video data sets.

Authors:  Michael Dorr; Eleonora Vig; Erhardt Barth
Journal:  Vis cogn       Date:  2012-03-26

北京卡尤迪生物科技股份有限公司 © 2022-2023.