
PyGaze: an open-source, cross-platform toolbox for minimal-effort programming of eyetracking experiments.

Edwin S Dalmaijer, Sebastiaan Mathôt, Stefan Van der Stigchel.

Abstract

The PyGaze toolbox is an open-source software package for Python, a high-level programming language. It is designed for creating eyetracking experiments in Python syntax with the least possible effort, and it offers programming ease and script readability without constraining functionality and flexibility. PyGaze can be used for visual and auditory stimulus presentation; for response collection via keyboard, mouse, joystick, and other external hardware; and for the online detection of eye movements using a custom algorithm. A wide range of eyetrackers of different brands (EyeLink, SMI, and Tobii systems) are supported. The novelty of PyGaze lies in providing an easy-to-use layer on top of the many different software libraries that are required for implementing eyetracking experiments. Essentially, PyGaze is a software bridge for eyetracking research.
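The abstract mentions online detection of eye movements via a custom algorithm. As an illustrative sketch only (this is a generic velocity-threshold detector, not PyGaze's actual algorithm or API; all names and thresholds here are assumptions), such online event detection might look like:

```python
import math

def detect_saccades(samples, dt=0.002, vel_threshold=300.0):
    """Flag inter-sample intervals whose velocity (deg/s) exceeds a threshold.

    samples: list of (x, y) gaze positions in degrees of visual angle.
    dt: sampling interval in seconds (0.002 s corresponds to 500 Hz).
    Returns one boolean per consecutive sample pair.
    """
    flags = []
    for (x0, y0), (x1, y1) in zip(samples, samples[1:]):
        # Point-to-point velocity: Euclidean displacement over the interval.
        velocity = math.hypot(x1 - x0, y1 - y0) / dt
        flags.append(velocity > vel_threshold)
    return flags

# Synthetic trace: fixation, one fast 2-degree jump, then fixation again.
trace = [(0.0, 0.0)] * 5 + [(2.0, 0.0)] * 5
print(detect_saccades(trace))
# → [False, False, False, False, True, False, False, False, False]
```

A velocity threshold around 30-300 deg/s is a common starting point in the eye-movement literature; real detectors typically also smooth the velocity signal and enforce a minimum event duration.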


Year:  2014        PMID: 24258321     DOI: 10.3758/s13428-013-0422-2

Source DB:  PubMed          Journal:  Behav Res Methods        ISSN: 1554-351X


Related articles: 58 in total

1.  EALab (Eye Activity Lab): a MATLAB Toolbox for Variable Extraction, Multivariate Analysis and Classification of Eye-Movement Data.

Authors:  Javier Andreu-Perez; Celine Solnais; Kumuthan Sriskandarajah
Journal:  Neuroinformatics       Date:  2016-01

2.  Eye-movements reveal semantic interference effects during the encoding of naturalistic scenes in long-term memory.

Authors:  Anastasiia Mikhailova; Ana Raposo; Sergio Della Sala; Moreno I Coco
Journal:  Psychon Bull Rev       Date:  2021-05-19

3.  Real-time sharing of gaze data between multiple eye trackers-evaluation, tools, and advice.

Authors:  Marcus Nyström; Diederick C Niehorster; Tim Cornelissen; Henrik Garde
Journal:  Behav Res Methods       Date:  2017-08

4.  Is there a safety-net effect with computer-aided detection?

Authors:  Ethan Du-Crow; Susan M Astley; Johan Hulleman
Journal:  J Med Imaging (Bellingham)       Date:  2019-12-26

5.  Effects of task and task-switching on temporal inhibition of return, facilitation of return, and saccadic momentum during scene viewing.

Authors:  Mark Mills; Edwin S Dalmaijer; Stefan Van der Stigchel; Michael D Dodd
Journal:  J Exp Psychol Hum Percept Perform       Date:  2015-06-15       Impact factor: 3.332

6.  Beta and Theta Oscillations Differentially Support Free Versus Forced Control over Multiple-Target Search.

Authors:  Joram van Driel; Eduard Ort; Johannes J Fahrenfort; Christian N L Olivers
Journal:  J Neurosci       Date:  2019-01-07       Impact factor: 6.167

7.  Integrated Development Environment for EEG-Driven Cognitive-Neuropsychological Research.

Authors:  Shoham Jacobsen; Oded Meiron; David Yoel Salomon; Nir Kraizler; Hagai Factor; Efraim Jaul; Elishai Ezra Tsur
Journal:  IEEE J Transl Eng Health Med       Date:  2020-05-06       Impact factor: 3.316

8.  Distracted by danger: Temporal and spatial dynamics of visual selection in the presence of threat.

Authors:  Manon Mulckhuyse; Edwin S Dalmaijer
Journal:  Cogn Affect Behav Neurosci       Date:  2016-04       Impact factor: 3.282

9.  Measuring attentional bias to food cues in young children using a visual search task: An eye-tracking study.

Authors:  John Brand; Travis D Masterson; Jennifer A Emond; Reina Lansigan; Diane Gilbert-Diamond
Journal:  Appetite       Date:  2020-01-17       Impact factor: 3.868

10.  OpenMATB: A Multi-Attribute Task Battery promoting task customization, software extensibility and experiment replicability.

Authors:  J Cegarra; B Valéry; E Avril; C Calmettes; J Navarro
Journal:  Behav Res Methods       Date:  2020-10
