
Modeling Intensive Polytomous Time-Series Eye-Tracking Data: A Dynamic Tree-Based Item Response Model.

Sun-Joo Cho, Sarah Brown-Schmidt, Paul De Boeck, Jianhong Shen.

Abstract

This paper presents a dynamic tree-based item response (IRTree) model as a novel extension of the autoregressive generalized linear mixed effect model (dynamic GLMM). We illustrate the unique utility of the dynamic IRTree model: its ability to model the differentiated processes indicated by intensive polytomous time-series eye-tracking data. The dynamic IRTree model was inspired by, but is distinct from, the dynamic GLMM previously presented by Cho, Brown-Schmidt, and Lee (Psychometrika 83(3):751-771, 2018). Unlike the dynamic IRTree model, the dynamic GLMM is suited to intensive binary time-series eye-tracking data, in which visual attention to a single interest area is identified against all other possible fixation locations. The dynamic IRTree model is a general modeling framework that can capture change processes (trend and autocorrelation) and that allows the data to be decomposed into various sources of heterogeneity. The model is illustrated with an experimental study that used the visual-world eye-tracking technique. A simulation study showed that parameter recovery was satisfactory and that ignoring trend and autoregressive effects yielded biased estimates of experimental condition effects under the same conditions as those found in the empirical study.
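As a rough orientation to this model class, the sketch below gives one plausible node-level sub-model; the node structure, predictors, and parameterization are illustrative assumptions and do not reproduce the authors' exact specification. In an IRTree decomposition, the polytomous fixation response at each time bin is mapped to a set of binary pseudo-items (tree nodes), and each node k can be modeled with a GLMM that includes a trend term and an autoregressive (lagged-response) term, e.g.,

\mathrm{logit}\, P\big(Y^{*}_{ptk} = 1\big) = \beta_{0k} + \beta_{1k}\, x_{pt} + \gamma_{k}\, g(t) + \phi_{k}\, Y^{*}_{p,t-1,k} + \theta_{pk}, \qquad \theta_{pk} \sim N\big(0, \sigma^{2}_{\theta k}\big),

where p indexes participants, t time bins, and k tree nodes; x_{pt} codes the experimental condition, g(t) is a trend function of time, \phi_{k} is the node-specific autoregressive weight on the lagged node response, and \theta_{pk} is a participant random effect.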


Keywords:  autocorrelation; eye-tracking data; generalized linear mixed effect model; intensive polytomous time series; multinomial processing tree; tree-based item response model; trend


Year:  2020        PMID: 32086751     DOI: 10.1007/s11336-020-09694-6

Source DB:  PubMed          Journal:  Psychometrika        ISSN: 0033-3123            Impact factor:   2.500


References: 30 in total (first 10 shown below)

1.  Modeling multiple response processes in judgment and choice.

Authors:  Ulf Böckenholt
Journal:  Psychol Methods       Date:  2012-04-30

2.  Addressees distinguish shared from private information when interpreting questions during interactive conversation.

Authors:  Sarah Brown-Schmidt; Christine Gunlogson; Michael K Tanenhaus
Journal:  Cognition       Date:  2007-12-31

3.  A Note on N in Bayesian Information Criterion for Item Response Models.

Authors:  Sun-Joo Cho; Paul De Boeck
Journal:  Appl Psychol Meas       Date:  2017-10-31

4.  Pragmatic expectations and linguistic evidence: Listeners anticipate but do not integrate common ground.

Authors:  Dale J Barr
Journal:  Cognition       Date:  2008-08-28

5.  Measuring response styles in Likert items.

Authors:  Ulf Böckenholt
Journal:  Psychol Methods       Date:  2016-11-28

6.  Changing dynamics: Time-varying autoregressive models using generalized additive modeling.

Authors:  Laura F Bringmann; Ellen L Hamaker; Daniel E Vigo; André Aubert; Denny Borsboom; Francis Tuerlinckx
Journal:  Psychol Methods       Date:  2016-09-26

7.  Incremental interpretation at verbs: restricting the domain of subsequent reference.

Authors:  G T Altmann; Y Kamide
Journal:  Cognition       Date:  1999-12-17

8.  Binary Time Series Modeling with Application to Adhesion Frequency Experiments.

Authors:  Ying Hung; Veronika Zarnitsyna; Yan Zhang; Cheng Zhu; C F Jeff Wu
Journal:  J Am Stat Assoc       Date:  2008-09-01       Impact factor: 5.033

9.  Partner-specific interpretation of maintained referential precedents during interactive dialog.

Authors:  Sarah Brown-Schmidt
Journal:  J Mem Lang       Date:  2009-08-01       Impact factor: 3.059

10.  To center or not to center? Investigating inertia with a multilevel autoregressive model.

Authors:  Ellen L Hamaker; Raoul P P P Grasman
Journal:  Front Psychol       Date:  2015-01-06
Cited by: 1 in total

1.  Modeling Eye Movements During Decision Making: A Review.

Authors:  Michel Wedel; Rik Pieters; Ralf van der Lans
Journal:  Psychometrika       Date:  2022-07-19       Impact factor: 2.290

