
A mixture hierarchical model for response times and response accuracy.

Chun Wang, Gongjun Xu.

Abstract

In real testing, examinees may exhibit different types of test-taking behaviour. In this paper we focus on two types that appear to be among the most frequently occurring: solution behaviour and rapid guessing behaviour. Rapid guessing usually arises in high-stakes tests when time is insufficient, and in low-stakes tests when there is a lack of effort. These two qualitatively different test-taking behaviours, if ignored, violate the local independence assumption and, as a result, yield biased item and person parameter estimates. We propose a mixture hierarchical model to account for differences in item response and response time patterns arising from the two behaviours. The model can also identify the specific behaviour an examinee engages in when answering an item. A Monte Carlo expectation maximization algorithm is proposed for model calibration. A simulation study shows that the new model yields more accurate item and person parameter estimates than a non-mixture model when the data indeed come from two types of behaviour. The model also fits real, high-stakes test data better than a non-mixture model, and can therefore better identify the underlying test-taking behaviour an examinee engages in on a given item.
© 2015 The British Psychological Society.
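The two-behaviour idea in the abstract can be illustrated with a minimal sketch: a two-component Gaussian mixture fitted by EM to log response times, where the fast component stands in for rapid guessing and the slow component for solution behaviour, and the per-response posterior weights play the role of behaviour classification. This is a toy stand-in, not the paper's mixture hierarchical model or its Monte Carlo EM algorithm; all distributions and parameter values below are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate log response times from two latent behaviours:
# ~70% solution behaviour (slow), ~30% rapid guessing (fast).
n = 2000
z = rng.random(n) < 0.7  # True -> solution behaviour
x = np.where(z,
             rng.normal(1.5, 0.4, n),   # log-times, solution behaviour
             rng.normal(0.2, 0.3, n))   # log-times, rapid guessing


def norm_pdf(x, m, s):
    """Normal density, used for the component likelihoods."""
    return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))


# EM for a two-component Gaussian mixture on log response times.
pi, mu, sigma = 0.5, np.array([0.0, 1.0]), np.array([1.0, 1.0])
for _ in range(200):
    # E-step: posterior probability of solution behaviour per response.
    p1 = pi * norm_pdf(x, mu[1], sigma[1])
    p0 = (1 - pi) * norm_pdf(x, mu[0], sigma[0])
    r = p1 / (p0 + p1)
    # M-step: update mixing proportion, component means and SDs.
    pi = r.mean()
    mu = np.array([np.average(x, weights=1 - r), np.average(x, weights=r)])
    sigma = np.sqrt(np.array([
        np.average((x - mu[0]) ** 2, weights=1 - r),
        np.average((x - mu[1]) ** 2, weights=r),
    ]))

# pi estimates the solution-behaviour proportion; mu the two mean log-times.
print(round(pi, 2), np.round(mu, 1))
```

Thresholding the posterior weight `r` for each response (e.g. `r > 0.5`) classifies it as solution behaviour or rapid guessing, mirroring the behaviour identification the abstract describes at the item-by-examinee level.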

Keywords:  mixture hierarchical model; rapid guessing; response time

Year:  2015        PMID: 25873487     DOI: 10.1111/bmsp.12054

Source DB:  PubMed          Journal:  Br J Math Stat Psychol        ISSN: 0007-1102            Impact factor:   3.380


Related articles (24 in total)

1.  Detection of Test Speededness Using Change-Point Analysis.

Authors:  Can Shao; Jun Li; Ying Cheng
Journal:  Psychometrika       Date:  2015-08-25       Impact factor: 2.500

2.  Identifying Effortful Individuals With Mixture Modeling Response Accuracy and Response Time Simultaneously to Improve Item Parameter Estimation.

Authors:  Yue Liu; Ying Cheng; Hongyun Liu
Journal:  Educ Psychol Meas       Date:  2020-01-06       Impact factor: 2.821

3.  A Two-Stage Approach to Differentiating Normal and Aberrant Behavior in Computer Based Testing.

Authors:  Chun Wang; Gongjun Xu; Zhuoran Shang
Journal:  Psychometrika       Date:  2016-10-28       Impact factor: 2.500

4.  A New Online Calibration Method Based on Lord's Bias-Correction.

Authors:  Yinhong He; Ping Chen; Yong Li; Shumei Zhang
Journal:  Appl Psychol Meas       Date:  2017-03-26

5.  IRT Scoring and Test Blueprint Fidelity.

Authors:  Gregory Camilli
Journal:  Appl Psychol Meas       Date:  2018-02-20

6.  Response Mixture Modeling: Accounting for Heterogeneity in Item Characteristics across Response Times.

Authors:  Dylan Molenaar; Paul de Boeck
Journal:  Psychometrika       Date:  2018-02-01       Impact factor: 2.500

7.  Assessing the Accuracy of Parameter Estimates in the Presence of Rapid Guessing Misclassifications.

Authors:  Joseph A Rios
Journal:  Educ Psychol Meas       Date:  2021-04-21       Impact factor: 2.821

8.  Regularized Variational Estimation for Exploratory Item Factor Analysis.

Authors:  April E Cho; Jiaying Xiao; Chun Wang; Gongjun Xu
Journal:  Psychometrika       Date:  2022-07-13       Impact factor: 2.290

9.  Application of Change Point Analysis of Response Time Data to Detect Test Speededness.

Authors:  Ying Cheng; Can Shao
Journal:  Educ Psychol Meas       Date:  2021-09-20       Impact factor: 3.088

10.  Modeling Conditional Dependence of Response Accuracy and Response Time with the Diffusion Item Response Theory Model.

Authors:  Inhan Kang; Paul De Boeck; Roger Ratcliff
Journal:  Psychometrika       Date:  2022-01-06       Impact factor: 2.500

