Polytomous Testlet Response Models for Technology-Enhanced Innovative Items: Implications on Model Fit and Trait Inference.

Hyeon-Ah Kang, Suhwa Han, Doyoung Kim, Shu-Chuan Kao.

Abstract

The development of technology-enhanced innovative items calls for practical models that can describe polytomous testlet items. In this study, we evaluate four measurement models that can characterize polytomous items administered in testlets: (a) the generalized partial credit model (GPCM), (b) the testlet-as-a-polytomous-item model (TPIM), (c) the random-effect testlet model (RTM), and (d) the fixed-effect testlet model (FTM). Using data generated from the GPCM, FTM, and RTM, we examine the performance of the scoring models in several respects: relative model fit, absolute item fit, significance of testlet effects, parameter recovery, and classification accuracy. The empirical analysis suggests that the relative performance of the models varies substantially with the testlet-effect type, effect size, and trait estimator. When testlets had no effects or fixed effects, the GPCM and FTM led to the most desirable measurement outcomes. When testlets had random interaction effects, the RTM demonstrated the best model fit, yet its trait recovery differed substantially depending on the estimator. In particular, the advantage of the RTM as a scoring model was discernible only when strong random effects existed and trait levels were estimated with Bayes priors. In other settings, the simpler models (i.e., GPCM and FTM) performed better or comparably. The study also revealed that polytomous scoring of testlet items has limited prospects as a functional scoring method. Based on the outcomes of the empirical evaluation, we offer practical guidelines for choosing a measurement model for polytomous innovative items administered in testlets.
© The Author(s) 2021.
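
For context, the following is a minimal sketch of the generalized partial credit model and its random-effect testlet extension, written in conventional testlet-literature notation; it is an illustration only, and the paper's exact parameterization may differ.

\[
P(X_{ij}=k \mid \theta_i) \;=\; \frac{\exp\!\Bigl(\sum_{v=1}^{k} a_j(\theta_i - b_{jv})\Bigr)}{\sum_{c=0}^{m_j}\exp\!\Bigl(\sum_{v=1}^{c} a_j(\theta_i - b_{jv})\Bigr)}, \qquad \text{with } \sum_{v=1}^{0}\equiv 0,
\]
\[
\text{RTM:}\quad \theta_i \;\longrightarrow\; \theta_i - \gamma_{i\,d(j)}, \qquad \gamma_{i\,d(j)} \sim N\bigl(0,\, \sigma^{2}_{d(j)}\bigr),
\]

where \(a_j\) and \(b_{jv}\) are the discrimination and step parameters of item \(j\), \(m_j\) is its maximum score, and \(d(j)\) indexes the testlet containing item \(j\). In this sketch, the FTM replaces the person-specific random effect with a fixed, person-invariant testlet parameter, and setting all testlet effects to zero recovers the GPCM.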

Keywords:  innovative items; item response theory; polytomous items; technology-enhanced assessment; testlet

Year:  2021        PMID: 35754615      PMCID: PMC9228694          DOI: 10.1177/00131644211032261

Source DB:  PubMed          Journal:  Educ Psychol Meas        ISSN: 0013-1644            Impact factor:   3.088


  6 in total

1.  The effect of ignoring item interactions on the estimated discrimination parameters in item response theory.

Authors:  F Tuerlinckx; P De Boeck
Journal:  Psychol Methods       Date:  2001-06

2.  Goodness of Fit in Item Response Models.

Authors:  R P McDonald; M M Mok
Journal:  Multivariate Behav Res       Date:  1995-01-01       Impact factor: 5.923

3.  Effects of varying magnitude and patterns of response dependence in the unidimensional Rasch model.

Authors:  Ida Marais; David Andrich
Journal:  J Appl Meas       Date:  2008

4.  A Unidimensional Latent Trait Model for Continuous Item Responses.

Authors:  G J Mellenbergh
Journal:  Multivariate Behav Res       Date:  1994-07-01       Impact factor: 5.923

5.  A Two-Level Alternating Direction Model for Polytomous Items With Local Dependence.

Authors:  Igor Himelfarb; Katerina M Marcoulides; Guoliang Fang; Bruce L Shotts
Journal:  Educ Psychol Meas       Date:  2019-09-03       Impact factor: 2.821

6.  Evaluating Different Scoring Methods for Multiple Response Items Providing Partial Credit.

Authors:  Joe Betts; William Muntean; Doyoung Kim; Shu-Chuan Kao
Journal:  Educ Psychol Meas       Date:  2021-02-22       Impact factor: 2.821
