Kazuhiro Yamaguchi, Kensuke Okada.
Abstract
Saturated diagnostic classification models (DCMs) can flexibly accommodate various relationships among attributes to diagnose individual attribute mastery, and they include various important DCMs as sub-models. However, the existing formulations of the saturated DCM are not well suited for deriving conditionally conjugate priors of the model parameters. Because such priors are key to developing a variational Bayes (VB) inference algorithm, in the present study we proposed a novel mixture formulation of the saturated DCM. Based on it, we developed a VB inference algorithm for the saturated DCM that enables scalable and computationally efficient Bayesian estimation. A simulation study indicated that the proposed algorithm could recover the parameters under various conditions. We also demonstrated that the proposed approach is particularly suited to settings in which new data become sequentially available over time, such as computerized diagnostic testing. In addition, a real educational dataset was comparatively analyzed with the proposed VB algorithm and a Markov chain Monte Carlo (MCMC) algorithm. The two methods yielded very similar estimates, and the proposed VB inference was much faster than MCMC. The proposed method can thus be a practical solution to the problem of computational load.
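The abstract's core idea, conditionally conjugate priors enabling closed-form mean-field VB updates, can be illustrated with a minimal sketch. This is not the authors' algorithm; it is a generic coordinate-ascent VB (CAVI) routine for a latent class model with Bernoulli item responses, a structure analogous to a saturated DCM in which each latent class corresponds to an attribute-mastery profile. The priors are Dirichlet on class proportions and Beta on class-by-item success probabilities, so every variational update has a closed form. All function and variable names here are illustrative.

```python
import numpy as np
from scipy.special import digamma


def vb_latent_class(X, n_classes, n_iter=50, alpha0=1.0, beta0=1.0, seed=0):
    """Mean-field VB (CAVI) sketch for a Bernoulli latent class model.

    Conjugate priors: Dirichlet(alpha0) on class proportions pi,
    Beta(beta0, beta0) on each class-by-item success probability theta.
    Returns variational responsibilities r and posterior hyperparameters.
    """
    rng = np.random.default_rng(seed)
    n, J = X.shape
    # Random initialization of the responsibilities q(z_i = c).
    r = rng.dirichlet(np.ones(n_classes), size=n)
    for _ in range(n_iter):
        Nc = r.sum(axis=0)                 # expected class counts
        # q(pi) = Dirichlet(alpha): conjugate update.
        alpha = alpha0 + Nc
        # q(theta_cj) = Beta(a_cj, b_cj): expected successes / failures.
        a = beta0 + r.T @ X
        b = beta0 + r.T @ (1.0 - X)
        # Expected log parameters via digamma identities.
        Elog_pi = digamma(alpha) - digamma(alpha.sum())
        Elog_th = digamma(a) - digamma(a + b)
        Elog_1m = digamma(b) - digamma(a + b)
        # Responsibility update: softmax of expected log joint.
        log_r = Elog_pi + X @ Elog_th.T + (1.0 - X) @ Elog_1m.T
        log_r -= log_r.max(axis=1, keepdims=True)
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)
    return r, alpha, a, b
```

Because every factor is in the conjugate family, each sweep costs only a few matrix products, which is the kind of per-iteration cheapness, relative to MCMC sampling, that the abstract attributes to the VB approach.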
Keywords: cognitive diagnostic models; diagnostic classification models; saturated model; variational Bayes inference
Year: 2021 PMID: 33420895 DOI: 10.1007/s11336-020-09739-w
Source DB: PubMed Journal: Psychometrika ISSN: 0033-3123 Impact factor: 2.500