Scott Morris1, Mike Bass2, Mirinae Lee1, Richard E Neapolitan3. 1. Department of Psychology, Illinois Institute of Technology, Chicago, IL, USA. 2. Department of Medical Social Sciences, Northwestern University Feinberg School of Medicine, Chicago, IL, USA. 3. Department of Preventive Medicine, Northwestern University Feinberg School of Medicine, Chicago, IL, USA.
Abstract
OBJECTIVE: The Patient-Reported Outcomes Measurement Information System (PROMIS) initiative developed an array of patient-reported outcome (PRO) measures. To reduce the number of questions administered, PROMIS utilizes unidimensional item response theory and unidimensional computer adaptive testing (UCAT), which means a separate set of questions is administered for each measured trait. Multidimensional item response theory (MIRT) and multidimensional computer adaptive testing (MCAT) simultaneously assess correlated traits. The objective was to investigate the extent to which MCAT reduces patient burden relative to UCAT in the case of PROs. METHODS: One MIRT model and three unidimensional item response theory models were developed using the related traits anxiety, depression, and anger. Using these models, MCAT and UCAT performance was compared with simulated individuals. RESULTS: Surprisingly, the root mean squared error for both methods increased with the number of items. These results were driven by large errors for individuals with low trait levels. A second analysis focused on individuals aligned with item content. For these individuals, both MCAT and UCAT accuracies improved with additional items. Furthermore, MCAT reduced the test length by 50%. DISCUSSION: For the PROMIS Emotional Distress banks, neither UCAT nor MCAT provided accurate estimates for individuals at low trait levels. Because the items in these banks were designed to detect clinical levels of distress, there is little information for individuals with low trait values. However, trait estimates for individuals targeted by the banks were accurate, and MCAT asked substantially fewer questions. CONCLUSION: By reducing the number of items administered, MCAT can allow clinicians and researchers to assess a wider range of PROs with less patient burden.
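For readers unfamiliar with the mechanics behind UCAT, the core loop can be illustrated with a small sketch. The abstract does not specify the item response model used, so a two-parameter logistic (2PL) model and maximum-information item selection are assumed here purely for illustration; the function and parameter names are hypothetical.

```python
import numpy as np

def prob_2pl(theta, a, b):
    """2PL IRT: probability of endorsing an item at trait level theta,
    given discrimination a and difficulty (location) b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of a 2PL item at theta: a^2 * p * (1 - p).
    Information peaks when theta == b, which is why item banks targeting
    clinical distress are uninformative at low trait levels."""
    p = prob_2pl(theta, a, b)
    return a ** 2 * p * (1.0 - p)

def select_next_item(theta_hat, a, b, administered):
    """One UCAT step: choose the unadministered item with maximum
    information at the current trait estimate theta_hat."""
    info = item_information(theta_hat, a, b)
    info[list(administered)] = -np.inf  # exclude items already asked
    return int(np.argmax(info))

# Example bank: item 1 is most discriminating near theta = 0,
# item 2 is located far from theta = 0 and so carries little information there.
a = np.array([1.0, 2.0, 1.5])
b = np.array([0.0, 0.0, 2.0])
print(select_next_item(0.0, a, b, administered=set()))  # → 1
```

MCAT generalizes this step by scoring each candidate item against a multidimensional information matrix over all correlated traits at once, which is how it can shorten the test: an anxiety item also sharpens the depression and anger estimates when the traits are correlated.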