Yuelin Li1, James C Root2, Thomas M Atkinson2, Tim A Ahles2. 1. Department of Psychiatry and Behavioral Sciences, Memorial Sloan Kettering Cancer Center, New York, NY, USA liy12@mskcc.org. 2. Department of Psychiatry and Behavioral Sciences, Memorial Sloan Kettering Cancer Center, New York, NY, USA.
Abstract
OBJECTIVE: Patient-reported cognition generally exhibits poor concordance with objectively assessed cognitive performance. In this article, we introduce latent regression Rasch modeling and provide a step-by-step tutorial for applying Rasch methods as an alternative to traditional correlation to better clarify the relationship of self-report and objective cognitive performance. An example analysis using these methods is also included. METHOD: An introduction to latent regression Rasch modeling is provided, together with a tutorial on implementing it in the JAGS programming language to obtain Bayesian posterior parameter estimates. In an example analysis, data from a longitudinal neurocognitive outcomes study of 132 breast cancer patients and 45 non-cancer matched controls, which included self-report and objective performance measures pre- and post-treatment, were analyzed using both conventional and latent regression Rasch model approaches. RESULTS: Consistent with previous research, correlations between neurocognitive decline and self-reported problems in the conventional analysis were generally near zero. In contrast, latent regression Rasch modeling found statistically reliable associations between objective attention and processing speed measures and self-reported Attention and Memory scores. CONCLUSIONS: Latent regression Rasch modeling, together with correlation of specific self-reported cognitive domains with neurocognitive measures, helps to clarify the relationship of self-report with objective performance. While the majority of patients attribute their cognitive difficulties to memory decline, the Rasch modeling suggests the importance of processing speed and initial learning. To encourage the use of this method, a step-by-step guide and programming code for implementation are provided. Implications of this method for cognitive outcomes research are discussed.
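The abstract describes fitting a latent regression Rasch model in JAGS. As an illustration only, a minimal dichotomous version of such a model might be specified as follows; all variable names, the single covariate `z`, and the priors are assumptions made for this sketch, not the authors' actual model:

```jags
model {
  for (i in 1:N) {              # persons
    for (j in 1:J) {            # self-report items
      logit(p[i, j]) <- theta[i] - b[j]   # Rasch: person ability minus item difficulty
      y[i, j] ~ dbern(p[i, j])            # observed dichotomous item response
    }
    # Latent regression: person ability predicted by an objective
    # neurocognitive score z[i] (e.g., a processing speed measure)
    theta[i] ~ dnorm(beta0 + beta1 * z[i], tau)
  }
  # Priors (JAGS parameterizes the normal by precision, not variance)
  beta0 ~ dnorm(0, 0.01)
  beta1 ~ dnorm(0, 0.01)        # regression coefficient of interest
  tau ~ dgamma(0.01, 0.01)      # precision of residual person variance
  for (j in 1:J) {
    b[j] ~ dnorm(0, 0.01)       # item difficulty
  }
}
```

In this sketch, `beta1` captures the association between the objective measure and the latent self-report trait, which is the quantity the abstract contrasts with near-zero raw correlations. Such a model would typically be fit from R via a package such as rjags, supplying `y`, `z`, `N`, and `J` as data.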