Regression-based formulas for predicting change in memory test scores in healthy older adults: Comparing use of raw versus standardized scores.

January Durant1, Kevin Duff2, Justin B Miller1.   

Abstract

INTRODUCTION: Standardized regression-based (SRB) methods can be used to determine whether meaningful changes in performance on cognitive assessments have occurred over time. Both raw and standardized scores have been used in SRB models, but it is unclear which score metric is more appropriate for predicting follow-up performance. The aim of the present study was to examine differences in SRB prediction formulas using raw versus standardized scores on two memory tests commonly used in the assessment of older adults.
METHOD: The sample consisted of 135 healthy older adults who underwent baseline and 1-year follow-up neuropsychological assessment, including the Hopkins Verbal Learning Test-Revised and the Brief Visuospatial Memory Test-Revised. Regression models were fit to predict Time 2 scores from Time 1 scores and demographic variables, with separate models fit using raw scores and standardized scores. Akaike's information criterion (AIC) was used to determine whether the raw-score or standardized-score models provided the better fit. Pearson correlation and intraclass correlation coefficients were calculated between observed and predicted scores, and mean differences between observed and predicted scores were examined using paired t tests. To investigate whether a similar pattern of results would emerge for nonmemory tests, all analyses were repeated on those measures.
RESULTS: All regression models were significant, and R2 values for memory test raw score models were larger than those generated by standardized score models. Memory test raw score models were also a better fit based on smaller AIC values. For nonmemory tests, raw score models did not consistently outperform standardized score models. All correlations between observed and predicted Time 2 scores were significant, and none of the predicted scores significantly differed from their respective observed score.
CONCLUSION: For each memory measure, raw score models outperformed standardized score models. For nonmemory tests, neither score metric model consistently outperformed the other.
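As a sketch of the SRB approach described in the METHOD section, a model predicts each Time 2 score from Time 1 performance plus demographics via ordinary least squares, and model fit is summarized with AIC. The code below uses simulated data, not the study's sample; the variable names, coefficient values, and score distributions are illustrative assumptions only.

```python
import numpy as np

def fit_srb(X, y):
    """Ordinary least squares fit returning coefficients and AIC.

    X: (n, p) predictor matrix (an intercept column is added here).
    y: (n,) observed Time 2 scores.
    AIC is computed (up to an additive constant) as n*ln(RSS/n) + 2k,
    where k counts the fitted coefficients plus the error variance.
    """
    n = len(y)
    Xd = np.column_stack([np.ones(n), X])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    rss = float(resid @ resid)
    k = Xd.shape[1] + 1
    aic = n * np.log(rss / n) + 2 * k
    return beta, aic

# Simulated cohort: Time 2 raw score depends on Time 1 raw score,
# age, and education (all numbers hypothetical, matching only the n of 135).
rng = np.random.default_rng(0)
n = 135
t1 = rng.normal(25, 5, n)    # baseline memory raw score (illustrative)
age = rng.normal(72, 6, n)
edu = rng.normal(14, 2, n)
t2 = 0.8 * t1 - 0.05 * age + 0.2 * edu + 5 + rng.normal(0, 2, n)

X = np.column_stack([t1, age, edu])
beta, aic = fit_srb(X, t2)
pred = np.column_stack([np.ones(n), X]) @ beta  # predicted Time 2 scores
```

An observed Time 2 score falling far below its prediction (relative to the residual standard deviation) would then flag a potentially meaningful decline. Note that AIC values are only directly comparable between models fit to the same outcome variable, so comparing raw- versus standardized-score models requires care about the outcome scale.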

Keywords:  Memory; dementia; predicting cognition; psychometric change; reliable change

Year:  2019        PMID: 30720394      PMCID: PMC7099613          DOI: 10.1080/13803395.2019.1571169

Source DB:  PubMed          Journal:  J Clin Exp Neuropsychol        ISSN: 1380-3395            Impact factor:   2.475


References (16 in total)

1.  Determining reliable cognitive change after epilepsy surgery: development of reliable change indices and standardized regression-based change norms for the WMS-III and WAIS-III.

Authors:  Roy Martin; Stephen Sawrie; Frank Gilliam; Melissa Mackey; Edward Faught; Robert Knowlton; Ruben Kuzniecky
Journal:  Epilepsia       Date:  2002-12       Impact factor: 5.864

2.  Normative data stratified by age and education for two measures of verbal fluency: FAS and animal naming.

Authors:  T N Tombaugh; J Kozak; L Rees
Journal:  Arch Clin Neuropsychol       Date:  1999-02       Impact factor: 2.813

Review 3.  Evidence-based indicators of neuropsychological change in the individual patient: relevant concepts and methods.

Authors:  Kevin Duff
Journal:  Arch Clin Neuropsychol       Date:  2012-02-29       Impact factor: 2.813

4.  Regression-based formulas for predicting change in RBANS subtests with older adults.

Authors:  Kevin Duff; Mike R Schoenberg; Doyle Patton; Jane S Paulsen; John D Bayless; James Mold; James G Scott; Russell L Adams
Journal:  Arch Clin Neuropsychol       Date:  2005-05       Impact factor: 2.813

Review 5.  The robust reliability of neuropsychological measures: meta-analyses of test-retest correlations.

Authors:  Matthew Calamia; Kristian Markon; Daniel Tranel
Journal:  Clin Neuropsychol       Date:  2013-06-25       Impact factor: 3.535

6.  Reliable change indices and standardized regression-based change score norms for evaluating neuropsychological change in children with epilepsy.

Authors:  Robyn M Busch; Tara T Lineweaver; Lisa Ferguson; Jennifer S Haut
Journal:  Epilepsy Behav       Date:  2015-05-28       Impact factor: 2.937

7.  Empirical techniques for determining the reliability, magnitude, and pattern of neuropsychological change after epilepsy surgery.

Authors:  B P Hermann; M Seidenberg; J Schoenfeld; J Peterson; C Leveroni; A R Wyler
Journal:  Epilepsia       Date:  1996-10       Impact factor: 5.864

8.  Reliability, practice effects, and change indices for Rao's Brief Repeatable Battery.

Authors:  Emilio Portaccio; Benedetta Goretti; Valentina Zipoli; Alfonso Iudice; Dario Della Pina; Gian Michele Malentacchi; Simonetta Sabatini; Pasquale Annunziata; Mario Falcini; Monica Mazzoni; Maria Pia Amato
Journal:  Mult Scler       Date:  2010-03-05       Impact factor: 6.312

9.  Measuring neuropsychological change following breast cancer treatment: an analysis of statistical models.

Authors:  L A Ouimet; A Stewart; B Collins; D Schindler; C Bielajew
Journal:  J Clin Exp Neuropsychol       Date:  2008-06-03       Impact factor: 2.475

10.  Cognitive decline in Parkinson's disease: a prospective longitudinal study.

Authors:  Dino Muslimović; Bart Post; Johannes D Speelman; Rob J De Haan; Ben Schmand
Journal:  J Int Neuropsychol Soc       Date:  2009-05       Impact factor: 2.892

Cited by (1 in total)

1.  Assessing and validating reliable change across ADNI protocols.

Authors:  Dustin B Hammers; Ralitsa Kostadinova; Frederick W Unverzagt; Liana G Apostolova
Journal:  J Clin Exp Neuropsychol       Date:  2022-07-04       Impact factor: 2.283
