Heleen E M Feenstra¹, Jaap M J Murre², Ivar E Vermeulen³, Jacobien M Kieffer¹, Sanne B Schagen¹﹐². 1. Division of Psychosocial Research and Epidemiology, The Netherlands Cancer Institute, Amsterdam, The Netherlands. 2. Department of Psychology, University of Amsterdam, Amsterdam, The Netherlands. 3. Department of Communication Science, VU University Amsterdam, Amsterdam, The Netherlands.
Abstract
INTRODUCTION: To facilitate large-scale assessment of a variety of cognitive abilities in clinical studies, we developed a self-administered online neuropsychological test battery: the Amsterdam Cognition Scan (ACS). The current studies evaluate, in a group of adult cancer patients, the test-retest reliability of the ACS, the influence of test setting (home or hospital), and the relationship between our online battery and a traditional test battery (concurrent validity). METHOD: Test-retest reliability was studied in 96 cancer patients (57 female; mean age = 51.8 years) who completed the ACS twice. Intraclass correlation coefficients (ICCs) were used to assess consistency over time. The test setting was counterbalanced between home and hospital; its influence on test performance was assessed by repeated measures analyses of variance. Concurrent validity was studied in 201 cancer patients (112 female; mean age = 53.5 years) who completed both the online battery and an equivalent traditional neuropsychological test battery. Spearman or Pearson correlations were used to assess consistency between online and traditional tests. RESULTS: ICCs of the online tests ranged from .29 to .76, with an ICC of .78 for the ACS total score. These correlations are generally comparable with the test-retest correlations of the traditional tests as reported in the literature. Correlating online and traditional test scores, we observed medium to large concurrent validity (r/ρ = .42 to .70; total score r = .78), except for a visuospatial memory test (ρ = .36). Correlations were affected, as expected, by design differences between online tests and their offline counterparts. CONCLUSIONS: Although development and optimization of the ACS is an ongoing process, and reliability can be improved for several tests, our results indicate that it is a highly usable tool to obtain online measures of various cognitive abilities. The ACS is expected to facilitate efficient gathering of data on cognitive functioning in the near future.