OBJECTIVE: Computerized tests have garnered increasing interest for assessing cognitive functioning due to their potential logistical and financial advantages over traditional 'pencil-and-paper' neuropsychological tests. However, psychometric information is necessary to guide decisions about their clinical and research utility with varied populations. We explored the convergent construct validity and criterion validity of the CogState computerized tests in breast cancer survivors, a group known to present with mostly mild, subtle cognitive dysfunction.

METHOD: Fifty-three post-menopausal women (26 breast cancer survivors, 27 healthy controls) completed the CogState Brief Battery tests that passed performance checks, conceptually matched traditional neuropsychological tests, and a self-report measure of daily functioning, the Functional Activities Questionnaire.

RESULTS: Significant positive correlations were found between the CogState Brief Battery tests and traditional neuropsychological tests, although the correlations with the traditional tests specifically hypothesized to match the CogState tests did not reach statistical significance. Analysis of Covariance results showed preliminary support for criterion validity, as the patient and control groups differed on the traditional test of working memory (Digits Backwards, p = .01), with a trend towards significance for the CogState test of working memory (One Back, p = .02), controlling for age, race, and mood.

CONCLUSIONS: The results provide preliminary support for further research to determine whether the CogState tests are viable screening tools for detecting subtle cognitive differences between breast cancer survivors and healthy women. Our study was limited by the low base rate of cognitive impairment and the small sample size. We recommend further research employing sufficiently powered samples and a longitudinal, repeated-measures study design.
Keywords:
CogState; Neurocognitive testing; breast cancer; computerized cognitive testing; validity
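The group comparison reported above is an Analysis of Covariance: a linear model of the test score on a group indicator plus the covariates (age, race, mood), where the test of interest is on the adjusted group coefficient. A minimal numpy-only sketch of that model follows; all data here are simulated and all effect sizes are hypothetical, chosen only to mirror the study's design (26 survivors, 27 controls), not its actual results.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 53
group = np.r_[np.ones(26), np.zeros(27)]   # 1 = survivor, 0 = control (hypothetical coding)
age = rng.normal(60.0, 6.0, n)             # simulated covariates
race = rng.integers(0, 2, n).astype(float) # binary dummy, purely illustrative
mood = rng.normal(10.0, 3.0, n)

# Hypothetical working-memory score: a small group effect plus covariate effects.
score = 50.0 - 2.0 * group - 0.1 * age - 0.5 * race - 0.3 * mood + rng.normal(0.0, 3.0, n)

# ANCOVA as an ordinary least-squares model: intercept, group, covariates.
X = np.column_stack([np.ones(n), group, age, race, mood])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)

# t statistic for the covariate-adjusted group difference (beta[1]).
resid = score - X @ beta
df = n - X.shape[1]
sigma2 = (resid @ resid) / df
cov_beta = sigma2 * np.linalg.inv(X.T @ X)
t_group = beta[1] / np.sqrt(cov_beta[1, 1])
print(f"adjusted group difference = {beta[1]:.2f}, t({df}) = {t_group:.2f}")
```

The adjusted group coefficient `beta[1]` estimates the survivor-control difference at common covariate values, which is what the p-values quoted for Digits Backwards and One Back refer to.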