Lindsay D Nelson1, Adam Y Pfaller2, Lisa E Rein3, Michael A McCrea2. 1. Department of Neurosurgery, Medical College of Wisconsin, Milwaukee, Wisconsin, USA linelson@mcw.edu. 2. Department of Neurosurgery, Medical College of Wisconsin, Milwaukee, Wisconsin, USA. 3. Department of Biostatistics, Medical College of Wisconsin, Milwaukee, Wisconsin, USA.
Abstract
BACKGROUND: Preseason baseline testing using computerized neurocognitive tests (CNTs) is increasingly performed on athletes. Adequate effort is critical to establish valid estimates of ability, but many users do not evaluate performance validity, and the conditions that affect validity are not well understood across the available CNTs. PURPOSE: To examine the rates and predictors of invalid baseline performance for 3 popular CNTs: Automated Neuropsychological Assessment Metrics (ANAM), Axon Sports, and Immediate Post-Concussion and Cognitive Testing (ImPACT). STUDY DESIGN: Controlled laboratory study. METHODS: High school and collegiate athletes (N = 2063) completed 2 of 3 CNTs each during preseason evaluations. All possible pairings were present across the sample, and the order of administration was randomized. Examiners provided 1-on-1, scripted pretest instructions, emphasizing the importance of good effort. Profile validity was determined by the manufacturers' standard criteria. RESULTS: The overall percentage of tests flagged as of questionable validity was lowest for ImPACT (2.7%) and higher for ANAM and Axon (10.7% and 11.3%, respectively). The majority of invalid baseline profiles were flagged as such because of failure on only 1 validity criterion. Several athlete and testing factors (eg, attention deficit hyperactivity disorder [ADHD], estimated general intellectual ability, administration order) predicted validity status for 1 or more CNTs. Considering only first CNT administrations and participants without ADHD and/or a learning disability (n = 1835) brought the rates of invalid baseline performances to 2.1%, 8.8%, and 7.0% for ImPACT, ANAM, and Axon, respectively. Invalid profiles on the Medical Symptom Validity Test (MSVT) were rare (1.8% of participants) and demonstrated poor correspondence to CNT validity outcomes. CONCLUSION: The validity criteria for these CNTs may not identify the same causes of invalidity or be equally sensitive to effort. 
The validity indicators may not be equally appropriate for some athletes (eg, those with neurodevelopmental disorders). CLINICAL RELEVANCE: The data suggest that athletes do not put forth widespread low effort or that some validity criteria are more sensitive to invalid performance than others. It is important for examiners to be aware of the conditions that maximize the quality of baseline assessments and to understand what sources of invalid performance are captured by the validity criteria that they obtain.