S M Rao1,2, R Galioto1, M Sokolowski2, M McGinley1, J Freiburger2, M Weber1, T Dey3, L Mourany1, D Schindler4,5, C Reece2, D M Miller1, F Bethoux1, R A Bermel1, J R Williams6, N Levitt6, G A Phillips7, J K Rhodes8, J Alberts4, R A Rudick6. 1. Mellen Center for Multiple Sclerosis, Neurological Institute, Cleveland Clinic Foundation, Cleveland, OH, USA. 2. Lou Ruvo Center for Brain Health, Neurological Institute, Cleveland Clinic Foundation, Cleveland, OH, USA. 3. Department of Quantitative Health Sciences, Lerner Research Institute, Cleveland Clinic Foundation, Cleveland, OH, USA. 4. Department of Biomedical Engineering, Lerner Research Institute, Cleveland Clinic Foundation, Cleveland, OH, USA. 5. Qr8Health, Boston, MA, USA. 6. Biogen, Cambridge, MA, USA. 7. Akcea Therapeutics, Cambridge, MA, USA. 8. FORMA Therapeutics, Watertown, MA, USA.
Abstract
BACKGROUND AND PURPOSE: To determine the test-retest reliability, practice effects, convergent validity and sensitivity to multiple sclerosis (MS) disability of neuroperformance subtests from the patient self-administered Multiple Sclerosis Performance Test (MSPT), which assess low-contrast vision (Contrast Sensitivity Test, CST), upper extremity motor function (Manual Dexterity Test, MDT) and lower extremity motor function (Walking Speed Test, WST), and to introduce the concept of regression-based norms to aid clinical interpretation of performance scores, using the MSPT cognition test (Processing Speed Test, PST) as an example.

METHODS: Substudy 1 assessed test-retest reliability, practice effects and convergent validity of the CST, MDT and WST in 30 MS patients and 30 healthy controls. Substudy 2 examined sensitivity to MS disability in over 600 MS patients tested as part of their routine clinic assessment. Substudy 3 compared PST performance between a research volunteer sample and a clinical sample.

RESULTS: The CST, MDT and WST were reliable, valid and sensitive to MS outcomes, and performance was comparable to technician-administered testing. PST performance was poorer in the clinical sample than in the research volunteer sample.

CONCLUSIONS: The self-administered MSPT neuroperformance modules produce reliable, objective metrics that can be used in clinical practice and support outcomes research. Published studies that require voluntary patient consent may underestimate the rate of cognitive dysfunction observed in a clinical setting.