
The MSOAC approach to developing performance outcomes to measure and monitor multiple sclerosis disability.

Nicholas G LaRocca1, Lynn D Hudson2, Richard Rudick3, Dagmar Amtmann4, Laura Balcer5, Ralph Benedict6, Robert Bermel7, Ih Chang3, Nancy D Chiaravalloti8, Peter Chin9, Jeffrey A Cohen7, Gary R Cutter10, Mat D Davis11, John DeLuca8, Peter Feys12, Gordon Francis13, Myla D Goldman14, Emily Hartley2, Raj Kapoor15, Fred Lublin16, Gary Lundstrom2, Paul M Matthews17, Nancy Mayo18, Richard Meibach13, Deborah M Miller7, Robert W Motl19, Ellen M Mowry20, Rob Naismith21, Jon Neville2, Jennifer Panagoulias22, Michael Panzara22, Glenn Phillips3, Ann Robbins2, Matthew F Sidovar23, Kathryn E Smith1, Bjorn Sperling3, Bernard Mj Uitdehaag24, Jerry Weaver13.   

Abstract

BACKGROUND: The Multiple Sclerosis Outcome Assessments Consortium (MSOAC) was formed by the National MS Society to develop improved measures of multiple sclerosis (MS)-related disability.
OBJECTIVES: (1) To assess the current literature and available data on functional performance outcome measures (PerfOs) and (2) to determine suitability of using PerfOs to quantify MS disability in MS clinical trials.
METHODS: (1) Identify disability dimensions common in MS; (2) conduct a comprehensive literature review of measures for those dimensions; (3) develop an MS Clinical Data Interchange Standards Consortium (CDISC) data standard; (4) create a database of standardized, pooled clinical trial data; (5) analyze the pooled data to assess psychometric properties of candidate measures; and (6) work with regulatory agencies to use the measures as primary or secondary outcomes in MS clinical trials.
CONCLUSION: Considerable data exist supporting measures of the functional domains ambulation, manual dexterity, vision, and cognition. A CDISC standard for MS ( http://www.cdisc.org/therapeutic#MS ) was published, allowing pooling of clinical trial data. MSOAC member organizations contributed clinical data from 16 trials, including 14,370 subjects. Data from placebo-arm subjects are available to qualified researchers. This integrated, standardized dataset is being analyzed to support qualification of disability endpoints by regulatory agencies.

Keywords:  MS disability; clinical trial database; data standards; performance outcome measures; regulatory qualification

Year:  2017        PMID: 28799444      PMCID: PMC6174619          DOI: 10.1177/1352458517723718

Source DB:  PubMed          Journal:  Mult Scler        ISSN: 1352-4585            Impact factor:   6.312


Introduction

The need for better measures of MS disability has been recognized for decades. In 1993, the National Multiple Sclerosis Society (NMSS) convened an international workshop on the topic.[1] One result was a task force charged with recommending outcome assessment methods that might improve on the Kurtzke[2] Expanded Disability Status Scale (EDSS). The task force recommended quantitative neurological performance testing rather than clinical rating scales such as the EDSS, largely because performance outcome measures (PerfOs) have superior psychometric properties. The task force recommended the Timed 25-Foot Walk (T25FW) as a measure of walking speed, the 9-Hole Peg Test (9HPT) as an upper extremity dexterity measure, and the Paced Auditory Serial Addition Test (PASAT; 3-second version) as a measure of cognitive processing speed.[3] It also urged the academic community to develop a test of visual function, because high contrast letter acuity was not sensitive to change. The task force recommended that these three PerfOs, together called the Multiple Sclerosis Functional Composite (MSFC), be included in future trials. What followed was inclusion of the MSFC in most prospectively designed clinical trials conducted by industry and academia, and development of Low Contrast Letter Acuity (LCLA) as a more sensitive measure of MS-related visual impairment.[4] Many placebo-controlled clinical trials demonstrated treatment effects on the MSFC score.
However, complexities related to the reference population used to create standardized scores, and the difficulty of assigning clinical meaningfulness to z-score changes, limited use of the MSFC as a primary outcome measure for registration trials.[5,6] In view of the perceived limitations of the MSFC approach, and in recognition of the continuing need for better clinical measures of MS-related disability, the Multiple Sclerosis Outcome Assessments Consortium (MSOAC) was established in 2012 to accelerate the development of therapies for MS.[7] MSOAC established the concept of interest (COI) for meaningful treatment benefit as “MS disability,” or simply “disability”: neurological or neuropsychological impairments, caused by MS, that result in limitations in activities and restrictions in participation or life roles, and that are understood to be important by the person with MS. Frequent interactions with the European Medicines Agency (EMA) and the U.S. Food and Drug Administration (FDA) shaped the consortium’s research plan and guided efforts to select PerfOs (https://www.fda.gov/downloads/drugs/guidances/ucm230597.pdf) and to determine the suitability of specific PerfOs for quantifying MS disability in MS clinical trials. The context of use (COU) for the selected PerfOs was use as primary or secondary endpoints in clinical trials of treatments intended to slow or stop the worsening of disability in MS. MSOAC first defined a conceptual framework for disability measures in MS, drawing on the International Classification of Functioning, Disability and Health (ICF) core sets for MS.[8] Early on, MSOAC members highlighted the need for a visual measure to include as part of a multi-dimensional outcome measure and expressed a preference for the Symbol Digit Modalities Test (SDMT) over the PASAT as a measure of processing speed, because of accumulating experience with both tests.
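As background for the z-score issue, the published MSFC scoring approach standardizes each component against a reference population and averages the three z-scores. The sketch below illustrates that arithmetic only; the reference-population statistics and raw scores are invented, and the choice of reference population is precisely the complexity noted above.

```python
# Illustrative sketch of MSFC composite z-scoring. All reference statistics
# and raw scores below are invented for illustration.
from statistics import mean

def z(value, ref_mean, ref_sd):
    """Standardize a raw score against a reference population."""
    return (value - ref_mean) / ref_sd

def msfc_zscore(t25fw_s, hpt9_s_dom, hpt9_s_nondom, pasat_correct, ref):
    """Composite = mean of leg, arm, and cognitive z-scores.

    Signs are arranged so that higher z = better function: the T25FW uses
    the negative z of walking time, and the 9HPT uses the z of the averaged
    reciprocal of the two hands' times.
    """
    z_leg = -z(t25fw_s, ref["t25fw_mean"], ref["t25fw_sd"])
    inv_9hpt = mean([1.0 / hpt9_s_dom, 1.0 / hpt9_s_nondom])
    z_arm = z(inv_9hpt, ref["inv9hpt_mean"], ref["inv9hpt_sd"])
    z_cog = z(pasat_correct, ref["pasat_mean"], ref["pasat_sd"])
    return mean([z_leg, z_arm, z_cog])

# Hypothetical reference-population statistics.
ref = {"t25fw_mean": 9.5, "t25fw_sd": 11.4,
       "inv9hpt_mean": 0.0439, "inv9hpt_sd": 0.0101,
       "pasat_mean": 45.0, "pasat_sd": 12.1}

score = msfc_zscore(t25fw_s=6.2, hpt9_s_dom=21.0, hpt9_s_nondom=24.5,
                    pasat_correct=52, ref=ref)  # ≈ 0.30 with these inputs
```

The same raw scores yield a different composite under a different `ref`, which is why interpreting a given z-score change clinically proved difficult.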
MSOAC members also agreed to focus on dimensions of MS that lend themselves to simple, objective, and reliable measurement, rather than on crucial dimensions of MS (e.g. pain, fatigue) that are inherently patient self-reported. A systematic literature review was conducted to assess published evidence on measures of walking speed, manual dexterity, vision, and information processing speed. This literature supports the ability of performance measures in these domains to capture how people with MS feel and function.[9-12] Key to the MSOAC goal is analysis of prospectively acquired data from multiple clinical trials. This paper details the methods used to establish the MSOAC database and the Statistical Analysis Plan (SAP) that is presently being applied to assess the clinical meaningfulness of different performance measures. Future papers will report the results of these analyses.

Methods and initial results

Establishing a consortium

MSOAC is organized and managed by the Critical Path Institute (C-Path; https://c-path.org/programs/msoac/). With input from NMSS, C-Path established the membership agreements and engaged a wide spectrum of stakeholders, including persons with MS, advocacy organizations, clinical researchers, industry sponsors, regulators, and other governmental agencies, all working together with standard development organizations, contract research organizations, and data managers (Supplementary Table 1). C-Path supplied expertise for development of therapeutic area data standards and the remapping of legacy data to the Clinical Data Interchange Standards Consortium (CDISC) data standard accepted by the FDA and used by C-Path for analytic purposes. C-Path staff also provided regulatory expertise to guide each step through the FDA’s[13] Drug Development Tool and EMA’s Novel Methodologies qualification processes for PerfO qualification (Table 1).
Table 1.

Glossary.

Acronym | Term
9HPT | 9-Hole Peg Test—a brief, standardized, quantitative test of upper extremity function
ADaM | Analysis Data Model—the CDISC model that defines standards for analysis datasets
BDI | Beck Depression Inventory—a 21-question multiple-choice self-report inventory to measure the severity of depression
CDASH | Clinical Data Acquisition Standards Harmonization—describes the recommended data collection fields for 16 domains, including demographics, adverse events, and other domains common to most therapeutic areas and clinical research phases (https://www.cdisc.org/standards/foundational/cdash)
CDE | Common Data Element—a data element that is common to multiple datasets across different studies: https://www.commondataelements.ninds.nih.gov/MS.aspx#tab=Data_Standards
CDISC | Clinical Data Interchange Standards Consortium—a non-profit Standards Development Organization (SDO): https://www.cdisc.org
CFAST | Coalition For Accelerating Standards and Therapies—formed by CDISC and C-Path to focus on therapeutic area data standards and analysis standards: https://www.cdisc.org/partnerships/cfast
ClinRO | Clinician-reported outcome measure—a measurement based on a report that comes from a trained healthcare professional after observation of a patient’s health condition. Most ClinRO measures involve a clinical judgment or interpretation of the observable signs, behaviors, or other manifestations related to a disease or condition.[14]
COA | Clinical Outcome Assessment—assessment of a clinical outcome can be made through report by a clinician, a patient, a non-clinician observer, or through a performance-based assessment. There are four types of COAs: clinician-reported outcome, observer-reported outcome, patient-reported outcome, and performance outcome.[14]
COI | Concept(s) of Interest for meaningful treatment benefit—a description of the meaningful aspect of patient experience that will represent the intended benefit of treatment (e.g. presence/severity of symptoms, limitations in performance of daily activities).[14]
COU | Context of Use—a statement that fully and clearly describes the way the medical product development tool is to be used and the medical product development-related purpose of the use.[14]
EDSS | Expanded Disability Status Scale—a clinician-reported outcome measure of disability in MS
FSS | Functional Systems Scores—a clinician-reported measure of pyramidal, cerebellar, brainstem, sensory, bowel and bladder, visual, and cerebral (or mental) activity
LCLA | Low Contrast Letter Acuity
MIC | Minimal Important Change—the smallest change in score in the domain of interest which patients perceive as important.[15]
MID | Minimal Important Difference—the difference observed between groups that are known to differ on the construct of interest in an important way.[15]
MS | Multiple Sclerosis
MSFC | Multiple Sclerosis Functional Composite—a three-part, standardized, quantitative assessment instrument for assessing mobility, dexterity, and cognition in clinical studies of MS
PASAT | Paced Auditory Serial Addition Test—a test used to assess capacity and rate of information processing and sustained and divided attention
PPMS | Primary Progressive Multiple Sclerosis
PerfO | Performance Outcome Measure—a measurement based on a task(s) performed by a patient according to instructions that is administered by a healthcare professional.[14]
PRO | Patient-Reported Outcome Measure—a measurement based on a report that comes directly from the patient (i.e. study subject) about the status of a patient’s health condition, without amendment or interpretation of the patient’s response by a clinician or anyone else.[14]
Qualification | A conclusion, based on a formal regulatory process, that within the stated context of use, a medical product development tool can be relied upon to have a specific interpretation and application in medical product development and regulatory review.[14]
RRMS | Relapsing Remitting Multiple Sclerosis
SAP | Statistical Analysis Plan
SDMT | Symbol Digit Modalities Test
SDTM | Study Data Tabulation Model—provides a standardized, predefined collection of domains for clinical data submissions
SF-36 | Short Form (36) Health Survey—a 36-item patient-reported survey of patient health
SPMS | Secondary Progressive Multiple Sclerosis
T25FW | Timed 25-Foot Walk—a quantitative mobility and leg function performance test based on a timed 25-foot walk
TAPSC | Therapeutic Area standards Program Steering Committee—an operations group in CFAST focused on therapeutic area data standards
WHO ICF | World Health Organization International Classification of Functioning, Disability and Health
In addition to contributing data, many MSOAC members participated in a Coordinating Committee, which served as the governing body. Working groups were established to focus on (1) Defining Disability, (2) Data Standards and Integration, (3) Clinical Outcome Assessments, (4) Regulatory, (5) Literature Review, (6) Statistics, and (7) Voice of the Patient (VOP).

Selecting domains of function from the ICF core sets for MS

In a series of in-person meetings and teleconferences, the Defining Disability Workgroup examined ICF domains for MS.[8] Inclusion and exclusion criteria (Table 2) were developed and applied to the ICF domains and to the associated measures of those domains. The Workgroup used the specified COU to provide a contextual anchor for the selection process. An important component of this process was the Workgroup’s mapping of the ICF domains to activities of daily living that are limited by MS. Several rounds of reviews were needed to reduce the candidate domains to a smaller set of finalist domains. The Workgroup then utilized a numerical rating system to arrive at a consensus concerning the most appropriate domains to be considered. This final set was then discussed with the Coordinating Committee, which endorsed the recommendations. The Workgroup then proceeded to identify the most appropriate performance measures to assess each of the domains.
Table 2.

Inclusion and exclusion criteria for selecting ICF domains.

Inclusion criteria:
- A domain must represent something related to MS that affects a significant proportion of people with MS.
- A domain must be something that can be measured objectively and that does not rely entirely on patient-reported symptoms.
- A domain must be something that can be measured easily, with minimal equipment, and in a reasonable amount of time.
- A domain must be something that affects a real-life function that is meaningful to a person with MS.
- A domain should preferably be one for which accessible data exist from MS clinical trials.

Exclusion criteria:
- The domain is not thought to relate to activities, functions, or roles that are important to people with MS in their everyday lives.
- The domain is not commonly affected in people with MS (e.g. hearing).
- The domain does not change over time or vary depending on MS severity.
- The domain cannot be objectively assessed (e.g. fatigue or pain).
- The function related to the domain cannot be quantified or cannot be measured using practical test procedures (e.g. sexual function).
Domains selected from the core and comprehensive ICF domains are shown in Table 3. Domains that did not represent common MS symptoms were eliminated, with the understanding that such domains may still be affected in a limited number of MS patients. Because the workgroup focused only on objectively measurable domains, domains that could be assessed only by patient report were eliminated, with the acknowledgment that certain of these domains (e.g. fatigue, depression) represent significant issues in MS. Given the COU, that is, large clinical trials, certain domains (e.g. gait pattern functions) were considered of value but too complex to incorporate in such studies. Both memory and speed of information processing were considered for inclusion as measures of cognition. Evidence from the literature indicated that speed of information processing is involved in memory and has a stronger relationship to real-life activities such as employment; speed of information processing was therefore selected as the most useful cognitive domain. The final domains selected reflect a core set of real-life functions meaningful to MS patients for which data exist in clinical trial datasets and in the scientific literature.
Table 3.

Activities of daily living limited by disability in MS mapped to ICF domains.

Activity of daily living limited by disability in MS | Bodily function involved | ICF domain[a] | Comments | Possible neuro performance measures
1. Remembering to take medications[8,16] | Cognition: learning, retention, and recall of information | b144 | One of the most frequently impaired cognitive functions in MS patients, but complex and time-consuming to measure. | 1. California Verbal Learning Test; 2. Brief Visuospatial Reminder Test; 3. 7/24 Spatial Recall Test
2. Keeping up with conversations[8,17-19] | Cognition: speed and accuracy of processing information | b1600, b164 | Very practical to measure, with a good deal of literature to support it. In addition, it is a function that MS patients complain about and that is related to activities and participation. | 1. Symbol Digit Modalities Test; 2. Paced Auditory Serial Addition Test
3. Seeing someone crossing the street[8,20,21] | Vision: recognizing people and objects | b120 | Basic to many daily activities, and practical, sensitive tests are available. | Low Contrast Letter Acuity
4. Reading a newspaper[8] | Vision: reading | b120 | Basic to many daily activities, and practical, sensitive tests are available. | Low Contrast Letter Acuity
5. Walking quickly to be on time for an appointment[8,22-24] | Ambulation: walking at different speeds | d450, b730 | Frequently affected in MS and easily measured in varied clinical settings. | Timed 25-Foot Walk
6. Using a knife and fork, writing, and using a computer keyboard[8,25-27] | Coordination: fine hand use | d440, d445, b760 | Often affected in MS and can interfere with a wide variety of important daily functions. | 9-Hole Peg Test

References in column 1 document the importance of the bodily function to people living with MS.

ICF Brief Core and Comprehensive Domains: b120: seeing functions; b760: voluntary movement functions; b144: memory; d440: fine hand use; b164: higher level cognitive functions; d445: hand and arm use; b730: muscle power functions; d450: walking; b1600: pace of thought.


Literature review and extraction methods

A related activity to further define the COI of disability in MS focused on the four domains selected by the Defining Disability Workgroup: ambulation, arm dexterity, vision, and cognition. Research questions were developed (Table 4) that could be addressed through an extensive literature review. Search parameters were designed to identify articles on performance measures relevant to domains of interest. In addition to the T25FW, 9HPT, LCLA, and SDMT, alternate measures used in the four domains were included in the literature search as well as articles that would combine domains in a disability assessment.
Table 4.

Literature review research questions.

1. What are the most common symptoms or impairments caused by MS?
2. Which MS symptoms or impairments are the most challenging for people with MS?
3. Which symptoms or impairments are most likely to be altered by treatment and/or are predictive of future worsening?
4. What daily activities are compromised by each of the symptoms or impairments of MS?
5. What validated measures exist to evaluate MS symptoms or impairments?
6. What are the psychometric properties of these measures, including dimensionality, reliability (reproducibility, internal consistency, inter-rater agreement, etc.), validity, objectivity, sensitivity to differences and change, predictive validity, and clinical meaningfulness, among others?
7. How feasible (cost, complexity, timeliness, etc.) are these measures for use in large clinical trials?
8. How have these measures performed to date in the context of clinical trials?
9. How adequate is the published evidence supporting the utilization of these measures, based on standard criteria for level of evidence?
10. What is the evidence concerning what constitutes the size of a change or difference in each measure that is both perceptible to a person living with MS and that constitutes an important difference in day-to-day function?
The literature review was performed in three levels (Figure 1). Parameters and search terms were defined (Supplementary Table 2) in Level 1, and abstract filtering criteria (Supplementary Table 3) were applied during the Level 2 review. In reviewing the initial search, the Literature Review Workgroup identified a number of key papers that had been missed because keywords and abstracts did not always include the performance measure search terms. Alternative search criteria increased the number of abstracts identified to approximately 9000. Broadening the search criteria captured the missing articles but also identified many articles unrelated to the scope of the project. The Literature Review Workgroup therefore used an enrichment technique that allowed the addition of papers recommended by subject matter experts (SMEs) as definitely belonging in the review. This combined “enriched search” approach identified approximately 3000 papers.
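Operationally, the enriched search amounts to a deduplicated union of the broadened database search with the SME-recommended list, keyed on a paper identifier such as the PMID. A minimal sketch, with made-up PMIDs:

```python
# Minimal sketch of the "enriched search": merge database hits with
# SME-recommended papers, deduplicating on PMID. All PMIDs are made up.
def enrich(search_hits, sme_recommended):
    """Union of search results and expert picks, keyed on PMID.

    Papers found by both routes keep both source labels, so reviewers
    can see how each paper entered the corpus.
    """
    merged = {pmid: {"sources": {"search"}} for pmid in search_hits}
    for pmid in sme_recommended:
        merged.setdefault(pmid, {"sources": set()})["sources"].add("sme")
    return merged

corpus = enrich({"11111111", "22222222"}, {"22222222", "33333333"})
```

Here `corpus` contains three unique papers, with `"22222222"` labeled as coming from both the search and the SME list.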
Figure 1.

Overview of literature review results.

The literature review was performed in three consecutive levels. For Level 2, the workgroup applied a structured algorithm to review abstracts and categorize each article into “Included” or “Excluded” (see Abstract Filtering Criteria in Supplementary Table 3), with reasons for exclusion. Abstracts that included a measure of cognition received a code of “keep” or “exclude” only. Abstracts that included measures of vision, manual dexterity, or ambulation received a code of “high,” “medium,” and “low” importance or “exclude.” All articles meeting the inclusion criteria were entered into the Data Extraction Table (Supplementary Table 4) and ranked to identify articles of most relevance to study objectives. Additional exclusions were also identified at this stage. For those articles that met the inclusion criteria, subject matter experts (SMEs) carried out the initial analysis and a separate group reviewed the analysis. A meeting was held to adjudicate any differences in the determinations of the SMEs. The Data Extraction Table (Supplementary Table 4) contains information on a total of 564 reviewed publications.

Based on the results of the literature review, the SDMT was selected as the measure of choice[9,28,29] for processing speed. The workgroup considered potential measures of vision and decided on LCLA, utilizing 1.25% and 2.5% contrast Sloan charts, based on its strong performance in recent clinical trials.[11] Walking was considered essential for inclusion, and the workgroup decided on walking speed as the most appropriate measure, based in large part on the extensive use of the T25FW in clinical trials as part of the MSFC.[12,30] Finally, the workgroup endorsed the inclusion of a measure of manual dexterity to assess upper extremity function, including coordination.
The 9HPT, also part of the MSFC, was endorsed as the most appropriate measure in this domain based on its successful use in numerous clinical trials.[10,30] Articles analyzed by the Literature Review Workgroup (see Data Extraction Table, Supplementary Table 4) were drawn on for recently published review articles summarizing the utility and validity of each recommended measure.[9-12] Authors of the reviews determined which of the identified articles should be included. Using the vision domain as an example, the search parameters were not specifically designed to assess vision in all its aspects in MS, nor LCLA as used in non-MS settings. Accordingly, background and technical references (e.g. information on optical coherence tomography (OCT) and visual evoked potentials (VEP)) that support the use of LCLA in MS, but that were not part of the formal literature search, were included in the vision publication. A similar approach was used for the published reviews of the other domains.

Developing a CDISC therapeutic area data standard for MS

To allow aggregation of data from clinical trials, a common data standard for MS had to be developed and data from each trial remapped to that standard. The process for creating the first MS data standard was instituted through the Coalition for Accelerating Standards and Therapies (CFAST), an initiative formed by CDISC and C-Path to create and maintain data standards in therapeutic areas important to public health. In general, the process mirrors that of other Standards Development Organizations (SDOs), including the International Organization for Standardization (ISO), Health Level 7 (HL7), and Integrating the Healthcare Enterprise (IHE). In brief, the sequential steps include scoping/charter, modeling and producing a draft standard, initial review and comment disposition, final public review and comment, disposition, and publication. The comment disposition ensures that those who contribute to the development process know how the comments were resolved to produce the resulting consensus-based standard. C-Path submitted the scoping proposal for approval to the Therapeutic Area Program Steering Committee (TAPSC) that is organized by CFAST. The scoping proposal included a brief description of the project, including background information and proposed deliverables. Following approval of the scoping proposal, a detailed project proposal was submitted to TAPSC. The charter contained detailed information on the proposed standard, including focus populations, proposed team members/roles and other resources, stakeholder engagement considerations, concepts in scope, and a gap analysis of these concepts versus existing CDISC standards. The TAPSC reviewed and approved the charter. The Data Standards and Integration Workgroup developed the concept model and drafted the data standard, which was subsequently subjected to two rounds of review, including a public comment process. 
Revisions were incorporated, and a separate group of data standard experts carried out the final review and approval. The Workgroup drew on the information content of the “common data elements (CDEs)” for MS that were developed through National Institute of Neurological Disorders and Stroke (NINDS)-supported efforts to identify those biomedical concepts that would form the MS CDISC data standard. Though CDEs guide researchers with recommendations on what should be captured and ensure consistent definitions of the captured content, they do not stand alone as data standards. A complete data standard also specifies how the collected data are represented in a database. Data standard specifications must also account for the often complex relationships between individual data elements to ensure that reviewers can construct accurate analyses involving multiple data elements which may exist in more than one table in the database. Some of the retained CDEs were also further refined into comprehensive concepts—represented visually as concept maps—that described their origins in study-related processes and their interrelationships to other data elements. One such concept map (“relapse”) is presented in Figure 2. Standard development workgroup discussions revealed that multiple pathways can lead to the conclusion that relapse has occurred in patients with MS, including variations in relapse criteria and criteria for determining severity. These criteria are typically, but not always, anchored on changes in EDSS score. The resulting data standard accommodates this variation and specifies where in the CDISC data model (SDTM) this information can be found (represented by yellow boxes). In the CDISC Therapeutic Area User Guide (TAUG) for MS v1.0, this concept map is followed by more explicit mock data examples showing how these data are represented and how they are linked to each other in a relational database (http://www.cdisc.org/therapeutic#MS; Figure 3).
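As a rough illustration of how a relapse might surface in the Clinical Events (CE) domain described above, the mock rows below follow common SDTM variable naming (STUDYID, USUBJID, CETERM, and so on). The study identifier, subject identifier, dates, and severity value are invented, and the authoritative variable set is the one in the published TAUG for MS, not this sketch.

```python
# Mock SDTM Clinical Events (CE) rows for an MS relapse. Identifiers, dates,
# and severity values are invented; consult the CDISC TAUG for MS for the
# authoritative representation.
ce_rows = [
    {
        "STUDYID": "MS-TRIAL-01",           # hypothetical study identifier
        "DOMAIN":  "CE",                    # Clinical Events domain
        "USUBJID": "MS-TRIAL-01-0042",      # unique subject identifier
        "CESEQ":   1,                       # sequence number within subject
        "CETERM":  "MULTIPLE SCLEROSIS RELAPSE",
        "CESTDTC": "2015-03-02",            # ISO 8601 start date
        "CEENDTC": "2015-03-20",            # ISO 8601 end date
        "CESEV":   "MODERATE",              # severity, here anchored on EDSS change
    },
]

# Because every trial maps relapses to the same domain and variables,
# a reviewer can locate a subject's relapses with one uniform query:
relapses = [r for r in ce_rows
            if r["DOMAIN"] == "CE" and "RELAPSE" in r["CETERM"]]
```

The point of the standard is that this query works identically across all 16 contributed trials, despite their differing original relapse criteria.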
Figure 2.

A concept map representing relapse in MS.

Multiple pathways can lead to the conclusion that relapse has occurred in patients with MS, including variations in relapse criteria and criteria for determination of severity. These criteria are typically anchored on changes in EDSS score. The resulting data standard accommodates this variation and specifies where in the CDISC data model (SDTM) this information can be found (represented by yellow boxes). CE is the Clinical Events domain; TM is the Trial Disease Milestones domain; TS is the Trial Summary domain. Symbols correspond to the Bioinformatics Research Information Domain Grid (BRIDG) model concept classes (not discussed). Stars in orange represent the “Observation Result” class; circles in blue represent the “Assessor” class; and pentagons in yellow represent SDTM domains.

Figure 3.

Process used for development of CDISC data standards for MS, v1.0.

A working group of MS subject matter experts and CDISC data standards experts reviewed the NINDS Common Data Elements for MS (https://www.commondataelements.ninds.nih.gov/MS.aspx#tab=Data_Standards). Concepts extracted from these CDEs were either retained or eliminated based on their relevance and whether or not they were already represented in CDISC SDTM standards. The concepts retained formed the scope for v1.0 of the MS CDISC User Guide, and those were further developed by CDISC data modelers in consultation with clinical SMEs. Development work included concept maps, data modeling examples, and controlled terminology. Controlled terminology was developed in collaboration with the NCI Enterprise Vocabulary Service (EVS). Examples of controlled terminology developed for MS include more than 500 terms registered in support of coding the various items and scores for the clinical outcome assessments and functional tests developed as a part of this project. The draft user guide was assembled as the output of this working group.

During the development of the MS data standard, it was recognized that the organization of data within SDTM would benefit from the creation of two additional SDTM domains: (1) Functional Tests (FT), which includes performance measures such as the T25FW and 9HPT, and (2) Ophthalmic Examinations (OE), which includes the LCLA findings. The SDTM domains used for the MSOAC database are shown in Table 5.
Table 5.

Study data tabulation model (SDTM) domains used for the MSOAC database.

| SDTM domain | Abbreviation | Observation class | Contents |
| --- | --- | --- | --- |
| Clinical Events | CE | Events | MS symptoms, relapse events, and other events |
| Concomitant/Prior Medications | CM | Interventions | Betaseron, dexamethasone, glatiramer acetate, interferon, methylprednisolone, prednisolone, prednisone, etc. |
| Demographics | DM | Special purpose | Age, gender, race, trial arm, country |
| Disposition | DS | Events | Informed consent, randomization, reason for early withdrawal |
| Findings About Clinical Events | FACE | Findings sub-class | Number of relapses, relapses requiring hospitalization or steroids, results of relapse diagnosis tests, etc. |
| Findings About Medical History | FAMH | Findings sub-class | Number of relapses 1, 2, or 3 years before study start or since MS diagnosis; experienced acute relapse |
| Functional Tests | FT | Findings | T25FW, 9HPT, PASAT, SDMT |
| Medical History | MH | Events | MS diagnosis and pre-study symptoms, general medical history |
| Ophthalmic Examinations | OE | Findings | Visual acuity (low and high contrast) |
| Physical Examination | PE | Findings | General physical exam |
| Questionnaires | QS | Findings | BDI-FS, BDI-II, EDSS, FS scores, MSNQ, Neurological Change Questionnaire, RAND-36, SF-36, SF-12 |
| Reproductive System Findings | RP | Findings | Pregnancy test |
| Subject Characteristics | SC | Findings | Dominant hand |
| Subject Disease Milestones | SM | Special purpose | MS relapse events |
| Trial Disease Milestones | TM | Trial design | Definitions of MS relapse |

The Functional Tests (FT) and Ophthalmic Examinations (OE) domains were developed as a result of the therapeutic area data standard for MS. PASAT and SDMT are cognitive function tests that are included in the FT domain.
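As a rough sketch of what a record in the new Functional Tests (FT) domain could look like, the following uses common SDTM Findings-class variable naming (--TESTCD, --TEST, --ORRES). The study identifiers and the exact variable set are illustrative assumptions; the published CDISC user guide for MS is authoritative.

```python
# Illustrative FT-domain record for a Timed 25-Foot Walk result.
# All identifier values are hypothetical.
ft_record = {
    "STUDYID": "STUDY01",            # hypothetical study identifier
    "DOMAIN": "FT",                  # Functional Tests domain
    "USUBJID": "STUDY01-0001",       # unique subject identifier
    "FTTESTCD": "T25FW",             # short test code
    "FTTEST": "Timed 25-Foot Walk",  # test name
    "FTORRES": "5.8",                # result as originally collected
    "FTORRESU": "s",                 # unit of the original result
    "FTDY": 1,                       # study day of the assessment
}
print(ft_record["FTTESTCD"], ft_record["FTORRES"], ft_record["FTORRESU"])
```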


Acquiring, standardizing, and pooling data from MS clinical trials

To facilitate sharing of clinical trial data, C-Path developed two legal agreements that govern MSOAC membership and data contributions. Following execution of the legal agreements, MSOAC acquired 16 datasets from consortium industry and academic members (Table 6) and remapped the data to the new CDISC data standard for MS (Figure 4). The standardized data consisting of control and treatment arms of clinical trials formed the MSOAC database. The database includes information on a range of performance measures from 14,370 study subjects. Baseline descriptive statistics for age, sex, race, treatment arms, and disease severity as assessed by EDSS are shown in Figure 5.
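The remap-then-pool step can be sketched as follows; the helper function, study names, and record layout are illustrative assumptions, not MSOAC code:

```python
def pool(studies: dict) -> list:
    """Stack per-study record lists (already in a common SDTM-like layout)
    into one aggregate, tagging each record with its source study."""
    pooled = []
    for study_id, records in studies.items():
        for rec in records:
            pooled.append({**rec, "STUDYID": study_id})
    return pooled

# Two hypothetical standardized studies, each a list of records.
studies = {
    "STUDY_A": [{"USUBJID": "A-001", "FTTESTCD": "T25FW", "FTORRES": 5.8}],
    "STUDY_B": [{"USUBJID": "B-001", "FTTESTCD": "T25FW", "FTORRES": 7.2}],
}
pooled = pool(studies)
print(len(pooled))  # 2
```

Standardizing first is what makes this step trivial: once every dataset uses the same variable names and terminology, pooling is simple concatenation.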
Table 6.

Source datasets in the MSOAC database.

| CT.gov no.[a] | Description | N | MS type |
| --- | --- | --- | --- |
| NCT00027300 | AFFIRM | 939 | RRMS |
| NCT00030966 | SENTINEL | 1196 | RRMS |
| NCT00127530 | MS-F203 | 301 | ALL |
| NCT00134563 | TEMSO | 1086 | RRMS |
| NCT00211887 | COMBIRX | 1008 | RRMS |
| NCT00289978 | FREEDOMS | 1272 | RRMS |
| NCT00297232 | STRATA | 1094 | RRMS |
| NCT00340834 | TRANSFORMS | 1292 | RRMS |
| NCT00355134 | FREEDOMS II | 1083 | RRMS |
| NCT00483652 | MS-F204 | 239 | ALL |
| NCT00530348 | CARE-MS 1 | 563 | RRMS |
| NCT00548405 | CARE-MS 2 | 798 | RRMS |
| NCT00869726 | MAESTRO | 610 | SPMS |
| NCT00906399 | ADVANCE | 1512 | RRMS |
| N/A[b] | PROMISE | 943 | PPMS |
| N/A[b] | IMPACT | 434 | SPMS |

[The original table additionally marks, per study, whether EDSS, FSS, T25FW, 9HPT, PASAT, SDMT, LCLA, SF-36, and BDI-II data are included (check) or absent (No); the check marks are not reproducible here. STRATA contributed the BDI-FS in place of the BDI-II, and ADVANCE the SF-12 in place of the SF-36.]

The outcome measures included in the MSOAC database from each study are indicated by a check; “No” indicates that data from that measure are not included. N is the number of subjects in a dataset; for MS type, “ALL” includes RRMS, SPMS, and PPMS.

[a] CT.gov refers to the ClinicalTrials.gov website where clinical trials are registered.

[b] Study does not have a ClinicalTrials.gov identifier.

Figure 4.

Steps in data mapping.

Data contributors agreed to provide data meeting the requirements of either a “Limited Dataset” or a “De-identified Dataset” in accordance with the Health Insurance Portability and Accountability Act (HIPAA) Safe Harbor requirements (45 CFR 164 Subpart E, https://www.gpo.gov/fdsys/pkg/CFR-2011-title45-vol1/pdf/CFR-2011-title45-vol1-part164.pdf). C-Path policy is to exceed that minimum whenever possible by further de-identifying the data after receipt so that, in addition to meeting HIPAA Safe Harbor requirements, the data do not contain even the year portion of any dates (which Safe Harbor allows). Instead, the timing of events is represented as time relative to the study medication start date (defined as study day 1).
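The date-to-study-day conversion can be sketched as follows, assuming the usual SDTM convention that the start date is day 1 and there is no day 0; that convention is an assumption here, since the text does not specify the MSOAC procedure at that level of detail.

```python
from datetime import date

def study_day(event: date, start: date) -> int:
    """Relative study day: day 1 = study medication start, no day 0
    (common SDTM --DY convention); days before start are negative."""
    delta = (event - start).days
    return delta + 1 if delta >= 0 else delta

start = date(2010, 3, 15)                    # actual date is removed from shared data
print(study_day(date(2010, 3, 15), start))   # 1  (start day itself)
print(study_day(date(2010, 4, 14), start))   # 31
print(study_day(date(2010, 3, 14), start))   # -1 (day before start)
```

Sharing only these relative offsets removes all calendar dates, including the year, from the distributed records.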

Data were standardized using the CDISC Study Data Tabulation Model (SDTM). This process was carried out by C-Path data managers in three stages: (1) logically planning how the data would fit into SDTM, (2) programmatically remapping the data, and (3) validating the mapped data. The purpose of the validation step was two-fold: to ensure that the meaning of the data was accurately preserved and to check for SDTM compliance. The latter was done with the aid of the Pinnacle 21 Validator tool. Once all the data were in SDTM format, the data were pooled together into an aggregated dataset that was provided to the CRO, Premier Research.
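The SDTM-compliance part of the validation stage can be sketched as a required-variable check. The actual work used the Pinnacle 21 Validator, so the rule set below is purely illustrative:

```python
# Hypothetical rule set: required variables per domain (illustrative only;
# real compliance rules come from the SDTM IG and Pinnacle 21).
REQUIRED = {"FT": {"STUDYID", "DOMAIN", "USUBJID", "FTSEQ", "FTTESTCD"}}

def missing_required(record: dict, domain: str) -> set:
    """Return the required variables absent from a mapped record."""
    return REQUIRED[domain] - record.keys()

rec = {"STUDYID": "STUDY01", "DOMAIN": "FT",
       "USUBJID": "STUDY01-0001", "FTTESTCD": "T25FW"}
print(sorted(missing_required(rec, "FT")))  # ['FTSEQ']
```

Checks like this catch structural problems; the other half of validation, confirming that the meaning of the source data was preserved, still requires human review.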

Figure 5.

Baseline descriptive statistics for the pooled subjects in the MSOAC Database.

Through the data contribution agreement, contributors specified the level of access to the data (e.g. C-Path staff and contractors, MSOAC members) and which data they will contribute. Some contributors provided all of the data collected for a given study while others provided only the data relevant to the MSOAC analyses. Consequently, some information, including demographics on race and geographic region, is missing.

As a resource for the research community, a database containing the placebo arms of MS clinical trials was also established (https://c-path.org/programs/msoac/). C-Path staff secured permission for the inclusion of ~2500 individual patient records that are part of the overall MSOAC database and developed the infrastructure to support storage, security, access requests, data use agreements, and access approvals, including a standing Review Board. Baseline descriptive statistics of this placebo-arm database are shown in Figure 6.
Figure 6.

Baseline descriptive statistics for the pooled subjects in the placebo-arm database.

The data contribution agreement for each study stated whether the placebo-arm data could be made available to qualified researchers. For the studies where permission was granted, the standardized records for the placebo-arm subjects were copied and pooled together into a separate dataset. These records were further anonymized by removing the study IDs and using a random number generator to assign all of the subjects a new ID number. To request access to the database, use the following link: https://c-path.org/programs/msoac/.
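The further anonymization step, removing study IDs and assigning random subject IDs, can be sketched as follows; the ID format and numeric range are illustrative assumptions:

```python
import random

def anonymize(subject_ids, seed=None):
    """Map original subject IDs to randomly assigned new IDs, dropping any
    link to the source study (illustrative sketch, not MSOAC code)."""
    rng = random.Random(seed)
    # sample() guarantees the new IDs are unique
    new_ids = rng.sample(range(100000, 999999), k=len(subject_ids))
    return {old: f"MSOAC-{new}" for old, new in zip(subject_ids, new_ids)}

mapping = anonymize(["STUDYA-0001", "STUDYA-0002", "STUDYB-0001"], seed=42)
# Every subject gets a unique new ID; the study prefix is gone.
assert len(set(mapping.values())) == len(mapping)
```

Discarding the mapping after relabeling is what makes the step one-way: the new IDs cannot be traced back to a study or subject.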


Analyzing data in the MSOAC database

MSOAC’s Statistics Workgroup developed the SAP, incorporating regulatory feedback and recommendations from MSOAC members on which functional domains to examine, the analyses to be performed, and the optimal approach for incorporating the VOP. MSOAC members living with MS reviewed the initial plans for measuring clinically meaningful aspects of disability and identified gaps in the approach. A literature review provided insights into which aspects of disability matter most to people with MS and which performance measures adequately capture those concepts. Four performance measures were selected for detailed analysis, based on the literature review and the availability of PerfOs in the MSOAC database: the T25FW for ambulation, the 9HPT for manual dexterity, the LCLA (1.25% and 2.5% contrast) for vision, and both the SDMT and the PASAT for cognition. As detailed in the supplementary material, the following attributes were assessed for each measure: floor or ceiling effects, test–retest reliability, change over time, construct validity, convergent validity, extent of practice effects, known-group validity, sensitivity to change, and the minimum clinically important change in performance scores. Both the placebo arms and the treatment arms of the aggregated data were used for the statistical analyses. Results from the statistical analyses will be reported separately.
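One of the attributes listed above, floor or ceiling effects, can be sketched in code as the proportion of scores at either extreme of a measure's range. The 15% flagging threshold is a common rule of thumb in the psychometrics literature, not a value taken from the MSOAC SAP:

```python
def floor_ceiling(scores, minimum, maximum, threshold=0.15):
    """Proportion of scores at the floor/ceiling of a measure's range;
    flag the measure if either proportion exceeds the threshold."""
    n = len(scores)
    floor = sum(s == minimum for s in scores) / n
    ceiling = sum(s == maximum for s in scores) / n
    return {"floor": floor, "ceiling": ceiling,
            "flagged": floor > threshold or ceiling > threshold}

# Hypothetical scores on a 0-110 scale with a cluster at the ceiling.
scores = [110, 110, 110, 110, 55, 60, 72, 80, 90, 100]
print(floor_ceiling(scores, 0, 110))
```

A measure flagged this way cannot register further worsening (or improvement) in the clustered subjects, which limits its sensitivity to change.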

Conclusion

MSOAC was formed to develop more sensitive methods for measuring whether a drug effectively slows disability worsening in MS, on the premise that acceptance of a more sensitive and precise, yet meaningful, measure would accelerate the development of effective MS therapies. The primary purpose of MSOAC was therefore to qualify disability performance measures as primary or secondary endpoints for MS clinical trials submitted to the FDA and the EMA. Qualification of the SDMT as a measure of information-processing speed is underway at the FDA, and qualification of all four performance measures (SDMT, T25FW, 9HPT, and LCLA) is in process at the EMA. In addition, given the preference for simple, reproducible performance tests, the consortium recognized that the same outcome measures could be useful in medical practice to grade MS severity and monitor patients over time, potentially harmonizing the metrics used in clinical trials and clinical practice. By including the same operationally defined, quantitative measures in clinical trials and healthcare settings, it should be possible to use “real-world data” to augment clinical trials, test interventions in less controlled settings, and realize the potential of the “learning health system.”[31]

MSOAC is a global effort, with members from 5 advocacy organizations, 2 regulatory agencies and 1 other governmental agency, 12 pharmaceutical companies, 23 academic institutions, 4 consultant groups, and 4 non-profit organizations. By sharing data and expertise in volunteer teams reporting to the MSOAC Coordinating Committee (the Data Standards and Integration, Defining Disability, Clinical Outcome Assessments, Regulatory Advisory, Literature Review, Statistical, and VOP Workgroups), consortium members have delivered the following: (1) the first TAUG for MS, freely available at http://www.cdisc.org/therapeutic#MS; (2) a standardized database of 14,370 trial subjects for use in qualifying new PerfOs; (3) a placebo-arm database for use by the research community; (4) an extensive review of the literature on performance measures relevant to MS;[9-12] and (5) analyses of performance-measure data for submission to the FDA and the EMA for qualification. An approach to assessing the clinical meaningfulness of differences in the four measures by directly engaging persons with MS is also underway. Termed the VOP, this effort will contribute evidence on the clinical meaningfulness of walking speed, manual dexterity, visual acuity, and speed of information processing in the lives of people with MS.

The consortium approach is not without challenges. Sharing data proved difficult or impossible for several members. All participants but one were willing to provide the copyright permissions needed to incorporate scales into the MS CDISC data standard. Stakeholders were initially divided on the research plan, including the optimal approach to establishing clinical meaningfulness. Generating a CDISC standard for MS was a milestone that allowed pooling of clinical trial data for MSOAC’s analysis and regulatory submission. The CDISC standard also provides a new tool to the entire MS community, which is of value now that all drug trial data submitted to the FDA[32] and the Pharmaceuticals and Medical Devices Agency (PMDA) must be in CDISC format.
Another consortium objective that benefits the research community was the creation of a separate database containing the placebo-arm data from registration trials (https://c-path.org/programs/msoac/). Most importantly, MSOAC’s proposed outcome measure, once qualified by the EMA and the FDA, will be adopted by drug developers to demonstrate treatment benefit of therapies designed to slow progression of disability and promote improvement in MS. MSOAC illustrates the potential for pre-competitive, cooperative, consortium-driven progress in drug development tools that benefit both sponsors and the broader MS community.
References (first 10 of 28)

1. Cognitive impairment in multiple sclerosis: An 18 year follow-up study. Lauren B Strober, Stephen M Rao, Jar-Chi Lee, Elizabeth Fischer, Richard Rudick. Mult Scler Relat Disord, 2014.

2. Multiple Sclerosis Outcome Assessments Consortium: Genesis and initial project plan. Richard A Rudick, Nicholas LaRocca, Lynn D Hudson. Mult Scler, 2013.

3. Unemployment in multiple sclerosis: The contribution of personality and disease. Lauren B Strober, Christopher Christodoulou, Ralph H B Benedict, Holly J Westervelt, Patricia Melville, William F Scherl, Bianca Weinstock-Guttman, Syed Rizvi, Andrew D Goodman, Lauren B Krupp. Mult Scler, 2011.

4. The development of ICF Core Sets for multiple sclerosis: Results of the International Consensus Conference. Michaela Coenen, Alarcos Cieza, Jenny Freeman, Fary Khan, Deborah Miller, Andrea Weise, Jürg Kesselring. J Neurol, 2011.

5. The multiple sclerosis functional composite: A clinically meaningful measure of disability. Chris H Polman, Richard A Rudick. Neurology, 2010.

6. Walking speed, rather than Expanded Disability Status Scale, relates to long-term patient-reported impact in progressive MS. L V A E Bosma, J J Kragt, C H Polman, B M J Uitdehaag. Mult Scler, 2012.

7. Contrast letter acuity as a visual component for the Multiple Sclerosis Functional Composite. L J Balcer, M L Baier, J A Cohen, M F Kooijmans, A W Sandrock, M L Nano-Schiavi, D C Pfohl, M Mills, J Bowen, C Ford, F R Heidenreich, D A Jacobs, C E Markowitz, W H Stuart, G-S Ying, S L Galetta, M G Maguire, G R Cutter. Neurology, 2003.

8. The longitudinal relationship between the patient-reported Multiple Sclerosis Impact Scale and the clinician-assessed Multiple Sclerosis Functional Composite. L Costelloe, K O'Rourke, C McGuigan, C Walsh, N Tubridy, M Hutchinson. Mult Scler, 2007.

9. Clinical outcomes assessment in multiple sclerosis. R Rudick, J Antel, C Confavreux, G Cutter, G Ellison, J Fischer, F Lublin, A Miller, J Petkau, S Rao, S Reingold, K Syndulko, A Thompson, J Wallenberg, B Weinshenker, E Willoughby. Ann Neurol, 1996.

10. Mapping the 12-item multiple sclerosis walking scale to the EuroQol 5-dimension index measure in North American multiple sclerosis patients. Matthew F Sidovar, Brendan L Limone, Soyon Lee, Craig I Coleman. BMJ Open, 2013.
