PURPOSE: Because successful change implementation depends on organizational readiness for change, the authors developed and assessed the validity of a questionnaire, based on a theoretical model of organizational readiness for change, designed specifically to measure a medical school's organizational readiness for curriculum change (MORC).

METHOD: In 2012, a panel of medical education experts judged and adapted a preliminary MORC questionnaire through a modified Delphi procedure. The authors administered the resulting questionnaire to medical school faculty involved in curriculum change and tested its psychometric properties using exploratory and confirmatory factor analysis and generalizability analysis.

RESULTS: The mean relevance score of the Delphi panel (n = 19) reached 4.2 on a five-point Likert-type scale (1 = not relevant, 5 = highly relevant) in the second round, meeting the predefined criteria for completing the Delphi procedure. Faculty (n = 991) from 131 medical schools in 56 countries completed the MORC. Exploratory factor analysis yielded three underlying factors (motivation, capability, and external pressure) comprising 12 subscales with 53 items. Confirmatory factor analysis confirmed the scale structure suggested by the exploratory factor analysis. Cronbach alpha values ranged from 0.67 to 0.92 for the subscales. Generalizability analysis showed that the MORC results of 5 to 16 faculty members can reliably evaluate a school's organizational readiness for change.

CONCLUSIONS: The MORC is a valid, reliable questionnaire for measuring organizational readiness for curriculum change in medical schools. It can identify which elements of a change process require special attention to increase the chance of successful implementation.
Authors: Tamar Ginossar; Carolyn J Heckman; Deborah Cragun; Lisa M Quintiliani; Enola K Proctor; David A Chambers; Ted Skolarus; Ross C Brownson
Journal: J Med Educ Curric Dev
Date: 2018-04-04