PROBLEM STATEMENT AND PURPOSE: The lack of direct observation by faculty may affect meaningful judgments of clinical competence. The purpose of this study was to explore the influence of direct observation on reliability and validity evidence for family medicine clerkship ratings of clinical performance.

METHOD: Preceptors rating family medicine clerks (n = 172) on a 16-item evaluation instrument noted the data source for each rating: note review, case discussion, and/or direct observation. Mean data-source scores were computed and categorized as low, medium, or high, with the high-score group including the most direct observation. Analyses examined the influence of data source on interrater agreement and associations between clerkship clinical scores (CCS) and scores from the National Board of Medical Examiners (NBME®) subject examination as well as a fourth-year standardized patient-based clinical competence examination (M4CCE).

RESULTS: Interrater reliability increased as a function of data source; for the low, medium, and high groups, intraclass correlation coefficients were .29, .50, and .74, respectively. For the high-score group, there were significant positive correlations between CCS and NBME score (r = .311, p = .054) and between CCS and M4CCE (r = .423, p = .009).

CONCLUSION: Reliability and validity evidence for clinical competence is enhanced when more direct observation is included as a basis for clerkship ratings.
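The reliability analysis above rests on intraclass correlation coefficients and the validity analysis on Pearson correlations. As an illustrative sketch only (not the study's actual code; the data and rater layout below are hypothetical), a two-way random-effects single-rater ICC, ICC(2,1), and a Pearson r can be computed as follows:

```python
# Hypothetical sketch of the two statistics reported in the abstract:
# ICC(2,1) from a students x raters score matrix, and Pearson r for
# correlating clerkship clinical scores with examination scores.

def icc_2_1(scores):
    """ICC(2,1), two-way random effects, single rater.

    `scores` is a list of rows (one per student), each row holding
    that student's rating from each rater (one column per rater).
    """
    n = len(scores)        # number of students (subjects)
    k = len(scores[0])     # number of raters
    grand = sum(sum(row) for row in scores) / (n * k)
    row_means = [sum(row) / k for row in scores]
    col_means = [sum(scores[i][j] for i in range(n)) / n for j in range(k)]
    # Two-way ANOVA sums of squares.
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((scores[i][j] - grand) ** 2
                   for i in range(n) for j in range(k))
    ss_err = ss_total - ss_rows - ss_cols
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

def pearson_r(x, y):
    """Pearson product-moment correlation between two score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)
```

With perfectly agreeing raters, `icc_2_1` returns 1.0; lower agreement drives it toward the low-group value of .29 reported above. The p-values in the abstract would additionally require a significance test (e.g., a t-test on r), which is omitted here for brevity.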
Authors: Garrick Mok; Nicholas Schouela; Lisa Thurgur; Michael Ho; Andrew K Hall; Jaelyn Caudle; Hans Rosenberg; Shahbaz Syed Journal: CJEM Date: 2020-06-29 Impact factor: 2.410
Authors: Noëlle Junod Perron; Martine Louis-Simonet; Bernard Cerutti; Eva Pfarrwaller; Johanna Sommer; Mathieu Nendaz Journal: Med Educ Online Date: 2016-11-08
Authors: Noelle Junod Perron; Mathieu Nendaz; Martine Louis-Simonet; Johanna Sommer; Anne Gut; Bernard Cerutti; Cees P van der Vleuten; Diana Dolmans Journal: BMC Med Educ Date: 2014-04-14 Impact factor: 2.463