
Does the subjective evaluation of medical student surgical knowledge correlate with written and oral exam performance?

S S Awad, K R Liscum, N Aoki, S H Awad, D H Berger.

Abstract

BACKGROUND: Medical student performance evaluations have historically contained a significant subjective component. Multiple tools are used to assess fund of knowledge, including subjective evaluations by faculty and residents as well as objective evaluations through standardized written and oral exams. We hypothesized that subjective evaluation of medical student knowledge would correlate with objective evaluation through written and oral exams.
METHODS: Records of consecutive medical students assigned to the surgery clerkship from January 1999 to March 2001 were reviewed. The core surgical rotation consisted of two 4-week blocks on a private, county, or VA hospital service. Surgical knowledge was assessed subjectively by both faculty (FES) and senior residents (RES) using a 10-point scale with verbal anchors. Objective measures of student surgical knowledge included the National Board shelf exam (WE) and a semistructured oral exam (OE). Data are reported as mean ± SEM. The Spearman rank correlation coefficient (r) was used to assess relationships between groups (r ≥ 0.5 indicating a positive correlation).
RESULTS: A total of 354 students were evaluated. The mean FES was 7.8 ± 0.05 (median = 7.75, range 4.75 to 9.75). The mean RES was 7.7 ± 0.06 (median = 8.0, range 3.5 to 10.0). There was poor correlation between the subjective perception and objective measures of surgical knowledge (Table 1). Comparison of the FES and RES also showed poor correlation (r = 0.38).
CONCLUSIONS: Subjective evaluation of surgical knowledge by faculty and residents correlates poorly with performance measured objectively. These results question whether subjective evaluation of surgical knowledge should be included as part of the evaluation process.
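The Spearman rank correlation used in the methods can be sketched in pure Python: scores are converted to ranks (ties receive the average rank) and the Pearson formula is applied to the ranks. The faculty scores and shelf-exam scores below are invented for illustration only; they are not the study's data.

```python
def rank(values):
    """Return 1-based ranks, averaging ranks over ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # extend j over a run of tied values
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of the tied positions, 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) *
           sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Hypothetical example: faculty 10-point scores vs. shelf-exam percentiles.
fes = [7.5, 8.0, 6.5, 9.0, 7.0]
we = [62, 55, 70, 60, 58]
r = spearman(fes, we)
```

Under the paper's criterion, an r below 0.5 (as in its reported faculty-resident comparison, r = 0.38) would be read as a poor correlation.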


Year:  2002        PMID: 11971675     DOI: 10.1006/jsre.2002.6401

Source DB:  PubMed          Journal:  J Surg Res        ISSN: 0022-4804            Impact factor:   2.192


Similar articles: 4 in total

1.  Automated video-based assessment of surgical skills for training and evaluation in medical schools.

Authors:  Aneeq Zia; Yachna Sharma; Vinay Bettadapura; Eric L Sarin; Thomas Ploetz; Mark A Clements; Irfan Essa
Journal:  Int J Comput Assist Radiol Surg       Date:  2016-08-27       Impact factor: 2.924

2.  Predictors of medical school clerkship performance: a multispecialty longitudinal analysis of standardized examination scores and clinical assessments.

Authors:  Petra M Casey; Brian A Palmer; Geoffrey B Thompson; Torrey A Laack; Matthew R Thomas; Martha F Hartz; Jani R Jensen; Benjamin J Sandefur; Julie E Hammack; Jerry W Swanson; Robert D Sheeler; Joseph P Grande
Journal:  BMC Med Educ       Date:  2016-04-27       Impact factor: 2.463

3.  Pediatric faculty and residents' perspectives on In-Training Evaluation Reports (ITERs).

Authors:  Rikin Patel; Anne Drover; Roger Chafe
Journal:  Can Med Educ J       Date:  2015-12-11

4.  Faculty perspectives on the use of standardized versus non-standardized oral examinations to assess medical students.

Authors:  Natasha Johnson; Holly Khachadoorian-Elia; Celeste Royce; Carey York-Best; Katharyn Atkins; Xiaodong P Chen; Andrea Pelletier
Journal:  Int J Med Educ       Date:  2018-09-29
