
Ward evaluations: should they be abandoned?

C J Kwolek, M B Donnelly, D A Sloan, S N Birrell, W E Strodel, R W Schwartz.

Abstract

Even in the era of the objective structured clinical examination (OSCE), the predominant method of resident evaluation is the faculty ward evaluation (WE), despite many concerns about its reliability. The aim of this study was to determine the value of the WE as a measurement of clinical competence in terms of both reliability and validity. Over a one-year period, surgery faculty members evaluated 72 residents; an average of 7 faculty members evaluated each resident. The evaluation form contained 10 specific performance ratings and an overall evaluation. Inter-rater reliability of the overall performance ratings was calculated using the intraclass correlation: the reliability of the overall performance rating was 0.82, and the reliability of a single overall rating was 0.39. Validity of the WE was evaluated in four ways. (1) A discriminant function analysis indicated that residents at advanced levels of training received more positive evaluations than residents at less advanced levels (P < 0.0001). (2) The overall rating was significantly correlated (r = 0.55, P < 0.0001) with the overall score of a concurrent OSCE. (3) A factor analysis showed high correlations among the items, indicating a lack of discrimination between the skills. (4) Overall ratings were insensitive to performance deficiencies: only 1.3% of the ratings were unsatisfactory or marginal. The WE was sufficiently reliable to estimate the faculty's view of each resident. The fact that the ratings tended to differentiate residents by level of training, and that they significantly correlated with the OSCE, provides strong evidence of their validity. However, factor analysis indicated that the faculty members were making one global, undifferentiated judgment and that these ratings did not identify deficient performance skills. We conclude that ward evaluations have a place in the assessment of residents.
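The two reliability figures in the abstract are mutually consistent under the Spearman-Brown prophecy formula, which predicts the reliability of an average of k independent ratings from the reliability of a single rating. A minimal sketch, assuming the composite rating is the mean of the roughly 7 faculty ratings per resident reported in the study:

```python
def spearman_brown(single_rater_icc: float, k: int) -> float:
    """Spearman-Brown prophecy: reliability of the mean of k ratings,
    given the reliability (ICC) of a single rating."""
    return k * single_rater_icc / (1 + (k - 1) * single_rater_icc)

# The abstract reports a single-rating reliability of 0.39 and an
# average of 7 faculty raters per resident.
composite = spearman_brown(0.39, 7)
print(round(composite, 2))  # → 0.82, matching the reported composite reliability
```

This is only a consistency check, not the authors' computation; the study itself obtained both values from an intraclass correlation analysis.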


Year:  1997        PMID: 9202638     DOI: 10.1006/jsre.1997.5001

Source DB:  PubMed          Journal:  J Surg Res        ISSN: 0022-4804            Impact factor:   2.192


Related articles: 3 in total

1.  Does medical students' clinical performance affect their actual performance during medical internship?

Authors:  Eui-Ryoung Han; Eun-Kyung Chung
Journal:  Singapore Med J       Date:  2015-10-16       Impact factor: 1.858

2.  Surgical simulation: a current review. [Review]

Authors:  B Dunkin; G L Adrales; K Apelgren; J D Mellinger
Journal:  Surg Endosc       Date:  2006-12-16       Impact factor: 3.453

3.  Pediatric faculty and residents' perspectives on In-Training Evaluation Reports (ITERs).

Authors:  Rikin Patel; Anne Drover; Roger Chafe
Journal:  Can Med Educ J       Date:  2015-12-11
