
How well do internal medicine faculty members evaluate the clinical skills of residents?

G L Noel, J E Herbers, M P Caplow, G S Cooper, L N Pangaro, J Harvey.

Abstract

OBJECTIVE: To determine the accuracy of faculty evaluations of residents' clinical skills and whether a structured form and instructional videotape improve accuracy.
DESIGN: Randomized, controlled trial.
SETTING: Twelve university and community teaching hospitals.
PARTICIPANTS: A total of 203 faculty internists.
INTERVENTIONS: Participants watched a videotape of one of two residents performing a new patient workup and were assigned to one of three groups: one group used an open-ended evaluation form, one used a structured form that prompted detailed observations, and one used the structured form after watching an instructional videotape demonstrating good evaluation techniques.
MAIN OUTCOME MEASURES: Faculty observations of strengths and weaknesses in the residents' performance were scored. An accuracy score, based on clinical skills of critical importance for a competent history and physical examination, was calculated for each participant by raters blinded to the participants' hospital, training, subspecialty, and experience as observers.
RESULTS: When observations were not prompted, participants recorded only 30% of the residents' strengths and weaknesses; accuracy among participants using structured forms increased to 60% or greater. Faculty in university hospitals were more accurate than those in community hospitals, and general internists were more accurate than subspecialists; the structured form improved performance in all groups. However, participants disagreed markedly about the residents' overall clinical competence: Thirty-one percent assessed one resident's clinical skills as unsatisfactory or marginal, whereas 69% assessed them as satisfactory or superior; 48% assessed the other resident's clinical skills as unsatisfactory or marginal, whereas 52% assessed them as satisfactory or superior. Participants also disagreed about the residents' humanistic qualities. The instructional videotape did not improve accuracy.
CONCLUSIONS: A structured form improved the accuracy of observations of clinical skills, but faculty still disagreed in their assessments of clinical competence. If program directors are to certify residents' clinical competence, better and more standardized evaluation is needed.


Year:  1992        PMID: 1343207     DOI: 10.7326/0003-4819-117-9-757

Source DB:  PubMed          Journal:  Ann Intern Med        ISSN: 0003-4819            Impact factor:   25.391


Related articles: 33 in total

1.  A picture is worth a thousand words: practical use of videotape in teaching.

Authors:  L E Pinsky; J E Wipf
Journal:  J Gen Intern Med       Date:  2000-11       Impact factor: 5.128

2.  Evaluation of residents: challenges and opportunities.

Authors:  T G Cooney
Journal:  J Gen Intern Med       Date:  2001-07       Impact factor: 5.128

3.  [Review] The death of the long case?

Authors:  John J Norcini
Journal:  BMJ       Date:  2002-02-16

4.  Confidential testing of cardiac examination competency in cardiology and noncardiology faculty and trainees: a multicenter study.

Authors:  Jasminka M Vukanovic-Criley; Arsen Hovanesyan; Stuart Ross Criley; Thomas J Ryan; Gary Plotnick; Keith Mankowitz; C Richard Conti; John Michael Criley
Journal:  Clin Cardiol       Date:  2010-12       Impact factor: 2.882

5.  [Implementation of a competency-based graduate medical education program in a neurology department].

Authors:  S Meyring; H-C Leopold; M Siebolds
Journal:  Nervenarzt       Date:  2006-04       Impact factor: 1.214

6.  A randomized-controlled study of encounter cards to improve oral case presentation skills of medical students.

Authors:  Sarang Kim; Jennifer R Kogan; Lisa M Bellini; Judy A Shea
Journal:  J Gen Intern Med       Date:  2005-08       Impact factor: 5.128

7.  Education research: Bias and poor interrater reliability in evaluating the neurology clinical skills examination.

Authors:  L A Schuh; Z London; R Neel; C Brock; B M Kissela; L Schultz; D J Gelb
Journal:  Neurology       Date:  2009-07-15       Impact factor: 9.910

8.  Didactic value of the clinical evaluation exercise. Missed opportunities.

Authors:  F J Kroboth; B H Hanusa; S C Parker
Journal:  J Gen Intern Med       Date:  1996-09       Impact factor: 5.128

9.  Towards an Operational Definition of Clinical Competency in Pharmacy.

Authors:  L Douglas Ried; Charles A Douglas
Journal:  Am J Pharm Educ       Date:  2015-05-25       Impact factor: 2.047

10.  The implementation of a mobile problem-specific electronic CEX for assessing directly observed student-patient encounters.

Authors:  Gary S Ferenchick; Jami Foreback; Basim Towfiq; Kevin Kavanaugh; David Solomon; Asad Mohmand
Journal:  Med Educ Online       Date:  2010-01-29
