George R Bergus, Jerold C Woodhead, Clarence D Kreiter.
Abstract
INTRODUCTION: The Objective Structured Clinical Examination (OSCE) is widely used to assess the clinical performance of medical students. However, concerns related to cost, availability, and validity have led educators to investigate alternatives to the OSCE. Some alternatives involve assessing students while they provide care to patients: the mini-CEX (mini-Clinical Evaluation Exercise) and the Long Case are examples. We investigated the psychometrics of systematically observed clinical encounters (SOCEs), in which physicians are supplemented by trained lay observers, as a means of assessing the clinical performance of medical students.
Keywords: clinical skills; medical education; performance assessment
Year: 2010 PMID: 23745065 PMCID: PMC3643132 DOI: 10.2147/AMEP.S12962
Source DB: PubMed Journal: Adv Med Educ Pract ISSN: 1179-7258
Figure 1. A schematic of the two-step systematically observed clinical encounter (SOCE) process. The student first interviews and examines a patient while being observed by a standardized observer (SO). The student then presents their findings to a faculty physician who is supervising learners in the General Pediatrics Clinic.
Generalizability analysis using urGENOVA® for estimating the sources of skills score variance
| Facets | Degrees of freedom | Variance component | Percentage of variance |
|---|---|---|---|
| Student | 51 | 2.1886 | 30.5% |
| Encounter:student | 146 | 4.9985 | 69.5% |
Notes: Data were collected by observing 51 third-year medical students on the pediatrics clerkship at the University of Iowa Carver College of Medicine during the 2006–2007 academic year.
Predicted systematically observed clinical encounter (SOCE) score reliability as a function of the number of student–patient encounters completed by students
| Number of SOCE cases | Score reliability |
|---|---|
| 2 | 0.467 |
| 3 | 0.568 |
| 4 | 0.637 |
| 5 | 0.686 |
| 6 | 0.724 |
| 7 | 0.754 |
| 8 | 0.778 |
| 9 | 0.798 |
| 10 | 0.814 |
| 11 | 0.828 |
Notes: The generalizability study was undertaken with urGENOVA® using all available data collected by observing 51 third-year medical students on the pediatrics clerkship at the University of Iowa Carver College of Medicine during the 2006–2007 academic year.
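The predicted reliabilities above follow directly from the two variance components reported in the generalizability analysis: for a design with encounters nested within students, the reliability of a mean score over n encounters is var(student) / (var(student) + var(encounter:student) / n). A minimal sketch reproducing the table's values from the reported components (the function name is illustrative, not from the source):

```python
# Variance components from the generalizability analysis reported above.
VAR_STUDENT = 2.1886        # student (universe-score) variance
VAR_ENCOUNTER = 4.9985      # encounter-nested-in-student (error) variance

def predicted_reliability(n_cases: int) -> float:
    """Generalizability coefficient for a mean score over n_cases SOCEs."""
    return VAR_STUDENT / (VAR_STUDENT + VAR_ENCOUNTER / n_cases)

for n in range(2, 12):
    print(f"{n:2d} cases -> reliability {predicted_reliability(n):.3f}")
```

Running this reproduces the tabled values (e.g. 0.467 for 2 cases and 0.828 for 11), confirming that the predictions are the standard decision-study extrapolation from the single-encounter variance components.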
Clinical skills rating scale completed by faculty physicians after each SOCE. The instrument comprises six rating scales covering the student's data collection and clinical reasoning skills; a student's total score can range from 6 to 30 points.
| Item | Domain score anchors |
|---|---|
| Did the student present all of the major facts of the HPI? | Few / Some / Many / Most / All |
| Did the student develop the HPI by focusing on onset of symptoms, site, time course, severity, setting, and aggravating–relieving factors? | Little / Limited / Partial / Consistent / Complete |
| Did the student provide other history (past medical history, family history, social history, ROS) appropriate to the presenting problem? | Little / Some / Many / Most / All |
| Did the student present any | Large amount / Some / Little / Rare / None |
| Did the presentation include appropriate detail about the important physical findings? | Little / Some / Many / Most / All |
| Development of Diff DX and problem list demonstrated sound clinical reasoning. | Strongly disagree / Disagree / Neutral / Agree / Strongly agree |
Abbreviations: Diff DX, differential diagnosis; HPI, history of the present illness; PE, physical exam; ROS, review of systems; SOCE, systematically observed clinical encounter.