Isabelle N Colmers-Gray, Kieran Walsh, Teresa M Chan.
Abstract
BACKGROUND: Competency-based medical education is becoming the new standard for residency programs, including Emergency Medicine (EM). To inform programmatic restructuring, guide resources, and identify gaps in publication, we reviewed the published literature on the types and frequency of resident assessment.
Can Med Educ J, 2017. PMID: 28344722; PMCID: PMC5344063. Source: PubMed.
Figure 1: Study selection
Characteristics of included papers

| Characteristic | Studies reporting (N=73) | Median (IQR) |
|---|---|---|
| **Country** | 73 | |
| USA | 59 (81%) | |
| Canada | 10 (14%) | |
| Europe | 2 (3%) | |
| Australia | 2 (3%) | |
| New Zealand | 0 (0%) | |
| **Study type** | 73 | |
| Pilot project | 40 (55%) | |
| Correlation/validation | 9 (12%) | |
| Comparative | 5 (7%) | |
| Assessment programs | 19 (26%) | |
| **Program duration (years)*** | 65 | 3 (IQR: 3–4) |
| 3 years | 44 (59%) | |
| 4 years | 15 (20%) | |
| 5 years | 15 (20%) | |
| **Number of residents†** | 55 | |
| Reported in program | 24 (41%) | 37 (IQR: 30–49) |
| Reported in study | 31 (56%) | 30 (IQR: 15–52) |
| **Validity levels demonstrated‡** | 73 | 1 (IQR: 1–2) |
| 0 | 8 (11%) | |
| 1 | 30 (41%) | |
| 2 | 23 (32%) | |
| ≥3 | 12 (16%) | |

\* 65 publications comprising 74 programs

† 55 publications comprising 59 programs

‡ Number of construct validity levels demonstrated for each assessment tool (based on 6 criteria of the Messick Framework for global construct validity)
Figure 2: Median number of assessments of residents by time interval, reported for each tool
| Study (author, year) | Location | # Residents or participants | Program duration (ys) | Type of tool | Brief study description | Total #tools | Tool type(s) used | Assessment Frequency (time period:#) | Cost reporting present? | Messick criteria demonstrated |
|---|---|---|---|---|---|---|---|---|---|---|
| Akhtar, 2010 | USA | Participants (EM): 42 | 3 | Impact Assessment | Post-PICU rotation exam in EM & Pediatrics residents | 1 | Written/standardized exam | Rotation: 1 | No | 2 |
| Beeson, 2006 | USA | “Variable” | - | Tool description | Development of national EM question bank & exam in US | 1 | Written/standardized exam | Ever: “Multiple times” | No | 3 |
| Burnette, 2009 | USA | PGY1 (37), PGY2 (42), PGY3 (16) | 3 | Curriculum description | Implementation & impact of online PEM curriculum on pre/post curriculum test scores | 1 | Written/standardized exam | - | No | 1, 2, 3, 5 |
| Clark, 2010 | Canada (Vancouver) | - | 5 | Curriculum description | Evaluation of high fidelity simulation program | 1 | Simulation | - | No | 3, 4, 6 |
| Cooper, 2012 | USA (Indiana) | Participants: 76 | 3 | Correlation study | Correlation between self, peer and faculty assessments of leading simulation cases | 1 | 360-degree/multisource feedback | Monthly: 2 | No | 1, 2 |
| Dorfsman, 2009 | USA (Pittsburgh) | Participants: PGY1 (3), PGY2 (28), PGY3 (1) | 3 | Curriculum description | Implementation of SDOT program for EM residents | 1 | Direct observation (novel tool: adapted CORD-EM SDOT tool) | Ever: 1 (in PGY2) | No | 0 |
| Hauff, 2014 | USA (Michigan) | Total incoming PGY1: 28 | 4 | Tool description | Competency assessment of incoming interns in EM | 3 | Direct observation (novel tool: milestone-based clinical skills assessment tool), Simulation, Other (EM milestones global evaluation form) | Ever: 4 (in PGY1) | No | 2, 3, 4 |
| Ilgen, 2011 | USA (Boston) | Total PGY4: 15 | 4 | Curriculum description | Experience with ‘resident-as-teacher’ curriculum (teaching senior role) | 2 | Direct observation (novel tool: based on resident-tailored learning objectives using a ‘teach the teacher’ model), 360-degree/multisource feedback | Rotation: weekly | No | 2, 5 |
| Kassam, 2014 | Canada (Calgary) | - | 5 | Tool development/validation | Retrospective description of items and validation of linking to CanMEDS | 1 | ITER/end of rotation assessment | Rotation: 1 | No | 1, 2, 6 |
| McIntosh, 2012 | USA (Jacksonville, FL) | - | 3 | Curriculum description | Development and assessment of international EM curriculum | 2 | Oral/verbal exam, Reflective portfolio | Rotation: 2 | No | 1, 2 |
| Motov, 2011 | USA (Brooklyn, NY) | - | 3 | Curriculum description | Pain management curriculum | 4 | Written/standardized exam, OSAT, Other (pre- and post-tests; customized SDOT-PAIN scale) | Rotation: 2 weekly | No | 0 |
| Noeller, 2008 | USA | All residents: 38 | 3 | Curriculum description | Testing & evaluation of a theme based hybrid simulation model | 2 | Written/standardized exam, Simulation | Rotation: 2 | No | 4, 5 |
| Pavlic, 2014 | USA (U of Michigan, Ann Arbor) | - | 4 | Curriculum description | Retrospective study of nursing feedback to residents | 1 | 360-degree/multisource feedback | - | No | 1, 2 |
| Ryan, 2010 | USA (New York Hospital Queens, Flushing NY) | All residents: 30 (10 per year) | 3 | Curriculum description | 4-year observational study of direct observation vs. quarterly evaluations | 2 | Direct observation (novel tool: assessment of competencies during a single patient encounter), ITER/end of rotation assessment (same tool but globally applied) | Quarterly: 21 | No | 1, 2, 4 |
| Sampsel, 2014 | Canada (Ottawa) | All residents: 45 | 5 | Curriculum description | Clinical Teaching Team program development and implementation | 4 | Oral/verbal exam, Direct observation (novel tool: “direct observation”), Daily encounter cards, Other (targeted clinical encounters) | Rotation: 1/3 of shifts | No | 1, 2, 5 |
| Shih, 2013 | USA | Total PGY1 residents over 5 years: 36 (avg 7 per year) | 3 | Correlation study | Correlation between faculty ratings and OSCE exam scores | 1 | OSCE | - | No | 1, 2, 3 |
| Sullivan, 2009 | USA | PGY1 (10) | 3 | Curriculum description | Introduction and development of a communication curriculum | 2 | Direct observation (novel tool: communication skills checklist), Other (videotape-facilitated self assessment) | - | No | 1, 5 |
| Wagner, 2013 | USA (Michigan) | - | 3 | Curriculum description | Use of a standard form to assess milestones during EM1 orientation sessions | 1 | OSAT | Annually: 1 | No | 1, 6 |
| Wallenstein, 2010 | USA (Atlanta) | All PGY1: 18 | 3 | Correlation study | Ability of early OSCE to predict ACGME core competency scores | 3 | OSCE, Direct observation (mini CEX), Direct observation (SDOT) | Annually: 1 | No | 1, 2 |
= abstract only
Messick levels of validity evidence coding: 0 = none met or reported (and no alternative paradigm used); 1 = structural validity; 2 = content validity; 3 = substantive validity; 4 = external validity; 5 = generalizability validity; 6 = consequential validity
Abbreviations: ACGME = Accreditation Council for Graduate Medical Education; CORD-EM = Council of Emergency Medicine Residency Directors; EM = Emergency Medicine; ITER = In-Training Evaluation Report; Mini-CEX = Mini-Clinical Evaluation Exercise; OSAT = Objective Structured Assessment of Technical Skills; OSCE = Objective Structured Clinical Examination; PGY = Post-Graduate Year (i.e., residency year); SDOT = Standardized Direct Observation Tool