Brent Thoma1, Michael Gottlieb2, Megan Boysen-Osborn3, Andrew King4, Antonia Quinn5, Sara Krzyzaniak6, Nicolas Pineda7, Lalena M Yarris8, Teresa Chan9.
Abstract
The evaluation of educational programs has become an expected part of medical education. At some point, all medical educators will need to critically evaluate the programs that they deliver. However, the evaluation of educational programs requires a very different skill set from teaching. In this article, we aim to identify and summarize key papers that would be helpful for faculty members interested in exploring program evaluation. In November 2016, the 2015-2016 Academic Life in Emergency Medicine (ALiEM) Faculty Incubator program highlighted key papers in a discussion of program evaluation. This list of papers was augmented with suggestions by guest experts and by an open call on Twitter, resulting in a list of 30 papers on program evaluation. Our authorship group then engaged in a process akin to a Delphi study to build consensus on the most important papers about program evaluation for medical education faculty. We present our group's top five most highly rated papers on program evaluation and summarize them with respect to their relevance to junior medical education faculty members and faculty developers. Program evaluation is challenging. The described papers will be informative for junior faculty members as they aim to design literature-informed evaluations for their educational programs.
Keywords: curated collection; medical education; program evaluation
Year: 2017 PMID: 28589073 PMCID: PMC5453746 DOI: 10.7759/cureus.1224
Source DB: PubMed Journal: Cureus ISSN: 2168-8184
Figure 1: Tweet by Brent Thoma soliciting requests for key papers on program evaluation in medical education
Table 1: The complete list of study design literature reviewed by the authorship team and the ratings following each round of evaluation
| Article Title | Round 1: Mean rating (SD) | Round 2: % of raters that endorsed this paper | Round 3: % of raters that endorsed this paper | Top 5 Papers |
|---|---|---|---|---|
| Twelve tips for evaluating educational programs | 6.8 (0.4) | 100% | 100% | 1st (tie) |
| Program evaluation models and related theories: AMEE Guide No. 67 | 6.7 (0.5) | 100% | 100% | 1st (tie) |
| AMEE Education Guide no. 29: evaluating educational programmes | 6.2 (1.4) | 88.9% | 100% | 1st (tie) |
| Rethinking program evaluation in health professions education: beyond 'did it work'? | 6.0 (1.0) | 100% | 88.9% | 4th |
| Perspective: Reconsidering the focus on "outcomes research" in medical education: a cautionary note | 5.7 (1.2) | 77.8% | 77.8% | 5th |
| A conceptual model for program evaluation in graduate medical education | 5.9 (1.1) | 55.6% | 0% | |
| Evaluating technology-enhanced learning: A comprehensive framework | 5.8 (1.3) | 66.7% | 22.2% | |
| The structure of program evaluation: an approach for evaluating a course, clerkship, or components of a residency or fellowship training program | 5.6 (0.9) | 44.4% | 0% | |
| AM last page: A snapshot of three common program evaluation approaches for medical education | 5.6 (1.0) | 55.6% | 0% | |
| Using an outcomes-logic-model approach to evaluating a faculty development program for medical educators | 5.0 (1.2) | 22.2% | 0% | |
| Achieving desired results and improved outcomes: integrating planning and assessment throughout learning activities | 5.0 (1.7) | 44.4% | 0% | |
| Diseases of the curriculum | 4.9 (1.7) | 55.6% | 11.1% | |
| Nimble approaches to curriculum evaluation in graduate medical education | 4.7 (1.0) | 22.2% | 0% | |
| 12 Tips for programmatic assessment | 4.7 (2.2) | 55.6% | 0% | |
| A model to begin to use clinical outcomes in medical education | 4.4 (1.4) | 22.2% | 0% | |
| Meta-analysis of faculty's teaching effectiveness: Student evaluation of teaching ratings and student learning are not related | 4.4 (1.9) | 0% | 0% | |
| Transforming the academic faculty perspective in graduate medical education to better align educational and clinical outcomes | 4.3 (1.4) | 0% | 0% | |
| How we conduct ongoing programmatic evaluation of our medical education curriculum | 4.2 (1.1) | 11.1% | 0% | |
| Using a modified nominal group technique as a curriculum evaluation tool | 4.1 (1.3) | 11.1% | 0% | |
| A new framework for designing programs of assessment | 4.1 (1.7) | 0% | 0% | |
| Evaluation of a collaborative program on smoking cessation: Translating outcomes framework into practice | 3.9 (1.5) | 11.1% | 0% | |
| The role of theory-based outcome frameworks in program evaluation: Considering the case of contribution analysis | 3.9 (1.8) | 0% | 0% | |
| Use of an institutional template for annual program evaluation and improvement: benefits for program participation and performance | 3.4 (1.4) | 0% | 0% | |
| Instructional effectiveness of college teachers as judged by teachers, current and former students, colleagues, administrators, and external (neutral) observers | 3.3 (1.7) | 0% | 0% | |
| Student evaluations of teaching (mostly) do not measure teaching effectiveness | 3.1 (1.8) | 11.1% | 0% | |
| How we use patient encounter data for reflective learning in family medicine training | 3.0 (1.0) | 0% | 0% | |
| Half a minute: Predicting teacher evaluations from thin slices of nonverbal behavior and physical attractiveness | 2.9 (1.6) | 0% | 0% | |
| Experimental study design and grant writing in eight steps and 28 questions | 2.6 (1.8) | 0% | 0% | |
| Early experience of a virtual journal club | 2.4 (1.6) | 0% | 0% | |
| Cost: The missing outcome in simulation-based medical education research: A systematic review | 2.3 (1.0) | 0% | 0% | |