
An Interactive Quality Improvement and Patient Safety Workshop for First-Year Medical Students.

Luba Dumenco1, Kristina Monteiro2, Paul George1,3, Lynn McNicoll4, Sarita Warrier1, Richard Dollase5.   

Abstract

Introduction: There is a call to incorporate quality improvement and patient safety (QI/PS) content into undergraduate medical education, though limited literature exists on optimal teaching strategies. We designed a required, interactive workshop for first-year medical students to introduce principles of QI/PS, specifically focusing on student attitudes, knowledge, and skills.
Methods: We used active learning principles from existing literature and included the application of QI/PS concepts, engaging in PDSA (plan, do, study, act) cycles, conducting root cause analyses, and creating a fishbone diagram. Evaluation of student knowledge included pre/post assessments with locally designed multiple-choice items and a case scenario from the Quality Improvement Knowledge Application Tool. Additional students' self-assessments included perceived knowledge and problem-solving skills. We also evaluated student satisfaction with the workshop.
Results: Results on the direct assessment total score (n = 136) indicated significant growth from pretest (65%) to posttest (89%). Indirect assessments (n = 138) targeting perceived ability to define QI/PS principles, identify key components in a QI case scenario, explain the purpose of a fishbone diagram, apply a PDSA cycle, and create a fishbone diagram for a QI case scenario all significantly increased from pre- to postworkshop. The mean overall rating across the 2 years the workshop was administered (ns = 134, 137) was 75% (i.e., good to very good).

Discussion: First-year medical students' knowledge and perceived skills significantly increased from start to end of the workshop. The workshop was placed in an appropriate stage of the curriculum and contained relevant information for our learners.

Keywords:  Patient Safety; Quality Improvement

Year:  2018        PMID: 30800934      PMCID: PMC6342418          DOI: 10.15766/mep_2374-8265.10734

Source DB:  PubMed          Journal:  MedEdPORTAL        ISSN: 2374-8265


Educational Objectives

By the end of this workshop, learners will be able to:
1. Identify active and latent causes of error for the cases presented.
2. Determine whether a root cause analysis is appropriate for the case presented.
3. Draw a fishbone diagram using the case study to (a) identify at least two potential causes of undesired outcomes and (b) describe the actions needed to address those potential causes in order to prevent recurrence.
4. Reflect upon and share observations about systems factors that could contribute to a culture of safety in clinical sites and the role of the medical student in improving this system.
5. Demonstrate the steps of a PDSA (plan, do, study, act) cycle using a Mr. Potato Head team-building activity.
6. Describe how the Mr. Potato Head exercise can be translated to health care, including a focus on the importance of teamwork and leadership.

Introduction

There is wide consensus that content related to quality improvement and patient safety (QI/PS) must be incorporated into medical education. While graduate medical education includes QI/PS in the ACGME competency domains, the AAMC and World Health Organization call for more formal training of these concepts in undergraduate medical education.[1,2] The NBME has echoed the importance of teaching this content, as USMLE Step 1 and Step 2 Clinical Knowledge licensing exams both now include items incorporating QI/PS content, including improvement science principles, specific models of QI, quality measurement, and specific types of errors.[3] Several systematic reviews addressed training in QI/PS with learners ranging from medical students to residents over the 2000–2015 time period.[4-8] Not surprisingly, the majority of literature regarding teaching of QI/PS in medical school targets third- and fourth-year medical students in their clinical years rather than preclerkship students, whose curricula now likely incorporate early clinical experiences. While published research on QI/PS curricula for first- and second-year (preclerkship) medical students exists, there are many opportunities for improvement. Of the publications relating to preclerkship medical students, several included curricular descriptions but did not provide direct assessment data of student knowledge with pre- and postmeasures.[9-11] This lack of direct assessment data makes it difficult to determine the effects of a curriculum on student knowledge and skills. Furthermore, student satisfaction with early QI/PS curricula has been historically low. 
For example, one study that included a mandatory curriculum for preclinical students (n = 77) showed an increase in student knowledge, skills, and attitudes, but the level of learner satisfaction was poor.[12] Also of note, in some cases, the QI/PS curriculum published was an elective component of the curriculum and not integrated or mandatory for all students.[13] The QI/PS curriculum reported by Madigosky, Headrick, Nelson, Cox, and Anderson[14] is perhaps most closely aligned with the workshop described here, as it features lectures, panel discussions, demonstrations, and role-playing. This QI/PS curriculum included all medical students and was integrated into the core curriculum at the end of year 2. Results indicated growth in student knowledge and a high degree of student satisfaction, though results were not all sustained over time. Furthermore, some gains were reversed, a situation that was hypothesized to be a result of informal curriculum experiences in clinical clerkships. A review of publications in MedEdPORTAL yielded similar results. Much of the published material regarding QI/PS training has focused on resident learners,[15,16] while those resources targeting undergraduate medical students were primarily elective and not required of the entire class.[17] A QI/PS simulation by Worsham, Swamy, Gilad, and Abbott[18] has been used in undergraduate medical education, but student performance on a knowledge-based test was not found to significantly increase from pre- to posttest. Moreover, while the methods of the study indicated the simulation was used with undergraduate medical students, it was unclear if this was during their preclerkship or clinical years.
As previously stated, there are studies focusing on undergraduate medical students in their clinical years[19]; however, there remains a paucity of literature regarding QI/PS curricula focused on required curricular components for students in their early preclerkship years that have demonstrated significant growth on direct assessments of student knowledge pre- and posttest. In addition, evidence is lacking as to the optimal educational format, the amount of curricular time that should be devoted to these topics in light of ever-increasing curricular time constraints, and how early QI/PS training should begin. To address the need for a required QI/PS training for an entire class of undergraduate medical students, we created an innovative workshop. The workshop presented here features active learning components that provide students with opportunities to apply and practice skills based on the principles that they have learned in self-study Institute for Healthcare Improvement (IHI) modules.

Educational Theory

No single approach to teaching QI/PS is superior in improving learning outcomes.[7,8] Therefore, we relied upon adult learning theory principles, including problem-oriented, goal-focused, experiential, and collaborative active learning techniques.[20] Madigosky and colleagues[14] suggested that application-focused learning and QI/PS case-based interactive sessions resulted in greater preclinical student satisfaction and more long-lasting impact on students' knowledge, skills, and attitudes. Similarly, discussions of real-life mistakes are preferred by medical students learning about PS.[21] To date, the literature on optimal teaching strategies for early undergraduate medical education learners is very limited. The QI/PS workshop described here is part of a longitudinal curriculum being developed at the Warren Alpert Medical School of Brown University. It is designed to address the need for introduction of QI/PS teaching into early undergraduate medical education (and thus targets preclerkship medical students) as well as the need to identify effective, evidence-based teaching practices in QI/PS. We sought to contribute to the existing literature by creating a required, interactive workshop focusing on building foundational QI/PS knowledge and providing opportunities for our first-year medical students to practice new skills. Each of the sessions included interactive components and small-group learning opportunities. Additionally, the case-study session approach was incorporated into one of the workshop sessions. To address the gap in the literature on assessment methods, this workshop was evaluated using a multimodal approach. We included direct assessment of student knowledge using a locally developed multiple-choice content section as well as a portion of a validated assessment tool, the revised Quality Improvement Knowledge Application Tool (QIKAT-R),[22] indirect assessment of students' perceived skills, and a curriculum evaluation. 
Together, these assessments used a mixed-methods, quantitative and qualitative approach. Note that although we targeted first-year learners in order to provide QI/PS foundational knowledge early in training, we see no reason that medical students in subsequent years could not also benefit from the workshop.

Methods

In the fall semester of the first year of medical school, our students completed four IHI modules: Patient Safety 100 and 101 and Quality Improvement 101 and 102.[23] These were chosen to provide a basic foundation in QI/PS. The IHI modules were required, and we allowed students to complete them at their own pace between late October and mid-December. The modules were not associated with a particular course but rather with a thread of QI/PS content running throughout multiple years of our curriculum. The time elapsed between the completion of the IHI modules and the workshop was a function of the optimal timing available in our curriculum. We also acknowledge the potential advantages of spaced repetition with this approach. The interactive QI/PS workshop was held in the spring semester of the first year and was preceded by a 50-minute lecture by an expert in the field, in our case, the President and CEO of the Rhode Island Quality Institute.[24] The lecture served as a review of concepts students had been exposed to in the previous semester through the IHI modules and additionally focused on local application of QI/PS initiatives in the state of Rhode Island. While the lecture was engaging and informative for our students, its specificity to our state limits its generalizability to other institutions. Importantly, the pre- and posttest design allowed us to measure the effect of the workshop alone, as students completed the baseline pretest after the lecture but prior to the workshop. The Institutional Review Board at Brown University determined that this project was not human subjects research and did not require review.

Workshop Overview

The workshop consists of four major components: (1) a brief overview lecture, (2) a 90-minute Mr. Potato Head activity designed as a hands-on exercise to use PDSA (plan, do, study, act) cycles and run charts (based on an exercise developed by Williams[25] and used in teaching QI at several institutions[26,27]), (3) a 45-minute case discussion session using real-life cases from a local hospital to practice identifying active/latent errors and creating fishbone diagrams, and (4) a 45-minute small-group discussion session with fourth-year medical students.

Introduction and Logistics

Two faculty members presented a brief 15- to 20-minute overview lecture (Appendix A) to the entire class that reviewed concepts from the IHI modules students had completed in the fall semester. The class was then divided, with half of our students completing the Mr. Potato Head activity first and the other half completing the case discussion activity first. In the second year the workshop was offered, the time allotted for the case discussion activity was cut in half based on student and faculty feedback, and the remaining time was spent with fourth-year medical student small-group facilitators who discussed their QI/PS experiences in the clinical years. See Table 1 for details.
Table 1.

Workshop Logistics

Activity | Time Allotted (First Offering) | Time Allotted (Second Offering) | Materials Needed
Overview lecture | 15–20 minutes | 15–20 minutes | Appendix A
Mr. Potato Head activity | 90 minutes | 90 minutes | Bags with disassembled Mr. Potato Heads (15 per bag) and photographs of correctly assembled figures (Appendix B); run chart template and markers (Appendix C); faculty guide (Appendix D)
Case discussion activity | 90 minutes | 45 minutes | Appendices E, F
Discussion with fourth-year medical students(a) | – | 45 minutes | Appendix G

(a) The discussion with fourth-year medical students occurred only in the second offering.

Mr. Potato Head Activity Overview and Logistics

Within the Mr. Potato Head activity and the case discussion, two sessions of each ran simultaneously, for a total of four sessions occurring at one time (two Mr. Potato Head activities and two case discussions). The two simultaneous Mr. Potato Head activity sessions were facilitated by two faculty members with QI/PS expertise; the two simultaneous case discussions were led by a team of two faculty members (also with QI/PS experience and training) per session. Each session took place in a large room (our “Academies,” which serve as student lounge areas with tables and chairs). In the 90-minute Mr. Potato Head activity, students were divided into groups of five to six, and each team gathered around individual tables with a duffle bag containing disassembled pieces from 15 different Mr. Potato Heads and photographs of the figures correctly assembled (Appendix B). We had 14 available duffle bags, each with unique Mr. Potato Head figures. Prototypes can be located in Appendix B. We provided students with a run chart template (Appendix C) and markers. A faculty guide is also included (Appendix D).
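The run chart at the heart of this activity is straightforward to reproduce. The sketch below, using hypothetical per-cycle tallies rather than actual student data, tabulates completed figures and errors for each PDSA cycle and computes the median centerline conventionally drawn on a run chart.

```python
# Sketch of run chart data for the Mr. Potato Head activity (hypothetical tallies).
# Each PDSA cycle records how many figures a team completed and how many
# assembly errors it made; a run chart plots these against cycle number
# with a median centerline.

from statistics import median

# Hypothetical tallies for four PDSA cycles: (completed figures, errors).
cycles = [(3, 5), (5, 3), (8, 2), (11, 1)]

completed = [c for c, _ in cycles]
errors = [e for _, e in cycles]

# The centerline on a run chart is conventionally the median of the series.
completed_centerline = median(completed)  # 6.5 for these values
errors_centerline = median(errors)        # 2.5 for these values

for i, (c, e) in enumerate(cycles, start=1):
    print(f"Cycle {i}: completed={c:2d} errors={e}")
print(f"Median completed per cycle: {completed_centerline}")
print(f"Median errors per cycle:    {errors_centerline}")
```

With real session data, the per-cycle points would be drawn on the run chart template (Appendix C) and compared against the median line to discuss whether each change between cycles produced improvement.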

Case Discussion Session and Logistics

In the 45-minute case discussion session, students were divided into groups of five to six in one of our case conference rooms and presented with a brief description of a real-life PS case from a local hospital (Appendix E). Facilitators for all sessions were faculty members with experience and training in QI/PS; they used the computer in the room to present the case. We presented students with two local cases during the first iteration of the workshop. However, in April 2017, the IHI Newsletter included a reading on "Lessons From an MRI Machine Gone Rogue."[28,29] We used this case in addition to the local case in the second iteration of the workshop, and students found that it elicited more conversation than the second local case. Therefore, future workshop iterations will include the local case provided in Appendix E and the case from the article cited above. After the presentation of each case, students worked in small groups and identified active and latent causes of error in the case, assessed whether a root cause analysis was appropriate to the case presented, and created a fishbone diagram (Appendix F). After approximately 8–10 minutes of discussion in small groups, the facilitators brought the groups together to share their findings with the larger group. At the end of the session, students reflected on and shared observations about systems factors that could contribute to a culture of safety in their clinical sites (our students work weekly with community mentor physicians as part of our clinical skills course during the first 2 years of medical school). They also specifically discussed the role of a medical student in this system.
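A fishbone (Ishikawa) diagram is, structurally, contributing causes grouped under category "bones" that all point at one effect. The sketch below captures that structure in code; the case and the causes listed are hypothetical, invented for illustration, and the category names are common fishbone conventions rather than the specific categories used in Appendix F.

```python
# Minimal sketch of a fishbone (Ishikawa) diagram as grouped cause categories.
# The effect and the causes below are hypothetical, for illustration only.

fishbone = {
    "effect": "Patient received wrong medication dose",
    "causes": {
        "People":      ["Fatigued nurse at end of double shift",
                        "New resident unfamiliar with order system"],
        "Process":     ["Verbal order not read back",
                        "No independent double-check for high-risk drugs"],
        "Equipment":   ["Look-alike vials stored side by side"],
        "Environment": ["Frequent interruptions at the medication station"],
    },
}

def print_fishbone(diagram):
    """Render the diagram as an indented text outline."""
    print(f"Effect: {diagram['effect']}")
    for category, causes in diagram["causes"].items():
        print(f"  {category}:")
        for cause in causes:
            print(f"    - {cause}")

print_fishbone(fishbone)
```

Grouping causes this way mirrors what students do on paper: each "bone" prompts the group to ask whether a whole class of contributing factors has been considered before proposing fixes.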

Discussion With Fourth-Year Medical Students

In the second year we offered the workshop, we slightly altered the format: we reduced the time allotted for the case discussion and added a 45-minute session in which students met in small groups of five to six with a fourth-year medical student in a small-group seminar room to discuss QI/PS initiatives, training, or situations the fourth-year student had encountered in the clinical years. The discussion topics were purposefully unstructured, but we asked the student facilitators to focus on their QI/PS experiences in their clinical years, including their observations about the usefulness of QI/PS training (the fourth-year students had not had formal training prior to clerkships, though they reported wishing they had), any initiatives or mistakes they had witnessed, and the role of the medical student in PS. See Appendix G for the prompts we provided our students.

Assessment

Student knowledge was directly assessed in a pre/post design (Appendices H [pretest] and I [posttest]) in which students completed locally designed multiple-choice items targeting QI/PS basic information, short-answer questions on identifying errors, and a fishbone diagram based upon a case. These items were specifically designed to address our educational objectives. The pretest was administered in the hour prior to the workshop, and the posttest was administered 10 days after the workshop, both via paper and pencil. The pre- and posttests also included a component of the previously validated QIKAT-R.[22] This section of the assessment asked students to analyze a QIKAT-R case scenario and to identify an aim, choose an appropriate measure to determine if change had occurred, and propose a change. The QIKAT-R components of the assessment were graded using the published rubric[22] by three faculty members who worked together to ensure consensus in applying the rubric. Each student completed a single QIKAT-R case at each assessment time point. Two different QIKAT-R cases were chosen (modified slightly for use with first-year medical students), one for each half of the class at pretest, with the cases reversed for the posttest. This crossover allowed us to control for the difficulty of the cases and to decrease testing bias. Indirect assessment of self-perceived attitudes and skills in QI/PS was also incorporated into the pre- and posttest assessments. The pre/post design was completed the first year the workshop was offered but was not repeated for the second offering, as we had demonstrated curricular effectiveness. Student satisfaction with the workshop was assessed via a QI/PS curriculum evaluation (Appendix J) administered on our records and registration software. The evaluation was opened immediately following the workshop and remained open for 30 days. The curriculum evaluation was completed both years the workshop was offered.

Results

Our direct assessment analysis included students who attended the first iteration of the workshop and completed both the pretest and the posttest (n = 136). Only students with complete data on the pretest and the posttest were included in the analyses. We matched each student's pretest with his or her posttest and used paired t tests to identify mean differences. The mean total score on the pretest was 65.09 out of 100 (SD = 19.50), while the mean total score on the posttest was 89.48 out of 100 (SD = 6.32). This indicated significant growth from pretest to posttest (t135 = −14.76, p < .001) and also a reduction in the variance of scores, as indicated by the smaller standard deviation on the posttest. The assessments were targeted at Kirkpatrick's level 2 (knowledge).[30]

Our indirect assessment analysis included students with complete data on both the pre- and postsurveys asking about perceived skills and attitudes on QI/PS (n = 138). We saw significant pre/post increases across all five locally developed subcategories of students' perceived abilities: define QI/PS principles (t137 = −19.86, p < .001); identify key components in a QI case scenario (t137 = −16.20, p < .001); explain the purpose of a fishbone diagram (t137 = −24.37, p < .001); apply a PDSA cycle (t137 = −16.06, p < .001); and create a fishbone diagram for a QI case scenario (t137 = −26.31, p < .001).

Students completed separate curriculum evaluations via OASIS, our records and registration software, targeting Kirkpatrick level 1 (learner satisfaction)[30] each year the workshop was offered (ns = 134, 137). Across the 2 years, the mean overall rating was 75% (SD = .89). Details for individual items are in Table 2. Generally, all items increased from the first iteration to the second iteration of the workshop.
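The paired-samples analysis reported above can be reproduced with standard tools. The sketch below computes a paired t statistic by hand on small synthetic pre/post scores (not the study's data) to illustrate that the test operates on each student's within-person difference score, not on the two group means independently.

```python
# Illustrative paired t test on synthetic pre/post scores (not the study's data).
# A paired design compares each student's posttest with his or her own pretest,
# so the statistic is computed on the within-student difference scores.

from math import sqrt
from statistics import mean, stdev

pre  = [60, 72, 55, 68, 63, 70, 58, 66]   # hypothetical pretest scores (%)
post = [85, 90, 88, 92, 84, 95, 86, 89]   # hypothetical posttest scores (%)

diffs = [b - a for a, b in zip(pre, post)]
n = len(diffs)

# t = mean difference divided by its standard error, with n - 1 degrees of freedom.
t_stat = mean(diffs) / (stdev(diffs) / sqrt(n))

print(f"mean gain = {mean(diffs):.2f} points, t({n - 1}) = {t_stat:.2f}")
```

In practice this would be run once per outcome (total score and each perceived-ability item), with the resulting t and its degrees of freedom reported as in the text (e.g., t135 for 136 matched pairs).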
Table 2.

Workshop Curriculum Evaluation 2016 and 2017 Results

Item(a) | 2016 M | 2016 SD | 2017 M | 2017 SD
1. The aims of this quality improvement/patient safety workshop were clear to me. | 4.05 | 0.71 | 4.31 | 0.69
2. This quality improvement/patient safety workshop dealt with knowledge/skills I needed to learn. | 4.05 | 0.73 | 4.34 | 0.75
3. I have improved my knowledge/skills as a result of this workshop. | 3.90 | 0.78 | 4.07 | 0.81
4. I acquired new knowledge/skills that will be of value during my career. | 3.84 | 0.82 | 4.10 | 0.84
5. The instructors facilitated my understanding. | 3.96 | 0.78 | 4.23 | 0.80
6. The time devoted to the topic was sufficient. | 3.84 | 0.85 | 3.91 | 0.97
7. This stage of the overall curriculum is an appropriate time for this particular topic. | 3.71 | 0.92 | 4.05 | 0.84
8. Organization of course materials, including prepared templates. | 3.53 | 0.87 | 3.91 | 0.95
9. Activity 1: quality improvement cases and fishbone diagrams. | 3.16 | 0.92 | 3.55 | 1.06
10. Activity 2: Mr. Potato Head and PDSA cycles. | 4.12 | 1.05 | 4.14 | 1.00
11. Activity 3: fourth-year student session.(b) | – | – | 3.67 | 1.14
12. Overall quality of quality improvement/patient safety workshop. | 3.65 | 0.90 | 3.83 | 0.88

Abbreviation: PDSA, plan, do, study, act.

(a) Ratings of items 1–7 are on a 5-point scale where 1 = strongly disagree, 2 = disagree, 3 = neutral, 4 = agree, and 5 = strongly agree. Ratings of items 8–12 are on a 5-point scale where 1 = poor, 2 = fair, 3 = good, 4 = very good, and 5 = outstanding.

(b) Activity 3 was offered only in 2017.


Discussion

The required interactive QI/PS workshop for first-year medical students described here provided evidence of increased knowledge and perceived skills in QI/PS principles and incorporated sessions that included didactic, small-group discussion, and hands-on activity components. Our direct assessment indicated student achievement of our educational objectives. In the Mr. Potato Head activity, students demonstrated and described the steps in a PDSA cycle by creating run charts with the number of completed figures and the total number of errors per cycle. Students also had practice discussing what they would change between cycles. In small groups and then in the larger group, students discussed how this exercise could be translated to the health care arena, focusing primarily on teamwork and leadership. In the case discussion session, students identified active and latent causes of error, determined when a root cause analysis was appropriate, and created a fishbone diagram for the presented case. Following this opportunity to practice their skills, there was discussion of systems factors that could contribute to a culture of safety in clinical sites, as well as of the students' role in this system. Our indirect assessment data provided evidence that students' perceived skills in the areas of QI/PS significantly increased after the workshop.

Our first-year students rated the workshop 75%, or a good to very good rating on our 5-point scale. Achieving ratings this favorable has historically been a challenge for QI/PS curricula with early learners. We aim to further improve our QI/PS interactive workshop by making the cases more interactive and adjusting the length of time for each of the sessions. We anticipate shortening the Mr. Potato Head activity, since many of our prior students finished the activity but were not able to move to the case discussion, as that session had not yet finished. However, as evidenced by our curriculum evaluation, our students rated the Mr. Potato Head activity highly (4.14 out of 5.00). Thus, we will focus primarily on improving our logistics and organization rather than removing the activity altogether.

Limitations of our work include that we cannot comment on the effectiveness of the workshop without the foundation of the IHI modules; however, these are freely available for health care students, faculty, and administrators. In addition, we do not believe that one isolated QI/PS workshop is sufficient to provide students with all of the knowledge and skills they need to incorporate these principles into practice. Thus, further content should be integrated throughout the remainder of undergraduate medical education and continue into graduate medical education. In addition, we examined only students' knowledge and skills (PDSA, fishbone diagram, and QIKAT-R) but did not assess changes in attitudes or impact on behavior. Lastly, our posttest was administered 10 days after the workshop had been completed, and therefore, we cannot confirm retention of this information from the end of first year to the start of second year. A longitudinal evaluation of retention of this content would significantly contribute to the field.

We used a locally designed pre/post direct assessment that can be implemented at other institutions. The case presented here came from a local hospital, and potentially identifying information has been removed. However, we do not believe the removal of this information detracts from the relevance of the case. Alternatively, locally designed cases specific to an institution could be used. We expect that medical students in subsequent years would also benefit from this approach, given the hands-on nature of the workshop and its relevance to the medical field. Moreover, this curriculum can be adopted by or adapted to other health care professions.
In the future, we aim to include students from other health care professions, such as nursing, pharmacy, and social work, in the workshop to incorporate an interprofessional component.

Appendices
A. Introduction to QIPS Workshop Overview Lecture.pptx
B. Mr. Potato Head Session Materials.pptx
C. Mr. Potato Head Session Worksheet.pdf
D. Mr. Potato Head Session Faculty Guide.docx
E. Case Discussion Session Presentation.pptx
F. Case Discussion Session Worksheet.docx
G. Discussion With Fourth-Year Medical Student Prompts.docx
H. QIPS Pretest.docx
I. QIPS Posttest.docx
J. Workshop Curriculum Evaluation.docx

All appendices are peer reviewed as integral parts of the Original Publication.
References (15 in total; first 10 shown)

1.  Methodological rigor of quality improvement curricula for physician trainees: a systematic review and recommendations for change.

Authors:  Donna M Windish; Darcy A Reed; Romsai T Boonyasai; Chayan Chakraborti; Eric B Bass
Journal:  Acad Med       Date:  2009-12       Impact factor: 6.893

2.  Republished: Key characteristics of successful quality improvement curricula in physician education: a realist review.

Authors:  Anne C Jones; Scott A Shipman; Greg Ogrinc
Journal:  Postgrad Med J       Date:  2015-02       Impact factor: 2.401

3.  The Quality Improvement Knowledge Application Tool Revised (QIKAT-R).

Authors:  Mamta K Singh; Greg Ogrinc; Karen R Cox; Mary Dolansky; Julie Brandt; Laura J Morrison; Beth Harwood; Greg Petroski; Al West; Linda A Headrick
Journal:  Acad Med       Date:  2014-10       Impact factor: 6.893

4.  The importance of quality improvement education for medical students.

Authors:  Burton Shen; Luba Dumenco; Richard Dollase; Paul George
Journal:  Med Educ       Date:  2016-05       Impact factor: 6.251

5.  The WHO patient safety curriculum guide for medical schools.

Authors:  Merrilyn Walton; Helen Woodward; Samantha Van Staalduinen; C Lemer; F Greaves; D Noble; B Ellis; L Donaldson; B Barraclough
Journal:  Qual Saf Health Care       Date:  2010-12

6.  Teaching quality improvement and patient safety to trainees: a systematic review.

Authors:  Brian M Wong; Edward E Etchells; Ayelet Kuper; Wendy Levinson; Kaveh G Shojania
Journal:  Acad Med       Date:  2010-09       Impact factor: 6.893

7.  Quality improvement in medical education: current state and future directions.

Authors:  Brian M Wong; Wendy Levinson; Kaveh G Shojania
Journal:  Med Educ       Date:  2012-01       Impact factor: 6.251

8.  Web-based education in systems-based practice: a randomized trial.

Authors:  B Price Kerfoot; Paul R Conlin; Thomas Travison; Graham T McMahon
Journal:  Arch Intern Med       Date:  2007-02-26

9.  Planning and implementing a systems-based patient safety curriculum in medical education.

Authors:  David A Thompson; James Cowan; Christine Holzmueller; Albert W Wu; Eric Bass; Peter Pronovost
Journal:  Am J Med Qual       Date:  2008 Jul-Aug       Impact factor: 1.852

10.  Patient safety and quality improvement education: a cross-sectional study of medical students' preferences and attitudes.

Authors:  Claire L Teigland; Rachel C Blasiak; Lindsay A Wilson; Rachel E Hines; Karen L Meyerhoff; Anthony J Viera
Journal:  BMC Med Educ       Date:  2013-02-05       Impact factor: 2.463

