
Building a Shared Mental Model of Competence Across the Continuum: Trainee Perceptions of Subinternships for Residency Preparation.

Johannah M Scheurer1, Cynthia Davey2, Anne G Pereira3, Andrew P J Olson1,4.   

Abstract

INTRODUCTION: Toward a vision of competency-based medical education (CBME) spanning the undergraduate to graduate medical education (GME) continuum, University of Minnesota Medical School (UMMS) developed the Subinternship in Critical Care (SICC) offered across specialties and sites. Explicit course objectives and assessments focus on internship preparedness, emphasizing direct observation of handovers (Core Entrustable Professional Activity, "EPA," 8) and cross-cover duties (EPA 10).
METHODS: To evaluate students' perceptions of the SICC's and other clerkships' effectiveness toward internship preparedness, all 2016 and 2017 UMMS graduates in GME training (n = 440) were surveyed regarding skill development and assessment among Core EPAs 1, 4, 6, 8, 9, 10. Analysis included descriptive statistics plus chi-squared and Kappa agreement tests.
RESULTS: Respondents (n = 147, response rate 33%) rated the SICC as a rotation during which they gained most competence among EPAs both more (#4, 57% rated important; #8, 75%; #10, 70%) and less explicit (#6, 53%; #9, 69%) per rotation objectives. Assessments of EPA 8 (80% rated important) and 10 (76%) were frequently perceived as important toward residency preparedness. Agreement between importance of EPA development and assessment was moderate (Kappa = 0.40-0.59, all surveyed EPAs).
CONCLUSIONS: Graduates' perceptions support the SICC's educational utility and assessments. Based on this and other insight from the SICC, the authors propose implications toward collectively envisioning the continuum of physician competency.
© The Author(s) 2021.


Keywords:  competency-based medical education; direct observation; entrustable professional activity; residency preparedness; subinternship; undergraduate medical education

Year:  2021        PMID: 34988291      PMCID: PMC8721691          DOI: 10.1177/23821205211063350

Source DB:  PubMed          Journal:  J Med Educ Curric Dev        ISSN: 2382-1205


Introduction

Multiple stakeholders seek to best prepare medical school graduates for the transition to residency. Medical school graduates fear they will not be ready for internship,[1,2] and residency program directors share these concerns about their trainees' variable preparedness.[3-6] Undergraduate medical educators must bridge these gaps. Attention toward improved competency-based medical education (CBME) across curriculum, assessment, and advancement criteria will help, and clinical experiences in a required subinternship can enhance students' residency preparedness.

One operational construct for CBME is the Core Entrustable Professional Activities for Entering Residency (Core EPAs) set by the Association of American Medical Colleges (AAMC). Per the 13 Core EPAs, on day 1 of residency interns should be able to provide appropriate, safe care for patients with indirect supervision from a supervising physician. In Canada, a similar endeavor utilizing the EPA framework is underway to "smooth the transition points" between medical school and residency. At least one medical school has transformed its entire undergraduate medical education (UME) curriculum, assessment, and graduation framework around the Core EPAs; most others are in exploratory or early phases of doing so, given multiple challenges and calls for more evidence supporting this transformation.

The University of Minnesota Medical School (UMMS) has taken a stepwise approach to enhancing CBME utilizing the Core EPA framework. Considering the continuum of competency from UME to graduate medical education (GME), in 2015 a new required clerkship, the Subinternship in Critical Care (SICC), was introduced. Garnering perspectives among multiple stakeholders (ie UME and GME educators and trainees) has been critical to clerkship development, implementation, and evaluation.
Historically, medical education has utilized time-based curricula and advancement: after completing required coursework, a student graduates and transitions to residency. Some argue the expectations for medical school curricula set forth by accrediting institutions do not ensure adequate preparation for the expected level of supervision of UME graduates beginning residency. The Core EPAs establish a framework for assessing expected behaviors and abilities of medical students before graduation. An overarching goal is for trainees, educators, and the public to be assured that medical school graduates are entrustable under indirect supervision for all 13 Core EPAs and thus ready to begin internship.

Residency program directors among multiple specialties perceive a lack of adequate preparation for typical duties interns should be able to perform on the first day of residency; specifically, they lack confidence in their trainees' preparedness across all Core EPAs.[4,12] For example, surgical residency program directors have consistently expressed a lack of confidence in new surgical interns' abilities.[3,4] Frayha et al reported that program directors among multiple specialties were least confident in trainee abilities for Core EPAs 3 (recommending/interpreting tests), 4 (orders/prescriptions), 11 (informed consent), 12 (procedures), and 13 (safety culture). Likewise, students nearing the end of UME, and trainees in GME reflecting on their UME experiences, express concern about their own lack of internship preparedness. A report of over 20,000 surveyed internal medicine residents details the importance of skills developed in medical school for residency preparedness. The three skills most commonly rated "very important" were knowing when to seek additional help, task prioritization/efficiency, and communicating around care transitions.
Further, a subinternship was among the clerkships most commonly rated as one of the three most valuable toward residency preparedness. Surgical and other trainees rated their confidence for all 13 Core EPAs higher than did program directors, yet we found no reports of Core EPAs for which students or graduates report full confidence in their performance when starting internship. Frayha et al, whose study also included residents from 17 specialties, showed that residents' confidence was particularly low for Core EPAs 4 (orders/prescriptions), 8 (handovers), 11 (informed consent), 12 (procedures), and 13 (safety culture). We share here our experience with the SICC, specifically UMMS graduates' perceptions of how important the SICC and other clerkships were to their residency preparedness and success. We then contribute our perspective to the vision for ensured development and assessment of competence along the continuum of medical education.

Methods

Given concern regarding adequate internship preparedness and efforts toward operationalizing CBME at UMMS, the SICC was developed and implemented. Development of competence with respect to some Core EPAs had been fairly well established in the UMMS curriculum (eg Core EPA 2—differential diagnosis and Core EPA 6—oral presentation). The SICC was designed to provide opportunities for students' growth in less well-developed Core EPAs, according to gaps identified by faculty plus available reports in the literature, as above. Since development of the SICC, others, including students, have reiterated these gaps.[14,15]

The UMMS SICC maintains the overall expectation that students perform at the level of an intern before clerkship completion (with appropriate logistical restrictions) while caring for patients requiring high-acuity care. More specifically, while students may practice entering orders and prescriptions (Core EPA 4) on other rotations, the SICC is the only rotation during which doing so, and progressing to entrustment at the level of indirect supervision, is an expectation among students and supervisors alike. Students and GME faculty alike desire advanced UME experience within or closely related to a trainee's residency specialty; thus an important feature of the UMMS SICC is that the clerkship is offered among multiple clinical sites and specialties. The sites include medical ICUs (multiple hospitals), surgical ICUs (multiple hospitals), a pediatric ICU, a neonatal ICU, and an advanced general medicine experience.

A required portfolio, including assessment of the Core EPAs specifically emphasized during the SICC given gaps identified among other rotations, must be completed to pass the clerkship and includes:

- At least four direct observations by a supervisor of patient handovers, entrusted at or beyond the level of indirect supervision, using a tool designed by Aylward and colleagues (Core EPA 8).
- Completion of a Teamwork Mini-Clinical Evaluation Exercise with a non-provider team member (Core EPA 9).
- At least four direct observations by a supervisor of patient care cross-cover/night shift duties, entrusted at or beyond the level of indirect supervision (Core EPA 10).
- A personal reflection on a significant event during the clerkship.

The relevance of competence gained during the SICC for GME training is fundamentally important. To evaluate the SICC's educational effectiveness related to internship preparedness according to the graduates themselves, in 2018 we surveyed all UMMS medical school graduates who were in a GME training program from the classes of 2016 and 2017, the first two classes for whom the SICC was a required clerkship. The University of Minnesota Institutional Review Board deemed the project exempt from review. Two of the authors (JMS, APJO) developed an anonymous survey in Qualtrics (Supplemental Material 1) after reviewing the literature and hosting informal discussions with residents and students about their SICC. No previously published surveys or reports on our outcomes of interest were found on review at that time. We asked about both Core EPAs for which SICC objectives are more explicit and unique to the SICC:

- Core EPA 4—enter and discuss orders/prescriptions
- Core EPA 8—give/receive patient handover
- Core EPA 10—recognize patient requiring urgent/emergent care

plus other Core EPAs more broadly part of the SICC and many other clerkships:

- Core EPA 1—history and physical exam
- Core EPA 6—oral presentation of clinical encounter
- Core EPA 9—interprofessional team collaboration

Notably, EPAs, as with all workplace-based assessments, are context-dependent. EPAs more explicit for the SICC were chosen given curricular gaps and commonality across critical care settings; certainly, other EPAs may be more or less emphasized across critical care sites. As such, the first section of questions asked to what extent competence was gained for each of these Core EPAs during specific rotations (all required rotations, including the SICC). For each Core EPA on the survey, respondents could choose up to five clerkships during which they gained most, some, or no competence.

The next section of questions was specific to the SICC and regarded skill development and assessment among the same Core EPAs of interest. The survey was vetted with local experts (SICC clerkship faculty, a group of pediatrics education faculty, and the interim assistant dean of assessment and evaluation) and tested with medicine and pediatrics chief residents; no formal validity evidence was collected for this pilot project. Analysis included descriptive statistics plus chi-square and Kappa agreement tests and was performed using SAS version 9.4 (SAS Institute, Cary, NC).
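The Kappa agreement analysis pairs, per respondent and Core EPA, the dichotomized importance ratings ("very"/"extremely" important vs not) for skill development and for assessment. As a minimal sketch, Cohen's kappa on such a paired 2 × 2 table can be computed as follows; the counts below are hypothetical, not the study data, and the original analysis was run in SAS:

```python
def cohens_kappa(table):
    """Cohen's kappa for a 2x2 agreement table.

    table[i][j] = count of respondents who rated development at level i
    and assessment at level j (0 = important, 1 = not important).
    """
    total = sum(sum(row) for row in table)
    # Observed agreement: proportion of respondents on the diagonal.
    p_obs = sum(table[i][i] for i in range(2)) / total
    # Expected agreement under independence, from the row/column marginals.
    p_exp = sum(
        (sum(table[i]) / total) * (sum(row[i] for row in table) / total)
        for i in range(2)
    )
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical counts for one EPA: 20 rated both development and assessment
# important, 15 rated neither important, 15 disagreed.
kappa = cohens_kappa([[20, 5], [10, 15]])
print(round(kappa, 2))  # -> 0.4, within the "moderate agreement" band (0.40-0.59)
```

By the conventional interpretation used in the paper, values of 0.40 to 0.59 indicate moderate agreement between the paired ratings.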

Results

A total of 147 graduates (33% of the 440 UMMS graduates who matched into GME training in 2016 or 2017, hereafter "graduates") responded to the survey. The anonymous survey respondents' characteristics were compared to an estimated group of nonrespondents (Table 1). The number of nonrespondents for each characteristic was calculated as the total number of graduates for that characteristic minus the sum of the number of survey respondents and a weighted count of missing or prefer-not-to-answer responses (weighted according to the survey response distribution). Chi-square tests were completed to compare survey respondents to estimated nonrespondents. There were no gender differences. Respondents were more likely than nonrespondents to be 2017 versus 2016 graduates (62% vs 41%; P < 0.001). A medical ICU was the most common clinical site for the SICC (48% of graduates), and survey respondents were additionally more likely than nonrespondents to have completed their SICC in a medical ICU (61% vs 42%, P = 0.001). Matched specialty appeared similar among respondents and nonrespondents, though no formal statistical test was appropriate for this comparison given the small cell counts for the less frequently represented specialties.
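The nonrespondent estimate described above (and in Table 1, footnote b) subtracts from the class totals both the respondent counts and the missing responses, the latter distributed proportionally to the respondent distribution. A sketch of that arithmetic and the graduation-year comparison, using the counts reported in Table 1; the hand-rolled 2 × 2 chi-square statistic is an illustration only, as the study's tests were run in SAS:

```python
def estimated_nonrespondents(all_grads, respondents, n_missing):
    """Estimate nonrespondent counts per level (Table 1, footnote b):
    all graduates minus respondents minus missing responses weighted
    by the respondent distribution."""
    resp_total = sum(respondents.values())
    return {
        level: all_grads[level]
               - respondents[level]
               - n_missing * respondents[level] / resp_total
        for level in all_grads
    }

def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Graduation-year counts from Table 1; 29 respondents had missing data.
est = estimated_nonrespondents({"2016": 228, "2017": 212},
                               {"2016": 45, "2017": 73}, 29)
print({k: round(v) for k, v in est.items()})  # -> {'2016': 172, '2017': 121}

# Respondents vs estimated nonrespondents by graduation year (2016, 2017).
stat = chi2_2x2(45, 73, 172, 121)
print(stat > 10.83)  # -> True: exceeds the df=1 critical value for P = .001
```

The recovered nonrespondent counts (172 and 121) match Table 1, and the statistic is consistent with the reported P < 0.001 for graduation year.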
Table 1.

Participant characteristics, Core Entrustable Professional Activities for Entering Residency (Core EPA) development, and assessment survey.

Characteristic | All graduates (n = 440)^a | Survey respondents (n = 147)^a | Nonrespondents^b (n = 293)^a | P value^c
Graduation year, n (%) | | | | <.001
  2016 | 228 (51.8) | 45 (38.1) | 172 (58.7) |
  2017 | 212 (48.2) | 73 (61.9) | 121 (41.3) |
  Missing | 0 | 29 | |
  N for denominator | 440 | 118 | 293 |
Gender, n (%) | | | | .714^d
  Male | 230 (52.3) | 58 (50.9) | 155 (53.0) |
  Female | 210 (47.7) | 56 (49.1) | 138 (47.0) |
  Prefer not to answer | 0 | 5 | |
  Missing | 0 | 28 | |
  N for denominator | 440 | 114 | 293 |
INMD 7900 Subinternship location, n (%) | | | | .001
  General Medicine | 77 (17.5) | 18 (14.8) | 55 (18.9) |
  Medical ICU | 213 (48.4) | 74 (60.7) | 124 (42.3) |
  Surgical ICU | 80 (18.2) | 19 (15.6) | 57 (19.5) |
  Pediatric ICU | 20 (4.5) | 7 (5.7) | 12 (3.9) |
  Neonatal ICU | 30 (6.8) | 4 (3.3) | 25 (8.6) |
  Unknown | 20 (4.5) | 0 (0) | 20 (6.8) |
  Missing | 0 | 25 | |
  N for denominator | 440 | 122 | 293 |
Part of an LIC, n (%) | | | | .093^e
  No | 353 (80.2) | 92 (75.4) | 242 (82.6) |
  Yes | 87 (19.8) | 30 (24.6) | 51 (17.4) |
  Missing | 0 | 25 | |
  N for denominator | 440 | 122 | 293 |
Matched specialty, n (%) | | | | NA^f
  Family Medicine | 83 (19.1) | 25 (20.5) | 53 (18.4) |
  Internal Medicine | 84 (19.4) | 24 (19.7) | 55 (19.2) |
  Pediatrics | 34 (7.8) | 8 (6.6) | 24 (8.5) |
  Emergency Medicine | 29 (6.7) | 8 (6.6) | 19 (6.7) |
  Surgical subspecialty^g | 43 (9.9) | 11 (9.0) | 30 (10.4) |
  Radiology^h | 22 (5.1) | 6 (4.9) | 15 (5.1) |
  Anesthesia | 20 (4.6) | 6 (4.9) | 13 (4.4) |
  Surgery | 26 (6.0) | 5 (4.1) | 20 (7.0) |
  Medicine—Pediatrics | 12 (2.8) | 5 (4.1) | 6 (2.1) |
  Obstetrics and Gynecology | 21 (4.8) | 4 (3.3) | 16 (5.6) |
  Neurology, Child Neurology | 9 (2.1) | 4 (3.3) | 4 (1.5) |
  Psychiatry | 22 (5.1) | 9 (7.4) | 11 (3.9) |
  Dermatology | 7 (1.6) | 1 (0.8) | 6 (2.0) |
  Pathology | 8 (1.8) | 0 (0.0) | 8 (2.8) |
  Other^i | 14 (3.2) | 6 (4.9) | 7 (2.4) |
  Unknown/missing | 6 | 25 | |
  N for denominator | 434 | 122 | 287 |

^a All participant characteristic data was missing for 25 survey respondents who did not complete the entire survey. "Prefer not to answer" and missing responses were not included in totals or denominators for percentage calculations.

^b Estimated nonrespondent n for each characteristic level was calculated as "all graduates n" minus ("respondent n" + "weighted count of missing and/or prefer not to answer"), where missing/prefer-not-to-answer responses were weighted according to the survey respondent distribution.

^c P value for chi-square tests comparing respondents and nonrespondents (estimated as above).

^d Chi-square test limited to those who responded Male or Female.

^e Chi-square test limited to Yes or No responses.

^f Chi-square test not run due to small expected cell counts for some levels of matched specialty.

^g Includes Neurosurgery, Ophthalmology, Orthopedic Surgery, Otolaryngology, Plastic Surgery, Urology.

^h Includes Diagnostic Radiology, Interventional Radiology, Radiation Oncology.

^i Includes Physical Medicine and Rehabilitation, Emergency Medicine—Internal Medicine, Internal Medicine—Psychiatry, Transitional/Preliminary internship only, Other (unspecified).

For the first set of questions about all rotations, graduates frequently selected the SICC, across all surveyed Core EPAs, as one of the five clerkships during which most Core EPA competence was gained, as shown in Figure 1.
The SICC was selected more frequently than any other clerkship as a clerkship during which most competence was gained for Core EPAs 4 (orders/prescriptions, 57% of respondents), 8 (handovers, 75% of respondents), and 9 (interprofessional team, 69% of respondents). The SICC was also very frequently selected for Core EPA 10 (recognizing patient requiring urgent/emergent care, 70% of respondents), though less frequently than for Emergency Medicine (89% of respondents). Notably, three other clerkships were frequently ranked as a clerkship during which most competence was gained for surveyed Core EPAs: Internal Medicine (all six surveyed), Pediatrics (five of six surveyed), and Emergency Medicine (five of six surveyed).
Figure 1.

Skill development—top five clerkships selected for "I gained MOST competence for this EPA in…" Respondents (n = 147) could select up to five clerkships to complete this phrase from a menu of all required clerkships, plus a fill-in choice for elective clerkships, at University of Minnesota Medical School (UMMS) for each Core Entrustable Professional Activities for Entering Residency (Core EPA) surveyed (1, 4, 6, 8, 9, 10). Frequency of responses per clerkship, as a proportion of all respondents, is shown for all required clerkships for each surveyed Core EPA.

For the next section, specific to the SICC, when asked how important their skill development during the SICC was for success in residency, a majority of respondents rated the SICC as "very" or "extremely" important for all surveyed Core EPAs. This was reported somewhat more frequently for Core EPAs 6 (79% of respondents), 8 (81%), and 10 (82%) than for Core EPAs 1, 4, and 9 (63%-75%). The survey also addressed the importance of Core EPA assessment during the SICC toward residency success. Most respondents (at least two-thirds for each Core EPA) rated Core EPA assessment during the SICC as important toward residency preparedness. Notably, the majority of respondents indicated that Core EPA development and assessment were both important for residency preparedness, including for Core EPAs that are (Core EPAs 4, 8, 10) and are not explicit in SICC objectives, plus those that are (Core EPAs 8 and 10) and are not part of required direct observation assessments, as shown in Figure 2. There was moderate agreement between rated importance of Core EPA skill development and assessment during the SICC (Kappa = 0.40-0.59 for all surveyed EPAs).
Figure 2.

Importance of Core Entrustable Professional Activities for Entering Residency (Core EPA) skill development and/or assessment during the Subinternship in Critical Care (SICC). Respondents (n = 147) were asked how important for success in residency was the development (D) and assessment (A) of each surveyed Core EPA (1, 4, 6, 8, 9, 10) during their SICC. The figure displays the percent of total respondents who selected “very” or “extremely” important for skill development and/or assessment or neither.

Finally, Figure 3 shows respondents' perceptions of whether they had been assessed during the SICC for each surveyed Core EPA. While 100% of graduates were required to document assessment via direct observation for Core EPAs 8 and 10 to pass the SICC, only 86% of respondents perceived that assessment was done for Core EPA 8 (handovers) and 72% for Core EPA 10 (cross-cover). The Core EPA for which the fewest respondents (64%) perceived having been assessed during the SICC was EPA 4 (entering orders/prescriptions).
Figure 3.

Graduate perceptions of Core Entrustable Professional Activities for Entering Residency (Core EPA) assessment during the Subinternship in Critical Care (SICC). Respondents (n = 147) were asked whether or not each surveyed Core EPA (1, 4, 6, 8, 9, 10) was assessed during their SICC. The figure displays the percent who responded “yes.”


Discussion

Overall, the results from the graduate survey support the SICC's educational utility, specifically toward internship preparedness as perceived by the trainees themselves. Certainly, the survey results and interpretations have limitations, as detailed below, yet they still provide evidence and impetus for further consideration of shared mental models of assessment among students and myriad assessors alike.

Study Limitations

Some study limitations relate to the low survey response rate. Nonresponse bias is certainly possible and limits the generalizability of results. Of specific note, respondents and nonrespondents differed in graduation year and in the proportion completing the SICC at a medical ICU site. However, GME matched specialty representation was similar among respondents and nonrespondents, which may somewhat limit the possibility of nonresponse bias. The sample size precluded further stratification of findings (eg by SICC site or trainee matched specialty). Nevertheless, these graduates' perceptions of surveyed Core EPA development during the SICC, compared to all required clerkships, strongly indicate the SICC educational opportunities and assessments were important for Core EPA development and residency success. Recall bias, including bias related to respondents' memories of and attitudes toward UMMS after graduation, may have influenced graduates' responses. Another likely factor, specifically related to graduate perceptions of whether Core EPA assessment occurred, is poor agreement between trainee and supervisor on what constitutes an assessment. Finally, findings are further subjective given the very likely variable expectations for intern performance of Core EPAs among the represented GME institutions, specialties, and programs.

The Role of Workplace-Based Assessments in Improving Residency Preparedness

A framework for competency-based assessment of student performance must be inherent in a curriculum planned around the Core EPAs, including input from multiple assessors at multiple time points that encourages direct observation, as in the SICC required portfolio. However, our survey results suggest these graduates did not consistently recall or perceive having been assessed (yes vs no response) on the surveyed Core EPAs (Figure 3). If we asked supervisors "were students assessed for these Core EPAs during the SICC?," we anticipate higher reported yes percentages than from the graduates. For example, students are expected to enter orders for co-signature during the SICC, yet only two-thirds (64%) of students perceived having been assessed on this during the SICC. We have not formally tracked adherence to this expectation; our own perception is that more than 64% of students are doing this. Rather, trainees may not perceive that when a supervisor co-signs their order, the supervisor is assessing the student, whether by signing it unchanged, modifying it, or cancelling it and entering a new order. Students and supervisors clearly hold different mental models of what denotes assessment; in particular, is something without a grade an assessment? Ultimately, the standard end-of-rotation assessments (ie AAMC PCRS evaluations and examination scores) constituting students' academic records are typically summative, and supervisors' day-to-day formative assessments may be lost. Therefore, our findings raise an important consideration: a better tool alone does not equate to better assessment. Assessors must be equipped to use the tool and give feedback, and trainees must be primed for its potentially meaningful use.[20,21] Findings regarding perceptions of what was assessed during the SICC have informed serious deliberation locally toward improving shared mental models of competency assessment among students, supervisors, and UMMS education leadership.

Challenges of a Multispecialty Subinternship

Consistent SICC expectations across specialties and sites have undoubtedly improved upon the previously fragmented, incongruous experiences. Further, in certain specialties students previously had no opportunities aimed specifically at a subinternship experience. However, challenges exist in meeting the aim of providing similar experiences across sites. Differing expectations and opinions among specialties and institutions drive discussion about student assessment, including SICC Core EPA entrustment, during monthly site director meetings. These differences may partly explain the differing perceptions among surveyed graduates. Summative evaluations (an adaptation of the AAMC's Physician Competency Reference Set, "PCRS") are completed as for all UMMS clinical clerkships, and students are required to submit the portfolio items. The vast majority of students receive honors grades for the SICC under the current grading system, raising concern about grade inflation, which has been reported for subinternship experiences. Some faculty believe further discernment among student performance (more normally distributed grades) is important for residency application purposes; for example, defined honors criteria on surgical clerkships have been formulated. Others challenge whether surgical clerkship performance can truly (or even should) be a discrete, quantified grade.[25,26] Likewise, other SICC faculty assert that a more accurate, fair representation of CBME would be a pass/fail grading system, as also called for by others. While the graduate survey findings here and our ongoing local discussion do not delineate a path forward, the need for a shared and distinct definition of competence is evident. Notably, further formal program evaluation, including gathering similar survey information among supervisors (eg clerkship site faculty and trainees) and education leaders (eg rotation and site directors, GME program directors), was not part of this pilot survey project but should be considered.

Conclusions

A Vision for Promoting Competence Along the Continuum

As educators seeking to prepare UME graduates for success in residency and beyond, we have found that our graduates' perceptions regarding Core EPA development and assessment during clerkships at UMMS strengthened our drive toward improved CBME among all clerkships while consistently incorporating the student perspective. We urge others to consider this approach. Based on these graduates' perceptions, integral Core EPA skill development occurred during a limited number of required clerkships. Faculty perceive meaningful educational objectives for every required clerkship, and certainly some of those objectives include development of Core EPAs not surveyed. However, we must examine what makes the frequently cited clerkships more relevant for students than the less frequently cited ones (eg clerkship objectives and expectations, workplace environment, amount and quality of feedback, students' perceived duties and responsibilities on a team). If a clerkship does not build meaningful competence, its inclusion in the curriculum should be reconsidered. A subinternship experience is important toward residency preparedness and, ultimately, success, based on our graduates' perceptions and others'.[13,28-30] Better formative assessment, including multiple direct observations by prepared supervisors, will promote student development toward Core EPA entrustment to perform with indirect supervision before graduation. Yet without shared mental models of assessment, we will still fall short. We must better understand which assessments are important toward promoting students' UME skill development and catalyzing the transition to residency, and then we must equip all trainees to seek and all supervisors to accomplish those assessments.
We envision that an improved, mutual model of formative and summative assessment throughout UME, including during the SICC, will further enhance trainee skill development and lead to better residency preparation as perceived by students and faculty alike.
References (29 in total; first 10 shown)

1.  Toward a common taxonomy of competency domains for the health professions and competencies for physicians.

Authors:  Robert Englander; Terri Cameron; Adrian J Ballard; Jessica Dodge; Janet Bull; Carol A Aschenbrener
Journal:  Acad Med       Date:  2013-08       Impact factor: 6.893

2.  Toward meaningful evaluation of medical trainees: the influence of participants' perceptions of the process. (Review)

Authors:  Christopher J Watling; Lorelei Lingard
Journal:  Adv Health Sci Educ Theory Pract       Date:  2010-02-09       Impact factor: 3.853

3.  Defining "Honors": A Losing Proposition.

Authors:  Stephen M Kavic
Journal:  J Am Coll Surg       Date:  2017-02       Impact factor: 6.113

4.  Defining Honors in the Surgery Clerkship.

Authors:  Jeremy M Lipman; Kimberly D Schenarts
Journal:  J Am Coll Surg       Date:  2016-07-29       Impact factor: 6.113

5.  Criterion-Based Assessment in a Norm-Based World: How Can We Move Past Grades?

Authors:  Anne G Pereira; Majka Woods; Andrew P J Olson; Suzanne van den Hoogenhof; Briar L Duffy; Robert Englander
Journal:  Acad Med       Date:  2018-04       Impact factor: 6.893

6.  Smoothing the Transition Points in Canadian Medical Education.

Authors:  Nick Busing; Jay Rosenfield; Kamal Rungta; Matt Raegele; Andrew Warren; Bruce Wright; Mark Walton; Ivy Oandasan; Anthony Sanfilippo; Anurag Saxena
Journal:  Acad Med       Date:  2018-05       Impact factor: 6.893

7.  Describing the Journey and Lessons Learned Implementing a Competency-Based, Time-Variable Undergraduate Medical Education Curriculum.

Authors:  George C Mejicano; Tracy N Bumsted
Journal:  Acad Med       Date:  2018-03       Impact factor: 6.893

8.  Preparedness for clinical practice: reports of graduating residents at academic health centers.

Authors:  D Blumenthal; M Gokhale; E G Campbell; J S Weissman
Journal:  JAMA       Date:  2001-09-05       Impact factor: 56.272

9.  Guidelines: The do's, don'ts and don't knows of direct observation of clinical skills in medical education.

Authors:  Jennifer R Kogan; Rose Hatala; Karen E Hauer; Eric Holmboe
Journal:  Perspect Med Educ       Date:  2017-10

10.  Program Director Perceptions of Proficiency in the Core Entrustable Professional Activities.

Authors:  R Ellen Pearlman; Melissa Pawelczak; Andrew C Yacht; Salaahuddin Akbar; Gino A Farina
Journal:  J Grad Med Educ       Date:  2017-10
