
Cracking the code: residents' interpretations of written assessment comments.

Shiphra Ginsburg, Cees PM van der Vleuten, Kevin W Eva, Lorelei Lingard.

Abstract

CONTEXT: Interest is growing in the use of qualitative data for assessment. Written comments on residents' in-training evaluation reports (ITERs) can be reliably rank-ordered by faculty attendings, who are adept at interpreting these narratives. However, if residents do not interpret assessment comments in the same way, a valuable educational opportunity may be lost.
OBJECTIVES: Our purpose was to explore residents' interpretations of written assessment comments using mixed methods.
METHODS: Twelve internal medicine (IM) postgraduate year 2 (PGY2) residents were asked to rank-order a set of anonymised PGY1 residents (n = 48) from a previous year in IM based solely on their ITER comments. Each PGY1 was ranked by four PGY2s; generalisability theory was used to assess inter-rater reliability. The PGY2s were then interviewed separately about their rank-ordering process, how they made sense of the comments and how they viewed ITERs in general. Interviews were analysed using constructivist grounded theory.
RESULTS: Across four PGY2 residents, the G coefficient was 0.84; for a single resident it was 0.56. Resident rankings correlated extremely well with faculty member rankings (r = 0.90). Residents were equally adept at reading between the lines to construct meaning from the comments and used language cues in ways similar to those reported for faculty attendings. Participants discussed the difficulties of interpreting vague language and offered perspectives on why they thought it occurred (time, discomfort, memorability and the permanency of written records). They emphasised the importance of face-to-face discussions, the relative value of comments over scores, staff-dependent variability of assessment and the perceived purpose and value of ITERs. They saw particular value in opportunities to review an aggregated set of comments.
CONCLUSIONS: Residents understood the 'hidden code' in assessment language and their ability to rank-order residents based on comments matched that of faculty. Residents seemed to accept staff-dependent variability as a reality. These findings add to the growing evidence that supports the use of narrative comments and subjectivity in assessment.
© 2017 John Wiley & Sons Ltd and The Association for the Study of Medical Education.


Year:  2017        PMID: 28093833     DOI: 10.1111/medu.13158

Source DB:  PubMed          Journal:  Med Educ        ISSN: 0308-0110            Impact factor:   6.251


Related articles: 13 in total

1.  Milestone Implementation's Impact on Narrative Comments and Perception of Feedback for Internal Medicine Residents: a Mixed Methods Study.

Authors:  Sonja E Raaum; Katie Lappe; Jorie M Colbert-Getz; Caroline K Milne
Journal:  J Gen Intern Med       Date:  2019-06       Impact factor: 5.128

2.  Learning Analytics in Medical Education Assessment: The Past, the Present, and the Future.

Authors:  Teresa Chan; Stefanie Sebok-Syer; Brent Thoma; Alyssa Wise; Jonathan Sherbino; Martin Pusic
Journal:  AEM Educ Train       Date:  2018-03-22

3.  Clinical Instructors' Perceptions of Internationally Educated Physical Therapists' Readiness to Practise during Supervised Clinical Internships in a Bridging Programme.

Authors:  Michael E Kalu; Sharon Switzer-McIntyre; Martine Quesnel; Catherine Donnelly; Kathleen E Norman
Journal:  Physiother Can       Date:  2021       Impact factor: 1.037

4.  Concordance of Narrative Comments with Supervision Ratings Provided During Entrustable Professional Activity Assessments.

Authors:  Andrew S Parsons; Kelley Mark; James R Martindale; Megan J Bray; Ryan P Smith; Elizabeth Bradley; Maryellen Gusic
Journal:  J Gen Intern Med       Date:  2022-06-16       Impact factor: 6.473

5.  Advancing Our Understanding of Narrative Comments Generated by Direct Observation Tools: Lessons From the Psychopharmacotherapy-Structured Clinical Observation.

Authors:  John Q Young; Rebekah Sugarman; Eric Holmboe; Patricia S O'Sullivan
Journal:  J Grad Med Educ       Date:  2019-10

6.  Evaluation of a National Competency-Based Assessment System in Emergency Medicine: A CanDREAM Study.

Authors:  Brent Thoma; Andrew K Hall; Kevin Clark; Nazanin Meshkat; Warren J Cheung; Pierre Desaulniers; Cheryl Ffrench; Allison Meiwald; Christine Meyers; Catherine Patocka; Lorri Beatty; Teresa M Chan
Journal:  J Grad Med Educ       Date:  2020-08

7.  Workplace-based Assessment Data in Emergency Medicine: A Scoping Review of the Literature.

Authors:  Teresa M Chan; Stefanie S Sebok-Syer; Warren J Cheung; Martin Pusic; Christine Stehman; Michael Gottlieb
Journal:  AEM Educ Train       Date:  2020-11-05

8.  The impact of patient feedback on the medical performance of qualified doctors: a systematic review.

Authors:  Rebecca Baines; Sam Regan de Bere; Sebastian Stevens; Jamie Read; Martin Marshall; Mirza Lalani; Marie Bryce; Julian Archer
Journal:  BMC Med Educ       Date:  2018-07-31       Impact factor: 2.463

9.  Developing a video-based method to compare and adjust examiner effects in fully nested OSCEs.

Authors:  Peter Yeates; Natalie Cope; Ashley Hawarden; Hannah Bradshaw; Gareth McCray; Matt Homer
Journal:  Med Educ       Date:  2018-12-21       Impact factor: 6.251

10.  Competencies and Feedback on Internal Medicine Residents' End-of-Rotation Assessments Over Time: Qualitative and Quantitative Analyses.

Authors:  Ara Tekian; Yoon Soo Park; Sarette Tilton; Patrick F Prunty; Eric Abasolo; Fred Zar; David A Cook
Journal:  Acad Med       Date:  2019-12       Impact factor: 6.893

