
Concordance Between Expert and Nonexpert Ratings of Condensed Video-Based Trainee Operative Performance Assessment.

Rebecca E Scully1, Shanley B Deal2, Michael J Clark3, Katherine Yang3, Greg Wnuk4, Douglas S Smink1, Jonathan P Fryer5, Jordan D Bohnen6, Ezra N Teitelbaum5, Shari L Meyerson7, Andreas H Meier8, Paul G Gauger4, Rishindra M Reddy4, Daniel E Kendrick9, Michael Stern10, David T Hughes10, Jeffrey G Chipman11, Jitesh A Patel7, Adnan Alseidi2, Brian C George4.   

Abstract

OBJECTIVE: We examined the effects of video editing and of rater expertise in surgical resident evaluation on operative performance ratings of surgical trainees.
DESIGN: Randomized independent review of intraoperative video.
SETTING: Operative video was captured at a single tertiary hospital in Boston, MA.
PARTICIPANTS: Video was recorded of 6 attending-trainee dyads performing 6 common general surgery procedures. Full-length and condensed versions (n = 12 videos) were then reviewed by 13 independent surgeon raters (5 evaluation experts, 8 nonexperts) using a crossed design. Trainee performance was rated using the Operative Performance Rating Scale, the System for Improving and Measuring Procedural Learning (SIMPL) Performance scale, the Zwisch scale, and the ten Cate scale. These ratings were standardized before being compared using Bayesian mixed models, with raters and videos treated as random effects.
RESULTS: Editing had no effect on the Operative Performance Rating Scale Overall Performance (-0.10, p = 0.30), SIMPL Performance (0.13, p = 0.71), Zwisch (-0.12, p = 0.27), or ten Cate scale ratings (-0.13, p = 0.29). Similarly, rater expertise (evaluation expert vs. nonexpert) had no effect on the same scales (-0.16 (p = 0.32), 0.18 (p = 0.74), 0.25 (p = 0.81), and 0.25 (p = 0.17), respectively).
CONCLUSIONS: Operative performance assessment scores differ little whether raters view condensed or full-length videos, and whether or not the raters are experts in surgical resident evaluation. Future validation studies of operative performance assessment scales may be facilitated by using nonexpert surgeon raters viewing videos condensed according to a standardized protocol.
Copyright © 2020 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

Keywords:  assessment; operative evaluation; residency training; video

Year:  2020        PMID: 32201143     DOI: 10.1016/j.jsurg.2019.12.016

Source DB:  PubMed          Journal:  J Surg Educ        ISSN: 1878-7452            Impact factor:   2.891


  2 in total

1.  The association between video-based assessment of intraoperative technical performance and patient outcomes: a systematic review.

Authors:  Saba Balvardi; Anitha Kammili; Melissa Hanson; Carmen Mueller; Melina Vassiliou; Lawrence Lee; Kevin Schwartzman; Julio F Fiore; Liane S Feldman
Journal:  Surg Endosc       Date:  2022-05-12       Impact factor: 4.584

2.  Multicentric validation of EndoDigest: a computer vision platform for video documentation of the critical view of safety in laparoscopic cholecystectomy.

Authors:  Pietro Mascagni; Deepak Alapatt; Giovanni Guglielmo Laracca; Ludovica Guerriero; Andrea Spota; Claudio Fiorillo; Armine Vardazaryan; Giuseppe Quero; Sergio Alfieri; Ludovica Baldari; Elisa Cassinotti; Luigi Boni; Diego Cuccurullo; Guido Costamagna; Bernard Dallemagne; Nicolas Padoy
Journal:  Surg Endosc       Date:  2022-02-16       Impact factor: 4.584

