
Video assessment of laparoscopic skills by novices and experts: implications for surgical education.

Celine Yeung1, Brian Carrillo2,3, Victor Pope4, Shahob Hosseinpour1, J Ted Gerstle2,3, Georges Azzie5,6.   

Abstract

BACKGROUND: Previous investigators have shown that novices are able to assess surgical skills as reliably as expert surgeons. The purpose of this study was to determine how novices and experts arrive at these graded scores when assessing laparoscopic skills and the potential implications this may have for surgical education.
METHODS: Four novices and four general laparoscopic surgeons evaluated 59 videos of a suturing task using a 5-point scale. Average novice and expert evaluator scores for each video and the average number of times that scores were changed were compared. Intraclass correlation coefficients were used to determine inter-rater and test-retest reliability. Evaluators were asked to define the number of videos they needed to watch before they could confidently grade and to describe how they were able to distinguish between different levels of expertise.
RESULTS: There were no significant differences in the mean scores assigned by the two evaluator groups. Novices changed their scores more frequently than experts, but this difference did not reach statistical significance. There was excellent inter-rater reliability between the two groups (ICC = 0.91, CI 0.85-0.95) and good test-retest reliability (ICC > 0.83). On average, novices and experts reported needing to watch 13.8 ± 2.4 and 8.5 ± 2.5 videos, respectively, before they could grade confidently. Both groups also identified similar qualitative indicators of skill (e.g., instrument control).
CONCLUSION: Evaluators with varying levels of expertise can reliably grade performance of an intracorporeal suturing task. Although novices were less confident in their grading, both groups assigned comparable scores and identified similar elements of suturing skill as important for assessment.

Keywords:  Laparoscopic; Novice evaluators; Suturing skill; Video assessment

Year:  2017        PMID: 28205036     DOI: 10.1007/s00464-017-5417-0

Source DB:  PubMed          Journal:  Surg Endosc        ISSN: 0930-2794            Impact factor:   4.584


References: 16 in total (first 10 shown)

1.  Measuring operative performance after laparoscopic skills training: edited videotape versus direct observation.

Authors:  D J Scott; R V Rege; P C Bergen; W A Guo; R Laycock; S T Tesfay; R J Valentine; D B Jones
Journal:  J Laparoendosc Adv Surg Tech A       Date:  2000-08       Impact factor: 1.878

2.  Toward reliable operative assessment: the reliability and feasibility of videotaped assessment of laparoscopic technical skills.

Authors:  D Dath; G Regehr; D Birch; C Schlachta; E Poulin; J Mamazza; R Reznick; H M MacRae
Journal:  Surg Endosc       Date:  2004-10-26       Impact factor: 4.584

3.  The impending shortage and the estimated cost of training the future surgical workforce.

Authors:  Thomas E Williams; Bhagwan Satiani; Andrew Thomas; E Christopher Ellison
Journal:  Ann Surg       Date:  2009-10       Impact factor: 12.969

4.  Group assessments of resident physicians improve reliability and decrease halo error.

Authors:  Matthew R Thomas; Thomas J Beckman; Karen F Mauck; Stephen S Cha; Kris G Thomas
Journal:  J Gen Intern Med       Date:  2011-03-03       Impact factor: 5.128

5.  Objective structured assessment of technical skill (OSATS) for surgical residents.

Authors:  J A Martin; G Regehr; R Reznick; H MacRae; J Murnaghan; C Hutchison; M Brown
Journal:  Br J Surg       Date:  1997-02       Impact factor: 6.939

6.  Crowd-Sourced Assessment of Technical Skills: Differentiating Animate Surgical Skill Through the Wisdom of Crowds.

Authors:  Daniel Holst; Timothy M Kowalewski; Lee W White; Timothy C Brand; Jonathan D Harper; Mathew D Sorensen; Mireille Truong; Khara Simpson; Alyssa Tanaka; Roger Smith; Thomas S Lendvay
Journal:  J Endourol       Date:  2015-05-26       Impact factor: 2.942

7.  Crowd-Sourced Assessment of Technical Skill: A Valid Method for Discriminating Basic Robotic Surgery Skills.

Authors:  Lee W White; Timothy M Kowalewski; Rodney Lee Dockter; Bryan Comstock; Blake Hannaford; Thomas S Lendvay
Journal:  J Endourol       Date:  2015-08-24       Impact factor: 2.942

8.  A study of crowdsourced segment-level surgical skill assessment using pairwise rankings.

Authors:  Anand Malpani; S Swaroop Vedula; Chi Chiung Grace Chen; Gregory D Hager
Journal:  Int J Comput Assist Radiol Surg       Date:  2015-06-30       Impact factor: 2.924

9.  Crowd-Sourced Assessment of Technical Skills: a novel method to evaluate surgical performance.

Authors:  Carolyn Chen; Lee White; Timothy Kowalewski; Rajesh Aggarwal; Chris Lintott; Bryan Comstock; Katie Kuksenok; Cecilia Aragon; Daniel Holst; Thomas Lendvay
Journal:  J Surg Res       Date:  2013-10-10       Impact factor: 2.192

10.  Crowd-sourced assessment of surgical skills in cricothyrotomy procedure.

Authors:  Nava Aghdasi; Randall Bly; Lee W White; Blake Hannaford; Kris Moe; Thomas S Lendvay
Journal:  J Surg Res       Date:  2015-03-18       Impact factor: 2.192

Reviews: 2 in total

1.  Simulation platforms to assess laparoscopic suturing skills: a scoping review.

Authors:  Elif Bilgic; Motaz Alyafi; Tomonori Hada; Tara Landry; Gerald M Fried; Melina C Vassiliou
Journal:  Surg Endosc       Date:  2019-05-14       Impact factor: 4.584

2.  A scoping review of assessment tools for laparoscopic suturing.

Authors:  Elif Bilgic; Satoshi Endo; Ekaterina Lebedeva; Madoka Takao; Katherine M McKendy; Yusuke Watanabe; Liane S Feldman; Melina C Vassiliou
Journal:  Surg Endosc       Date:  2018-05-03       Impact factor: 4.584


北京卡尤迪生物科技股份有限公司 © 2022-2023.