
Crowdsourced Assessment of Ureteroscopy with Laser Lithotripsy Video Feed Does Not Correlate with Trainee Experience.

Simon L Conti, William Brubaker, Benjamin I Chung, Mario Sofer, Ryan S Hsi, Rajesh Shinghal, Christopher S Elliott, Thomas Caruso, John T Leppert.

Abstract

OBJECTIVES: We sought to validate the use of crowdsourced surgical video assessment in the evaluation of urology residents performing flexible ureteroscopic laser lithotripsy.
METHODS: We collected video feeds from 30 intrarenal ureteroscopic laser lithotripsy cases in which residents, postgraduate year (PGY) two through six, handled the ureteroscope. The video feeds were annotated to represent overall performance and to include the portions of the procedure being scored. Videos were submitted to a commercially available surgical video evaluation platform (Crowd-Sourced Assessment of Technical Skills). We used a validated ureteroscopic laser lithotripsy global assessment tool, modified to include only those domains that could be evaluated on the captured video. Videos were evaluated by crowd workers recruited through Amazon's Mechanical Turk platform and by five endourology-trained experts. Mean scores were calculated, and intraclass correlation coefficients (ICCs) were computed for the expert domain and total scores; ICCs were estimated using a linear mixed-effects model. Spearman rank correlation coefficients were calculated to measure the strength of the relationships between the mean crowd and mean expert scores.
RESULTS: A total of 30 videos were reviewed 2488 times by 487 crowd workers and five expert endourologists. ICCs between expert raters were all below accepted levels of correlation (0.30), with the overall score having an ICC of <0.001. For individual domains, crowd scores did not correlate with expert scores, except for the stone retrieval domain (0.60, p = 0.015). In addition, crowdsourced scores had a negative correlation with PGY level (-0.44, p = 0.019).
CONCLUSIONS: There is poor agreement between experts and poor correlation between expert and crowd scores when evaluating video feeds of ureteroscopic laser lithotripsy. The use of an intraoperative video of ureteroscopy with laser lithotripsy for assessment of resident trainee skills does not appear reliable. This is further supported by the lack of correlation between crowd scores and advancing PGY level.
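Analysis note: the sketch below illustrates, in Python, how an ICC can be estimated from a linear mixed-effects model and how a Spearman rank correlation can be computed, as outlined in the METHODS. This is not the authors' code; the library choices (statsmodels, scipy), column names, and numeric values are illustrative assumptions only.

import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import spearmanr

# Long-format expert ratings: one row per (video, expert) pair.
# All values are made up for illustration.
expert_long = pd.DataFrame({
    "video_id":     [1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4],
    "expert_score": [18, 20, 15, 22, 25, 21, 12, 14, 16, 19, 23, 20],
})

# Random-intercept mixed-effects model: scores vary around a grand mean,
# with a random effect for each video.
fit = smf.mixedlm("expert_score ~ 1", expert_long,
                  groups=expert_long["video_id"]).fit(reml=True)

# ICC = between-video variance / (between-video + residual variance).
between_var = float(fit.cov_re.iloc[0, 0])
icc = between_var / (between_var + fit.scale)
print(f"ICC: {icc:.3f}")

# Per-video mean scores: crowd means are made-up values, expert means
# are averaged from the ratings above.
crowd_mean = [3.8, 4.1, 2.9, 3.5]
expert_mean = expert_long.groupby("video_id")["expert_score"].mean()

# Spearman rank correlation between the two sets of mean scores.
rho, p_value = spearmanr(crowd_mean, expert_mean)
print(f"Spearman rho: {rho:.2f}, p = {p_value:.3f}")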

Entities:  

Keywords:  assessment; crowd sourcing; kidney stones; laser lithotripsy; ureteroscopy

Mesh:

Year:  2018        PMID: 30450963     DOI: 10.1089/end.2018.0534

Source DB:  PubMed          Journal:  J Endourol        ISSN: 0892-7790            Impact factor:   2.942


Related articles (2 in total)

1.  Crowdsourced Assessment of Surgical Skill Proficiency in Cataract Surgery.

Authors:  Grace L Paley; Rebecca Grove; Tejas C Sekhar; Jack Pruett; Michael V Stock; Tony N Pira; Steven M Shields; Evan L Waxman; Bradley S Wilson; Mae O Gordon; Susan M Culican
Journal:  J Surg Educ       Date:  2021-02-25       Impact factor: 2.891

2.  Is Female Wellness Affected When Men Blame Them for Erectile Dysfunction?

Authors:  Justin M Dubin; W Austin Wyant; Navin C Balaji; Iakov V Efimenko; Quinn C Rainer; Belen Mora; Lisa Paz; Ashley G Winter; Ranjith Ramasamy
Journal:  Sex Med       Date:  2021-05-29       Impact factor: 2.491

