BACKGROUND: Objective quantification of surgical skill is imperative as healthcare moves toward quality improvement and performance-based reimbursement. The gold-standard assessment tools are infrequently used because they are time-intensive, costly, and lack standardized practices. We hypothesized that valid performance scores of surgical skill can be obtained through crowdsourcing.
METHODS: Twelve surgeons of varying robotic surgical experience performed live porcine robot-assisted urinary bladder closures. Blinded video recordings of the performances were scored by expert surgeon graders and by crowd workers recruited through Amazon's Mechanical Turk, using the Global Evaluative Assessment of Robotic Skills (GEARS) tool, which assesses five technical skill domains. Seven expert graders and 50 unique Mechanical Turk workers (each paid $0.75 per survey) evaluated each video. Global assessment scores were analyzed for correlation and agreement.
RESULTS: Six hundred Mechanical Turk workers completed the surveys in under 5 hours, whereas the seven surgeon graders took 14 days. Video clip durations ranged from 2 to 11 minutes. The correlation coefficient between the crowd workers' and expert graders' scores was 0.95, and Cronbach's alpha was 0.93. Inter-rater reliability among the surgeon graders was 0.89.
CONCLUSION: Crowdsourced assessment of surgical skill yielded rapid, inexpensive agreement with the global performance scores given by expert surgeon graders. Crowdsourcing may provide surgical educators and medical institutions with a virtually unlimited pool of procedural skill assessors, enabling efficient quantification of technical skill for trainee advancement and hospital quality improvement.
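The agreement statistics reported in the RESULTS (a Pearson correlation between mean crowd and mean expert scores, and Cronbach's alpha across raters) can be sketched in plain Python. The score values below are hypothetical GEARS totals invented for illustration, not the study's data:

```python
from statistics import pvariance, mean

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length score lists."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def cronbach_alpha(ratings):
    """Cronbach's alpha for internal consistency.

    ratings: one list per rater, each holding that rater's score
    for every video (same video order for all raters).
    """
    k = len(ratings)
    item_vars = sum(pvariance(r) for r in ratings)        # per-rater variance
    totals = [sum(col) for col in zip(*ratings)]          # summed score per video
    return (k / (k - 1)) * (1 - item_vars / pvariance(totals))

# Hypothetical mean GEARS totals for six videos (range 5-25):
crowd  = [12.4, 18.1, 20.3, 15.2, 22.0, 16.8]
expert = [11.9, 18.5, 20.0, 14.8, 22.6, 17.1]
print(f"Pearson r: {pearson_r(crowd, expert):.3f}")

# Hypothetical per-video scores from three individual raters:
raters = [[12, 18, 20, 15, 22, 17],
          [13, 19, 20, 14, 23, 16],
          [12, 17, 21, 15, 22, 18]]
print(f"Cronbach's alpha: {cronbach_alpha(raters):.3f}")
```

Note that in practice the study aggregated 50 crowd ratings per video before comparison; the sketch above only shows the form of the two statistics, not the study's aggregation pipeline.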