Nava Aghdasi1, Randall Bly2, Lee W White3, Blake Hannaford4, Kris Moe2, Thomas S Lendvay5. 1. Department of Electrical Engineering, University of Washington, Seattle, Washington. Electronic address: navaa@u.washington.edu. 2. Department of Otolaryngology - Head and Neck Surgery, University of Washington, Seattle, Washington. 3. M.D.C. School of Medicine, Stanford University, Stanford, California. 4. Department of Electrical Engineering, University of Washington, Seattle, Washington. 5. Department of Urology, Seattle Children's Hospital, Seattle, Washington.
Abstract
BACKGROUND: Objective assessment of surgical skills is resource intensive and requires valuable time from expert surgeons. The goal of this study was to assess the ability of a large group of laypersons, using a crowd-sourcing tool, to grade a surgical procedure (cricothyrotomy) performed on a simulator. Grading covered the entire procedure, completed through an objective assessment of technical skills survey.

MATERIALS AND METHODS: Two groups of graders were recruited: (1) Amazon Mechanical Turk users and (2) three expert surgeons from the University of Washington Department of Otolaryngology. Graders were shown videos of participants performing the procedure on the simulator and were asked to grade each video using the objective assessment of technical skills questions. Mechanical Turk users were paid $0.50 for each completed survey. Obtaining all responses from 30 Mechanical Turk users for 26 training participants (26 videos/tasks) took 10 h, whereas the three expert surgeons took 60 d to complete the same 26 tasks.

RESULTS: The assessment of surgical performance by the group of laypersons (n = 30) matched that of the group of expert surgeons (n = 3), with a good level of agreement (Cronbach alpha coefficient = 0.83).

CONCLUSIONS: We found crowd-sourcing to be an efficient, accurate, and inexpensive method for skills assessment, with a good level of agreement with experts' grading.
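The agreement statistic reported above is the Cronbach alpha coefficient. As a minimal sketch of how it is computed from a ratings matrix (rows = videos, columns = graders), the function and example scores below are hypothetical and not taken from the study's data:

```python
from statistics import variance  # sample variance (n - 1 denominator)

def cronbach_alpha(ratings):
    """Cronbach's alpha for a ratings matrix.

    ratings: list of rows, one row per graded video, one column per grader.
    """
    k = len(ratings[0])                        # number of graders (items)
    columns = list(zip(*ratings))              # per-grader score lists
    item_vars = sum(variance(col) for col in columns)
    total_var = variance([sum(row) for row in ratings])
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical example: 5 videos scored by 3 graders on a 1-5 scale.
scores = [
    [4, 5, 4],
    [2, 2, 3],
    [5, 5, 5],
    [3, 3, 2],
    [4, 4, 4],
]
print(round(cronbach_alpha(scores), 2))  # → 0.94
```

Alpha near 1 indicates the graders' scores rise and fall together across videos; the study's value of 0.83 is conventionally read as good internal consistency.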