Arun Nemani1, Uwe Kruger1, Clairice A Cooper2, Steven D Schwaitzberg2, Xavier Intes1, Suvranu De3. 1. Rensselaer Polytechnic Institute, 110 8th Street, Troy, NY 12180, USA. 2. University at Buffalo School of Medicine and Biomedical Sciences, Buffalo, NY 14228, USA. 3. Rensselaer Polytechnic Institute, 110 8th Street, Troy, NY 12180, USA. des@rpi.edu.
Abstract
BACKGROUND: Physical and virtual surgical simulators are increasingly used to train technical surgical skills. However, metrics such as completion time and subjective performance checklists often correlate poorly with the transfer of skills to clinical settings. We hypothesize that non-invasive brain imaging can objectively differentiate and classify surgical skill transfer, with higher accuracy than established metrics, based on subjects' motor skill levels. STUDY DESIGN: Eighteen medical students at the University at Buffalo were randomly assigned to control, physical surgical trainer, or virtual trainer groups. The training groups practiced a technical surgical task on their respective simulators for 12 consecutive days. To measure post-training skill transfer, all subjects performed the task in an ex vivo environment. Cortical activation was measured using functional near-infrared spectroscopy (fNIRS) in the prefrontal cortex, primary motor cortex, and supplementary motor area, regions directly implicated in motor skill learning. RESULTS: Classification of simulator-trained versus untrained subjects based on traditional metrics is poor, with misclassification errors ranging from 20% to 41%. Conversely, fNIRS metrics successfully classify physically or virtually trained subjects from untrained subjects with misclassification errors of 2.2% and 8.9%, respectively. More importantly, untrained subjects are successfully classified from physical or virtual simulator-trained subjects with misclassification errors of 2.7% and 9.1%, respectively. CONCLUSION: fNIRS metrics are significantly more accurate than currently established metrics in classifying levels of surgical motor skill transfer. Our approach brings robustness, objectivity, and accuracy to validating the effectiveness of future surgical trainers in transferring surgical skills to clinically relevant environments.