BACKGROUND: Eye-tracking technology has been shown to improve trainee performance in the aviation industry, radiology, and surgery. The ability to track a supervisor's point-of-regard and reflect it onto a subject's laparoscopic screen to aid instruction during a simulated task is attractive, particularly given the multilingual makeup of modern surgical teams and the development of collaborative surgical techniques. We aimed to develop a bespoke interface to project a supervisor's point-of-regard onto a subject's laparoscopic screen and to investigate whether the supervisor's eye-gaze could be used as a tool to aid identification of a target during a simulated surgical task. METHODS: We developed software to project a supervisor's point-of-regard onto a subject's screen whilst the subject undertook surgically related laparoscopic tasks. Twenty-eight subjects with varying levels of operative experience and proficiency in English undertook a series of surgically relevant laparoscopic tasks. Subjects were instructed with verbal cues (V), a cursor reflecting the supervisor's eye-gaze (E), or both (VE). Performance metrics included time to complete tasks, eye-gaze latency, and number of errors. RESULTS: Completion times and number of errors were significantly reduced when eye-gaze instruction was employed (VE, E). In addition, the time taken for the subject to correctly fixate on the target (latency) was significantly reduced. CONCLUSIONS: We have demonstrated the effectiveness of a novel framework that projects a supervisor's eye-gaze onto a trainee's laparoscopic screen. Furthermore, we have shown that using eye-tracking technology to provide visual instruction improves completion times and reduces errors in a simulated environment. Although this technology requires significant development, the potential applications are wide-ranging.
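The abstract does not specify how the interface or the latency metric were implemented. A minimal sketch of the two core computations such a system would need, under stated assumptions, might look as follows: mapping normalized gaze coordinates from a tracker onto the laparoscopic display, and measuring eye-gaze latency as the time from an instruction cue until the subject's gaze first lands within a tolerance radius of the target. All function names, the coordinate convention, and the sample format here are hypothetical, not taken from the study.

```python
import math

def map_gaze_to_screen(gx, gy, screen_w, screen_h):
    """Map normalized gaze coordinates (0..1, hypothetical tracker output)
    to pixel coordinates on the laparoscopic display."""
    return (int(gx * screen_w), int(gy * screen_h))

def gaze_latency(samples, target, radius, cue_time):
    """Return seconds from the instruction cue until the subject's gaze
    first falls within `radius` pixels of `target`, or None if it never does.

    `samples` is a chronological list of (timestamp_s, x_px, y_px) tuples.
    """
    tx, ty = target
    for t, x, y in samples:
        if t >= cue_time and math.hypot(x - tx, y - ty) <= radius:
            return t - cue_time
    return None

# Hypothetical usage: gaze reaches the target region 1.0 s after the cue.
cursor = map_gaze_to_screen(0.5, 0.5, 1024, 768)          # (512, 384)
trace = [(0.0, 100, 100), (0.5, 300, 300), (1.2, 512, 384)]
latency = gaze_latency(trace, target=(512, 384), radius=30, cue_time=0.2)
```

In a live system the supervisor's mapped gaze point would be drawn as the cursor overlay described in METHODS, while the latency function would be applied offline to the recorded gaze trace of the subject.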