
Device- and system-independent personal touchless user interface for operating rooms: one personal UI to control all displays in an operating room.

Meng Ma1,2, Pascal Fallavollita3, Séverine Habert3, Simon Weidert4, Nassir Navab3,5.   

Abstract

INTRODUCTION: In the modern operating room, the surgeon performs surgery with the support of different medical systems that display patient information, physiological data, and medical images. Numerous interactions must be performed by the surgical team to control each medical system and retrieve the desired information. Joysticks and physical keys are still present in the operating room due to the disadvantages of mice, and surgeons often relay instructions to the surgical team when they require information from a specific medical system. In this paper, a novel user interface is developed that allows the surgeon to personally perform touchless interaction with the various medical systems and to switch effortlessly among them, all without modifying the systems' software or hardware.
METHODS: To achieve this, a wearable RGB-D sensor is mounted on the surgeon's head for inside-out tracking of his/her finger relative to any of the medical systems' displays. Android devices running a dedicated application are connected to the computers on which the medical systems run, each simulating a standard USB mouse and keyboard. When the surgeon interacts using pointing gestures, the desired cursor position on the targeted medical system display, together with the gestures, is transformed into generic events and sent to the corresponding Android device. Finally, the application running on the Android device generates the matching mouse or keyboard events for the targeted medical system.
RESULTS AND CONCLUSION: To simulate an operating room setting, our user interface was tested by seven medical participants who performed several interactions with visualizations of CT, MRI, and fluoroscopy images at varying distances. Results from the System Usability Scale and the NASA-TLX workload index indicated strong acceptance of the proposed user interface.
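The event pipeline described in the METHODS section (pointing gesture → generic event → emulated mouse/keyboard event on the targeted medical system) can be sketched as follows. This is an illustrative reconstruction only, not the authors' code: the gesture names, the event format, and both helper functions are assumptions.

```python
# Sketch of the interaction pipeline from the METHODS section: a fingertip
# position (normalized to the targeted display) plus a recognized gesture is
# translated into a device-independent event; an Android device emulating a
# USB mouse would then replay it on the medical system's computer.
# All names and the event format here are assumptions, not the authors' API.

def to_generic_event(display_id, norm_x, norm_y, gesture):
    """Map a pointing gesture on a display to a device-independent event."""
    gesture_map = {
        "point": "move",        # fingertip hovers -> cursor move
        "tap": "left_click",    # tap/dwell gesture -> primary click
        "pinch": "right_click", # alternate gesture -> secondary click
    }
    if gesture not in gesture_map:
        raise ValueError(f"unknown gesture: {gesture}")
    return {
        "display": display_id,
        "x": norm_x,  # normalized [0, 1] across the display width
        "y": norm_y,  # normalized [0, 1] across the display height
        "action": gesture_map[gesture],
    }

def to_mouse_event(event, screen_w, screen_h):
    """Android side: convert a generic event into pixel coordinates for the
    emulated USB mouse attached to the targeted medical system."""
    return (round(event["x"] * (screen_w - 1)),
            round(event["y"] * (screen_h - 1)),
            event["action"])

ev = to_generic_event("fluoro_display", 0.5, 0.25, "tap")
print(to_mouse_event(ev, 1920, 1080))  # -> (960, 270, 'left_click')
```

Because the generic event carries only normalized coordinates and an abstract action, the same surgeon-side tracking code can drive displays of any resolution; only the Android-side translation needs to know the targeted screen's geometry.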


Keywords:  Finger pointing gesture; Multimodal interaction; Operating room; User interface


Year:  2016        PMID: 26984551     DOI: 10.1007/s11548-016-1375-6

Source DB:  PubMed          Journal:  Int J Comput Assist Radiol Surg        ISSN: 1861-6410            Impact factor:   2.924


  References: 7 in total

1.  A non-contact mouse for surgeon-computer interaction.

Authors:  C Grätzel; T Fong; S Grange; C Baur
Journal:  Technol Health Care       Date:  2004       Impact factor: 1.285

2.  Learning gestures for customizable human-computer interaction in the operating room.

Authors:  Loren Arthur Schwarz; Ali Bigdelou; Nassir Navab
Journal:  Med Image Comput Comput Assist Interv       Date:  2011

3.  You can't touch this: touch-free navigation through radiological images.

Authors:  Lars C Ebert; Gary Hatch; Garyfalia Ampanozi; Michael J Thali; Steffen Ross
Journal:  Surg Innov       Date:  2011-11-06       Impact factor: 2.058

4.  Using a depth-sensing infrared camera system to access and manipulate medical imaging from within the sterile operating field.

Authors:  Matt Strickland; Jamie Tremaine; Greg Brigley; Calvin Law
Journal:  Can J Surg       Date:  2013-06       Impact factor: 2.089

5.  Detection of coronary artery anomalies in infants and young children with congenital heart disease by using MR imaging.

Authors:  Tarinee Tangcharoen; Aaron Bell; Sanjeet Hegde; Tarique Hussain; Philipp Beerbaum; Tobias Schaeffter; Reza Razavi; Rene M Botnar; Gerald F Greil
Journal:  Radiology       Date:  2011-02-15       Impact factor: 11.105

6.  Informatics in Radiology: developing a touchless user interface for intraoperative image control during interventional radiology procedures.

Authors:  Justin H Tan; Cherng Chao; Mazen Zawaideh; Anne C Roberts; Thomas B Kinney
Journal:  Radiographics       Date:  2012-12-21       Impact factor: 5.333

7.  Use of a gesture user interface as a touchless image navigation system in dental surgery: Case series report.

Authors:  Guillermo M Rosa; María L Elizondo
Journal:  Imaging Sci Dent       Date:  2014-06-11
  Citing articles: 6 in total

1.  Comparison of gesture and conventional interaction techniques for interventional neuroradiology.

Authors:  Julian Hettig; Patrick Saalfeld; Maria Luz; Mathias Becker; Martin Skalej; Christian Hansen
Journal:  Int J Comput Assist Radiol Surg       Date:  2017-01-24       Impact factor: 2.924

2.  (Review) Touchless interaction with software in interventional radiology and surgery: a systematic literature review.

Authors:  André Mewes; Bennet Hensen; Frank Wacker; Christian Hansen
Journal:  Int J Comput Assist Radiol Surg       Date:  2016-09-19       Impact factor: 2.924

3.  Touchless Control of Picture Archiving and Communication System in Operating Room Environment: A Comparative Study of Input Methods.

Authors:  Jung-Taek Kim; Yong-Han Cha; Jun-Il Yoo; Chan-Ho Park
Journal:  Clin Orthop Surg       Date:  2021-08-17

4.  Introducing a brain-computer interface to facilitate intraoperative medical imaging control - a feasibility study.

Authors:  Hooman Esfandiari; Pascal Troxler; Sandro Hodel; Daniel Suter; Mazda Farshad; Philipp Fürnstahl
Journal:  BMC Musculoskelet Disord       Date:  2022-07-22       Impact factor: 2.562

5.  Role of Intelligent Management Systems in Surgical Punctuality and Quality of Care.

Authors:  Gendi Li; Shenhui Huang
Journal:  Comput Intell Neurosci       Date:  2022-10-11

6.  Three-dimensional holographic visualization of high-resolution myocardial scar on HoloLens.

Authors:  Jihye Jang; Cory M Tschabrunn; Michael Barkagan; Elad Anter; Bjoern Menze; Reza Nezafat
Journal:  PLoS One       Date:  2018-10-08       Impact factor: 3.240

