
Speech and motion control for interventional radiology: requirements and feasibility.

Andreas M Hötker, Michael B Pitton, Peter Mildenberger, Christoph Düber.

Abstract

PURPOSE: Interventional radiology is performed in a sterile environment, where speech and motion control of image review is needed to simplify and expedite routine procedures. Requirements and limitations were defined by testing a speech and motion control system in an interventional radiology test bed.
METHODS: Motion control software was implemented using the Microsoft® Kinect® (Microsoft Corp., USA) framework. The system was tested by 10 participants using a predefined set of six voice and six gesture commands under different lighting conditions to assess the influence of illumination on command recognition. The participants rated the convenience of the application and its possible use in everyday clinical routine. A basic set of voice and gesture commands required for interventional radiology was identified.
RESULTS: The majority (93%) of commands were recognized successfully. Speech commands were less prone to errors than gesture commands. Unwanted side effects (e.g., accidentally issuing a gesture command) occurred in about 30% of cases. Dimmed lighting conditions did not have a measurable effect on the recognition rate. Six out of 10 participants would consider using the application in everyday routine. The necessary voice/gesture commands for interventional radiology were identified and integrated into the control system.
CONCLUSION: Speech and motion control of image review provides a new man-machine interface for radiological image handling that is especially useful in sterile environments due to no-touch navigation. Command recognition rates were high and remained stable under different lighting conditions. However, the rate of accidental triggering due to unintended commands should be reduced.
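The abstract describes mapping a small, predefined set of recognized voice/gesture commands to image-review actions. As a minimal illustrative sketch (the paper's actual system used the Microsoft Kinect framework; the command names, viewer actions, and dispatch logic below are hypothetical, not taken from the paper):

```python
class ImageViewer:
    """Minimal stand-in for a radiological image-review application."""

    def __init__(self, num_images: int):
        self.num_images = num_images
        self.index = 0       # currently displayed image
        self.zoom = 1.0      # current zoom factor

    def next_image(self):
        self.index = min(self.index + 1, self.num_images - 1)

    def previous_image(self):
        self.index = max(self.index - 1, 0)

    def zoom_in(self):
        self.zoom *= 1.25

    def zoom_out(self):
        self.zoom /= 1.25


def dispatch(viewer: ImageViewer, command: str) -> bool:
    """Map a recognized voice/gesture token to a viewer action.

    Returns False for unrecognized tokens instead of raising, since
    recognition errors and accidental triggers are expected in practice
    (the study reports ~93% recognition and ~30% unwanted side effects).
    """
    actions = {
        "next": viewer.next_image,
        "previous": viewer.previous_image,
        "zoom in": viewer.zoom_in,
        "zoom out": viewer.zoom_out,
    }
    action = actions.get(command.lower().strip())
    if action is None:
        return False
    action()
    return True
```

A whitelist dispatch of this kind is one simple way to keep a no-touch interface predictable: only explicitly registered commands can change viewer state, and anything else is ignored rather than acted on.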


Year:  2013        PMID: 23580026     DOI: 10.1007/s11548-013-0841-7

Source DB:  PubMed          Journal:  Int J Comput Assist Radiol Surg        ISSN: 1861-6410            Impact factor:   2.924


  6 in total

1.  You can't touch this: touch-free navigation through radiological images.

Authors:  Lars C Ebert; Gary Hatch; Garyfalia Ampanozi; Michael J Thali; Steffen Ross
Journal:  Surg Innov       Date:  2011-11-06       Impact factor: 2.058

2.  A gesture-based tool for sterile browsing of radiology images.

Authors:  Juan P Wachs; Helman I Stern; Yael Edan; Michael Gillam; Jon Handler; Craig Feied; Mark Smith
Journal:  J Am Med Inform Assoc       Date:  2008 May-Jun       Impact factor: 4.497

3.  A portable immersive surgery training system using RGB-D sensors.

Authors:  Xinqing Guo; Luis D Lopez; Zhan Yu; Karl V Steiner; Kenneth E Barner; Thomas L Bauer; Jingyi Yu
Journal:  Stud Health Technol Inform       Date:  2013

4.  Microsoft Kinect based head tracking for Life Size Collaborative Surgical Simulation Environments (LS-CollaSSLE).

Authors:  Saurabh Dargar; Austin Nunno; Ganesh Sankaranarayanan; Suvranu De
Journal:  Stud Health Technol Inform       Date:  2013

5.  Using the Microsoft Kinect for patient size estimation and radiation dose normalization: proof of concept and initial validation.

Authors:  Tessa S Cook; Gregory Couch; Timothy J Couch; Woojin Kim; William W Boonn
Journal:  J Digit Imaging       Date:  2013-08       Impact factor: 4.056

6.  Touchless gesture user interface for interactive image visualization in urological surgery.

Authors:  Guilherme Cesar Soares Ruppert; Leonardo Oliveira Reis; Paulo Henrique Junqueira Amorim; Thiago Franco de Moraes; Jorge Vicente Lopes da Silva
Journal:  World J Urol       Date:  2012-05-12       Impact factor: 4.226

  6 in total

1.  A gesture-controlled projection display for CT-guided interventions.

Authors:  A Mewes; P Saalfeld; O Riabikin; M Skalej; C Hansen
Journal:  Int J Comput Assist Radiol Surg       Date:  2015-05-10       Impact factor: 2.924

2.  Comparative efficacy of new interfaces for intra-procedural imaging review: the Microsoft Kinect, Hillcrest Labs Loop Pointer, and the Apple iPad.

Authors:  Cherng Chao; Justin Tan; Edward M Castillo; Mazen Zawaideh; Anne C Roberts; Thomas B Kinney
Journal:  J Digit Imaging       Date:  2014-08       Impact factor: 4.056

3.  Touchless interaction with software in interventional radiology and surgery: a systematic literature review.

Authors:  André Mewes; Bennet Hensen; Frank Wacker; Christian Hansen
Journal:  Int J Comput Assist Radiol Surg       Date:  2016-09-19       Impact factor: 2.924

4.  Touchless Control of Picture Archiving and Communication System in Operating Room Environment: A Comparative Study of Input Methods.

Authors:  Jung-Taek Kim; Yong-Han Cha; Jun-Il Yoo; Chan-Ho Park
Journal:  Clin Orthop Surg       Date:  2021-08-17

5.  Gesture-Controlled Image Management for Operating Room: A Randomized Crossover Study to Compare Interaction Using Gestures, Mouse, and Third Person Relaying.

Authors:  Rolf Wipfli; Victor Dubois-Ferrière; Sylvain Budry; Pierre Hoffmeyer; Christian Lovis
Journal:  PLoS One       Date:  2016-04-15       Impact factor: 3.240

6.  Use of Commercial Off-The-Shelf Devices for the Detection of Manual Gestures in Surgery: Systematic Literature Review.

Authors:  Fernando Alvarez-Lopez; Marcelo Fabián Maina; Francesc Saigí-Rubió
Journal:  J Med Internet Res       Date:  2019-04-14       Impact factor: 5.428

