| Literature DB >> 31066679 |
Fernando Alvarez-Lopez, Marcelo Fabián Maina, Francesc Saigí-Rubió.
Abstract
BACKGROUND: The increasingly pervasive presence of technology in the operating room raises the need to study the interaction between the surgeon and computer system. A new generation of tools known as commercial off-the-shelf (COTS) devices enabling touchless gesture-based human-computer interaction is currently being explored as a solution in surgical environments.
Keywords: computer-assisted surgery; education, medical; minimally invasive surgery; operating room; user-computer interface
Year: 2019 PMID: 31066679 PMCID: PMC6533048 DOI: 10.2196/11925
Source DB: PubMed Journal: J Med Internet Res ISSN: 1438-8871 Impact factor: 5.428
Figure 1. Flow diagram of studies through the review.
Summary of included studies evaluating Microsoft Kinect.
| Study | Aim | Type of study | Intervention | Sample | Results/Conclusions |
| [ | To describe a system for the interactive exploration of medical images through a gesture-controlled interface using MKa. | Proof-of-concept. | Manipulation of CTb, MRIc, and positron emission tomography images. | Not described. | As the interface does not require direct contact or calibration, it is suitable for use in the operating room. |
| [ | To explore the potential simplifications derived from using 3Dd sensors in medical augmented reality applications by designing a low-cost system. | Proof-of-concept. | Augmented reality in medicine. | Not described. | The concept is feasible, but the whole process is still too time-consuming to be executed in real time. |
| [ | To present an augmented reality magic mirror for anatomy teaching. | Proof-of-concept. | Augmented reality in medicine. Anatomy education. | A hospital and a school. | The system can be used for educational purposes and to improve communication between doctors and patients. A possible use for anatomy teaching in surgery is not mentioned. |
| [ | To evaluate the response time and usability (gestures and voice commands) compared with mouse and keyboard controls. | Prototype user testing and feasibility testing. | Manipulation of CT images. | 2 radiologists and 8 forensic pathologists who recreated 12 images. | Users took 1.4 times longer to recreate an image with gesture control and rated the system 3.4 out of 5 for ease of use in comparison with the keyboard and mouse. The voice recognition system did not work properly. |
| [ | To develop a system to allow the surgeon to interact with the standard PACS system during sterile surgical management of orthopedic patients. | Proof-of-concept. | Manipulation of radiological images in orthopedics. | Not described. | This is the first example of this technology being used to control digital X-rays in clinical practice. |
| [ | To present a sterile method for the surgeon to manipulate images using touchless freehand gestures. | Experiment. | Manipulation of MRI images. | 9 veterinary surgeons. 22 students. | The hypothesis that contextual information integrated with hand trajectory gesture information can significantly improve the overall recognition system performance was validated. The recognition accuracy was 98.7%. |
| [ | To evaluate an MK-based interaction system for manipulating imaging data using "Magic Lens" visualization. | Proof-of-concept in the operating room. | Manipulation of radiological images. | A laryngoplasty. | The surgeon can combine the preoperative information with the intraoperative video and the simulations to correctly place the implant. |
| [ | To compare the accuracy and speed of interaction of MK with that of a mouse. To study the performance of the interaction methods in rotation tasks and localization of internal structures in a 3D dataset. | User testing. | Manipulation of radiological images. | 15 users. | The gesture-based interface outperformed the traditional mouse with respect to time and accuracy in the orientation and rotation task. The mouse was superior in terms of accuracy of localization of internal structures. However, the gesture-based interface was found to have the fastest target localization time. |
| [ | To develop a user-friendly touchless system for controlling the presentation of medical images based on hand gesture recognition in the operating room. | Proof-of-concept in the operating room. | Manipulation of radiological images in orthopedic surgery. | Not described. | The system does not require calibration and was adapted to the surgical environment following the principles of asepsis/antisepsis. |
| [ | To present a touchless gesture interface that allows the surgeon to control medical images using hand gestures. | Proof-of-concept and prototype feasibility testing. | Manipulation of CT images. | Enucleation of 4 tumors in 3 urology patients. | First description in the literature of a gesture user interface using MK in the operating room in in-vivo surgery, showing that it is an efficient and low-cost solution. |
| [ | To develop a low-cost augmented reality interface projected onto a mannequin simulator. | Proof-of-concept. | Augmented reality for education in medicine. | A physical simulator, video projector, Wii Remote, and MK. | The manipulations obtained using MK were similar to those described with the Wii. |
| [ | To develop a version of a gesture-based system for controlling images. | Proof-of-concept. | Manipulation of MRI images. | Resection of a glioma. | Except for the scanning movement, each movement was recognized with great accuracy. The algorithm can be installed in the clinical area. |
| [ | To use MK to operate an automated operating-room light system. | Prototype user testing. | Manipulation of operating room lights. | 18 volunteers. | The gestures were easy to learn and the movement of the light beam was sufficiently precise. |
| [ | To create a touchless head tracking system for an immersive virtual operating room. | Proof-of-concept. | Virtual reality for simulation and education in surgery. | A 3D virtual operating room with a virtual operating table. | Using MK, it was possible to implement a very accurate interactive tracking system regardless of the complexity of the virtual reality system. |
| [ | To present a new prototype that allows the user to control the OsiriX system with finger gestures using a low-cost depth camera. | Proof-of-concept and prototype feasibility testing. | Manipulation of CT images. | 4 forensic pathologists, 1 radiologist and 1 engineer. | On average, 4.5 min were required to learn to use the system. |
| [ | To present a new immersive surgical training system. | Proof-of-concept and prototype fidelity testing. | Virtual reality for education in surgery. | Cholecystectomy training on animal tissue blocks. | Initial feedback from the residents showed that the system is much more effective than the conventional videotaped system. |
| [ | To test a speech and gesture-controlled interventional radiology system. | User testing. | Manipulation of CT and angiography images. | 10 radiology residents used commands under different lighting conditions during 18 angiographies and 10 CT-guided punctures. | 93% of commands were recognized successfully. Speech commands were less prone to errors than gesture commands. 60% of participants would use the application in their routine clinical practice. |
| [ | To develop an image operation system for image manipulation using a motion sensor. | Proof-of-concept. | Manipulation of angiographic images. | Not described. | The system can be implemented as a useful tool in angiography for controlling image viewing using gestures in the operating room. |
| [ | The working hypothesis is that contextual information such as the focus of attention, integrated with gestural information, can significantly improve overall system recognition performance compared with interfaces relying on gesture recognition alone. | Ethnographic study. Experiment. Survey. | Manipulation of MRI images. | 10 veterinary surgeons. 20 volunteers. | The surgeon’s intention to perform a gesture can be accurately recognized by observing environmental cues (context). The hypothesis was validated by a drop in the false positive rate of gesture recognition from 20.76% to 2.33%. A significant reduction in mean task completion time indicated that users operate the interface more efficiently with experience. The tracking algorithm occasionally failed when several people were in the camera’s field of view. |
| [ | To examine the functionality and usability of MK to complete the visualization of 3D anatomical images. | User testing. Survey. | Manipulation of anatomical images. | 32 participants: Medical students, professors and anatomy laboratory staff. | MK users reached accuracy levels almost identical to those who used a mouse, and spent less time on performing the same tasks. MK showed potential as a device for interaction with medical images. |
| [ | To examine usability for navigating through 3D medical images using MK compared with a traditional mouse. | User testing. Survey. | Manipulation of anatomical images. Education. | 17 veterinary students. | Improvements should be made to MK before it can be implemented as a device for medical use. The preferred method was the mouse. MK has the potential to reduce time on the task. |
| [ | To develop a prototype and to examine the feasibility of this new device to help bridge the sterility barrier and eliminate the time and space gap that exists between image review and visual correlation with real-time operative field anatomy. | Proof-of-concept and prototype feasibility testing. | Manipulation of CT and MRI images. | 2 MISe procedures and 4 open procedures performed by a surgeon. | The system worked well in a wide range of lighting conditions and procedures. There was an increase in the use of intraoperative image consultation. The gesture library was intuitive and easy to learn. Gestures were mastered within 10 min. |
| [ | To investigate a solution for manipulating medical images using MK. | Proof-of-concept and prototype feasibility testing. | Manipulation of CT images. | 29 radiologists (diagnostic and interventional). | The potential of the device to enhance image-guided treatment in an interventional radiology suite while maintaining a sterile surgical field was demonstrated. 69% of those surveyed believed that the device could be useful in the interventional radiology field. |
| [ | To investigate the need for posture and position training during bronchoscopy using a tool called ETrack. | Pilot study. | Analysis of the operator’s movements during a bronchoscopy. Education. | Not described. | The results highlight the importance of posture during bronchoscopy and the need to implement a training module for the simulator. |
| [ | To evaluate a new touchless, portable, low-cost 3D measurement system for objective breast assessment. | Concurrent validation study. | Calculation of breast implant volumes. | 9 silicone implants of known volumes. | The implant volumes were calculated with an error margin of 10%. Reproducibility was satisfactory. The system was validated for clinical use. |
| [ | To describe a gesture-controlled 3D teaching tool in which temporal bone anatomy is manipulated without using a mouse or keyboard. To provide a teaching tool for patient-specific anatomy. | Proof-of-concept. | Manipulation of anatomical images. Education. | 0.15 mm slice thickness cadaveric temporal bone images. | The interactive 3D model developed seems promising as an educational tool. |
| [ | To develop hand recognition software based on MK, linked to an interventional CT, to manipulate images. | Feasibility testing. | Manipulation of CT images in surgery. | 10 interventional radiology procedures. 1 operator. | Tested on 10 procedures, feasibility was 100%. The system also allowed information to be obtained without using the CT system interface or a third party, and without the loss of operator sterility. |
| [ | To present a novel method for training intentional and nonintentional gesture recognition. | Experiment. | Performance of a simulated brain biopsy on a mannequin assisted by images manipulated using gestures. | 19 subjects. | Continuous gesture recognition was successful 92.26% of the time with a reliability of 89.97%. Significant improvements in task completion time were obtained through the context integration effect. |
| [ | To evaluate 2 contactless hand tracking systems, the LMCf and MK, for their potential to control surgical robots. | Experiment. | Manipulation of robots in surgery. | 4 trained surgeons. | Neither system has the high level of accuracy and robustness that would be required for controlling medical robots. |
| [ | To use a projector for visualization and to provide intuitive means for direct interaction with the information projected onto the surgical surface, using MK to capture the interaction zone and the surgeon’s actions on a deformable surface. | Proof-of-concept. | Augmented reality in surgery. | Not described. | The system eliminates the need for the surgeon to look at a location other than the surgical field. It therefore removes distractions and enhances his or her performance. It not only provides the surgeon with medical data during the intervention, but also allows interaction with such information by using gestures. |
| [ | To present an ethnographic study of a system based on MK developed to allow touchless control of medical images during vascular surgery. The study aims to go beyond demonstrating technical feasibility in order to understand the collaborative practices that emerge from its use in this context. | Ethnographic study. | Manipulation of radiological images. | Endovascular suite of a large hospital. | With touchless interaction, the visual resources were embedded and made meaningful in the collaborative practices of surgery. The importance of direct and dynamic control of the images by the clinicians, both during discussion and alongside the use of other artefacts, is discussed. |
| [ | To evaluate a system for manipulating an operating table using gestures. | Prototype user testing. | Manipulation of an operating table. | 15 participants. | Major problems were encountered during gesture recognition and with obstruction by other people in the interaction area due to the size and layout of the operating room. The system cannot yet be integrated into a surgical environment. |
| [ | To study the technical skills of colonoscopists using MK for motion analysis to develop a tool to guide colonoscopy education and to select discriminative motion patterns. | Construct validity study. | Analysis of the movements of the operator during a colonoscopy. | 10 experienced and 11 novice endoscopists. | Certain types of metric can be used to discriminate between experienced and novice operators. |
| [ | To develop a 3D surface imaging system and to assess the accuracy and repeatability on a female mannequin. | Interrater reliability study. | Measurement of the surface distances of the breast on a mannequin. | A female mannequin. | MK seems to be a useful and feasible system for capturing 3D images of the breast. There was agreement between the measurements obtained by the system and those taken manually with a measuring tape. |
| [ | To present a new surgical training system. | Proof-of-concept. | Real-time immersive 3D surgical training. Education. | Not described. | Preliminary experiments show that this immersive training system is portable, effective and reliable. |
| [ | To present the development and clinical testing of a device that enables intraoperative control of images with hand gestures during neurosurgical procedures. | Proof-of-concept. Initial clinical testing. | Manipulation of MRI images. | 30 neurosurgical operations. | OPECT demonstrated high effectiveness, simplicity of use and precise recognition of the individual user profile. In all cases, surgeons were satisfied with the performance of the device. |
| [ | To test whether an automatic motion analysis system could be used to explore whether there is a correlation between scope movements and the level of experience of the surgeon performing the bronchoscopy. | Construct validity study. Prospective, comparative study. | Analysis of the operator’s movements during a bronchoscopy. Education. | 11 novice, 9 intermediate and 9 experienced bronchoscopy operators performed 3 procedures each on a bronchoscopy simulator. | The motion analysis system could discriminate between different levels of experience. Automatic feedback on correct movements during self-directed training on simulators might help new bronchoscopists learn how to handle the bronchoscope like an expert. |
| [ | To compare 2 commercial motion sensors (MK and the LMC) to manipulate CT images, in terms of their utility, usability, speed, accuracy and user acceptance. | Two-strand sequential observational study. Qualitative and quantitative descriptive field study using a semi-structured questionnaire. | Manipulation of CT images. | 42 participants: radiologists, surgeons and interventional radiologists. | Marginal to average acceptability of the 2 devices. MK was found to be more useful and easier to use, but the LMC was more accurate. Further research is required to establish the design specifications, installation guidelines and user training requirements to ensure successful implementation in clinical areas. |
| [ | To develop an integrated and comprehensive operating room information system compatible with HL7 and DICOM (MediNav). A natural user interface was designed specifically for operating rooms based on MK. | Prototype user testing. | Users tested the application’s various modules. | A prototype system was tested in a live operating room at an Iranian teaching hospital. 30 general surgeries. | The results of usability tests are promising and indicate that integration of these systems into a complete solution is the key. Touchless natural user interfaces can help to collect and visualize medical information in a comprehensive manner. |
| [ | To propose a novel system to visualize a surgical scene in augmented reality using the different sources of information provided by a C-arm and MK. | Prototype user testing. | Augmented reality in orthopedic surgery. | Simulations of 12 orthopedic procedures. 5 participating clinicians, 3 experienced surgeons, 2 fourth-year medical students. | The system showed promising results with respect to better surgical scene understanding and improved depth perception using augmented reality in simulated orthopedic surgery. |
| [ | To explore 3D perception technologies in the operating room. | Ethnographic study. Prototype testing. | Detection of the interaction between operating staff and the robot. | Not described. | The paper describes a supervision system for the operating room that enables intention tracking. The system had low latency, good registration accuracy, and high tracking reliability, making it useful for workflow monitoring, tracking, and avoiding collisions between medical robots and operating room staff. |
| [ | To use MK and color markers to track the position of MIS instruments in real time. | Comparative study between MK and the SinaSim trainer. | Movement of the instrument to position its tip in 81 holes of a Plexiglas plate on 5 occasions. | 1 user. | Although the new method had inferior accuracy compared with mechanical sensors, its low cost and portability make it a candidate for replacing traditional tracking methods. |
| [ | To compare 3 different interaction modes for image manipulation in a surgery setting: 1) a gesture-controlled approach using MK; 2) verbal instructions to a third party; and 3) direct manipulation using a mouse. | Crossover randomized controlled trial with blocked randomization. | Interaction modes were direct manipulation using a mouse, verbal instructions given to a third party, and gesture-controlled manipulation using MK. | 30 physicians and senior medical students. | Under the premise that a mouse cannot be used directly during surgery, gesture-controlled approaches were shown to be superior to verbal instructions for image manipulation. |
| [ | To evaluate the feasibility, validity, and reliability of the training system for motion parameter and ergonomic analyses between different experience levels of surgeons using the NDI Polaris System and MK camera. | Construct validity, concurrent validity and test-retest reliability. Prospective blinded study. | Tying of intra-corporeal MIS knots. | 10 MIS novices, 10 intermediate level and 10 experts. | Validity and reliability of the self-developed sensor and expert model-based MIS training system ‘iSurgeon’ were established. |
| [ | To analyze preoperative breast volume in patients with breast cancer in order to predict implant size for reconstruction. | Exploratory study. | MK was used to acquire 3D images of the patients’ breasts before surgery and after surgery. | 10 patients. | This study showed the feasibility of using fast, simple and inexpensive 3D imaging technology for predicting implant size before surgery, although there were significant technical challenges in determining breast volume by surface imaging. |
| [ | To evaluate the feasibility of using 3 different gesture control sensors (MK, the LMC and the Myo armband) to interact in a sterile manner with preoperative data as well as in settings of an integrated operating room during MIS. | Pilot user study. | 2 hepatectomies and 2 partial nephrectomies on an experimental porcine model. | 3 surgeons. | Natural user interfaces are feasible for directly interacting, in a more intuitive and sterile manner, with preoperative images and integrated operating room functionalities during MIS. The combination of the Myo armband and voice commands provided the most intuitive and accurate natural user interface. |
aMK: Microsoft Kinect.
bCT: Computed Tomography.
cMRI: magnetic resonance imaging.
d3D: 3-dimensional.
eMIS: minimally invasive surgery.
fLMC: Leap Motion Controller.
Clinical areas and types of surgical intervention in which gesture-based commercial off-the-shelf devices were used.
| Clinical areas | Types of surgical intervention | Studies |
| General surgery (N=7) | Intraoperative image control, image-guided minimally invasive surgery (adrenalectomy, pancreatectomy, liver resection, a Whipple procedure, as well as liver and pancreatic cancer and renal carcinoma resection), open and laparoscopic bile duct surgery, cholecystectomy, and hepatectomy and nephrectomy in an animal model. | [ |
| Interventional radiology and angiography (N=7) | Arterial dilatation with balloon and umbrella devices, hepatic arterial chemoembolization and selective internal radiation therapy, abdominal computed tomography, and interventional neuroradiology. | [ |
| Neurosurgery (N=7) | Biopsies, resection of brain gliomas, resection of a meningioma, ventriculostomy, and intraoperative image control. | [ |
| Plastic surgery (N=3) | Measurement of breast implant volumes and measurement of distances on the breast surface. | [ |
| Orthopedics (N=3) | Intraoperative image control. | [ |
| Ear, nose, and throat (N=1) | Laryngoplasty. | [ |
| Urology (N=2) | Enucleation of renal tumors and intraoperative image control. | [ |
Summary of included studies evaluating other devices.
| Study | Device | Aim | Type of study | Intervention | Results/Conclusions |
| [ | Camera with complementary metal-oxide-semiconductor (CMOS) sensor | To propose an architecture for a real-time multimodal system to provide a touchless user interface in surgery. | Prototype user testing. | Gesture detection in computer-assisted surgery. | The preliminary results show good usability and rapid learning. The average time to click anywhere on the screen was less than 5 seconds. Lighting conditions affected the performance of the system. The surgeon showed strong interest in the system and satisfactorily assessed the use of gestures within the operating room. |
| [ | Webcam | To describe a vision-based system that can interpret gestures in real time to manipulate objects within a medical data visualization environment. | Prototype user testing. | Manipulation of medical data (radiology images and selection of medical records) and movement of objects and windows on the screen. | The system implemented in a sterile environment demonstrated performance rates between 95% and 100%. |
| [ | Canon VC-C4 color camera | To describe a vision-based gesture capture system that interprets gestures in real time to manipulate medical images. | Beta testing during a surgical procedure. Experiment. | A beta test of a system prototype was conducted during a live brain biopsy operation, where neurosurgeons were able to browse through MRIa images of the patient’s brain using the sterile hand gesture interface. | Gesture recognition accuracy was 96%. For every repeat of trials, the task completion time decreased by 28% and the learning curve levelled off at the 10th attempt. The gestures were learned very quickly and there was a significant decrease in the number of excess gestures. Rotation accuracy was reasonable. The surgeons rated the system as easy to use, with a rapid response, and useful in the surgical environment. |
| [ | Canon VC-C4 camera | To evaluate the Gestix system. | Prototype user testing. | Manipulation of MRI images during a neurosurgical biopsy. | The system setup time was 20 min. The surgeons found the Gestix system easy to use, with a rapid response, and easy to learn. The system does not require the use of wearable devices. |
| [ | Interaction with gestures in general | Fieldwork focusing on work practices and interactions in an angiography suite and on understanding the collaborative work practices in terms of image production and use. | Ethnographic study of minimally invasive image-guided procedures within an interventional radiology department. | Manipulation of radiological images. | The paper discusses the implications of the findings in the work environment for touchless interaction technologies, and suggests that these will be of importance in considering new input techniques in other medical settings. |
| [ | Commercial video camera | To describe the development of Gestonurse, a robotic system for surgical instruments. | Proof-of-concept. | Surgical instrumentation using a robot. | 95% of gestures were recognized correctly. The system was only 0.83 seconds slower when compared with the performance of a human instrument handler. |
| [ | Touchless interaction systems in general | To understand and use common practices in the surgical setting from a proxemics point of view to uncover implications for the design of touchless interaction systems. The aim is to think of touchlessness in terms of its spatial properties. What does spatial separation imply for the introduction of the touchless control of medical images? | Ethnographic study. | Field observations of work practices in neurosurgery. | Alternative ideas, such as multiple cameras, are the kind of solution that these findings suggest. Such reflections and considerations can be revealed through careful analysis of the spatial organization of activity and proxemics of particular interaction mechanisms. However, it is very important to study current practice in order to speculate about new systems, because they in turn may alter practice. |
| [ | Webcam | To present a system for tracking the movement of MISb instruments based on an orthogonal webcam system installed in a physical simulator. | Experiment. | Recording the movements of the instrument within an imaginary cube. | The results showed a resolution of 0.616 mm on each axis of work, linearity and repeatability in motion tracking, as well as automatic detection of the 3D position of the tip of the surgical instruments with sufficient accuracy. The system is a low-cost and portable alternative to traditional instrument tracking devices. |
| [ | MK, the LMCc, the Myo armband and voice control | To evaluate the feasibility of using 3 different gesture control sensors (MK, the LMC and the Myo armband) to interact in a sterile manner with preoperative data as well as in settings of an integrated operating room during MIS. | Pilot user study. | 2 hepatectomies and 2 partial nephrectomies on an experimental porcine model. | Natural user interfaces are feasible for directly interacting, in a more intuitive and sterile manner, with preoperative images and integrated operating room functionalities during MIS. The combination of the Myo armband and voice commands provided the most intuitive and accurate natural user interface. |
| [ | The Myo armband and the LMC | To analyze the value of 2 gesture input modalities (the Myo armband and the LMC) versus 2 clinically established methods (task delegation and joystick control). | User study. Comparative study. | Simulating a diagnostic neuroradiological vascular treatment with 2 frequently used interaction tasks in an experimental operating room. | Novel input modalities have the potential to carry out single tasks more efficiently than clinically established methods. |
aMRI: magnetic resonance imaging.
bMIS: minimally invasive surgery.
cLMC: Leap Motion Controller.
Use of gesture-based commercial off-the-shelf devices in surgery.
| Use | Studies |
| Image manipulation | [ | |||
| Virtual or augmented reality for educational or interventional purposes (N=16) | [ | |||
| Training in endoscopy (bronchoscopy and colonoscopy; N=3) | [ | |||
| Robotics in surgery and in surgical instrumentation | [ | |||
| Instrument tracking in MISa (N=7) | [ | |||
| Tracking of hand movements during MIS (N=2) | [ | |||
| Tracking of hand movements during open surgical knot tying (N=1) | [ | |||
| Simulation for motor skills learning in MIS | [ | |||
| Using patient-specific 3-dimensional images during MIS in real patients or simulators, and presurgical warm-up | [ | |||
| Ethnographic studies (N=5) | [ | |||
| Measurement of breast implant volumes and measurement of distances on the breast surface (N=3) | [ | |||
| Manipulation of the operating table and lights (N=4) | [ | |||
aMIS: minimally invasive surgery.
Summary of included studies evaluating the Leap Motion Controller.
| Study | Aim | Type of study | Intervention | Sample | Results/Conclusions |
| [ | To evaluate the implementation of a low-cost device for touchless PACS control in an interventional radiology suite. To demonstrate that interaction with gestures can decrease the duration of procedures and the risk of re-intervention, and improve technical performance. | Proof-of-concept and prototype feasibility testing. | Manipulation of images in interventional radiology. | Interventional radiology suite. | The LMCa is a feasible, portable and low-cost alternative to other touchless PACS interaction systems. A decrease in the need for re-intervention was reported, but no explanation was given of how it was measured. |
| [ | To present the first experience of using new systems for image control in the operating room: the LMC and OsiriX. | Proof-of-concept. | Manipulation of CTb and MRIc images. | 2 general surgeons, 1 urologist, 3 orthopedic surgeons, and 2 surgeons. | The average training time was 5 min. The system is very cost-effective and efficient, and prevents contamination during surgery. First experience of using the LMC to control CT and MRI images during surgery. |
| [ | To validate the possibility of performing precise telesurgical tasks by means of the LMC. | Comparative study of the Sigma.7 electro-mechanical device and the LMC. | Peg transferring task and answering a questionnaire. The success rate of peg transfers. | 10 researchers. | The results allowed the authors to confirm that fine tracking of the hand could be performed with the LMC. The observed performance of the optical interface proved to be comparable with that of traditional electro-mechanical devices. |
| [ | To describe a piece of software for image processing with OsiriX using finger gestures. | Proof-of-concept. | Manipulation of radiological images. | Not described. | It is possible to implement gesture control of medical devices with low-cost, minimal resources. The device is very sensitive to surface dirt and this affects performance. The device favors the occlusion phenomenon. |
| [ | To evaluate 2 contactless hand tracking systems, the LMC and MKd, for their potential to control surgical robots. | Experiment. | Manipulation of robots in surgery. | 4 trained surgeons. | Neither system has the high level of accuracy and robustness that would be required for controlling medical robots. |
| [ | To evaluate the LMC for simple 2-dimensional interaction and the action of entering a value. | Proof-of-concept and prototype testing. | Manipulation of medical information and operating room lights. | Untrained users at a 90-min computer science conference. | The use cases should be carefully classified and the most appropriate gestures for each application should be detected and implemented. Optimal lighting conditions for the LMC have still not been evaluated; stray light that degrades the emitted IR signal may reduce the recognition rate. |
| [ | To compare the average time required by the conventional method using a mouse and an operating method with a finger-motion sensor. | Observational study. | Manipulation of angiographic images. | 11 radiologists who observed a simulated clinical case. | After a practice time of 30 min, the average operation time by the finger method was significantly shorter than that by the mouse method. |
| [ | To develop a workstation that allows intraoperative touchless control of diagnostic and surgical images in dentistry. | Prototype user testing. | Manipulation of radiological images. | 2 surgeons. A case series of 11 dental surgery procedures. | The system performed very well. Its low cost favors its incorporation into clinical facilities of developing countries, reducing the number of staff required in operating rooms. |
| [ | To propose an interface controlled by hand gestures and by gestures made with hand-held tools. In this approach, hand-held tools become gesture devices that the user can use to control the images. | Prototype user testing. | Manipulation of ultrasound images. | 12 participants. | Users were able to significantly improve their performance with practice. |
| [ | To develop a software application for the manipulation of a 3De pancreatic or liver tumor model by using CT and real-time elastography data. | Proof-of-concept. | Manipulation of CT and real-time elastography images. | 15 patients with liver cancer and 10 patients with pancreatic cancer. | A 3D model of liver and pancreatic tumors was successfully implemented with a hands-free interaction device suitable for sterile environments and for aiding diagnostic or therapeutic interventions. |
| [ | To present a new gesture recognition system for manipulating 2 surgical robots in a virtual simulator. | Proof-of-concept. | Manipulation of robots in surgery. | 2 surgical robots in a virtual simulator. | The device provided satisfactory accuracy and speed. It requires a more complete Application Programming Interface. |
| [ | To propose a web-based interface to retrieve medical images using gestures. | User testing. Pilot study. | Manipulation of radiological images. | 2 users. | User feedback was positive. Users reported fatigue with prolonged use of gestures. Additional studies are required to validate the interface. |
| [ | To describe the use of the LMC for image manipulation during hepatic transarterial chemoembolization and internal radiotherapy procedures. | Proof-of-concept. | Manipulation of images in interventional radiology. | Not described. | Gesture-based imaging control may lead to increased efficacy and safety with decreased radiation exposure during hepatic transarterial chemoembolization procedures. |
| [ | To compare 2 commercial motion sensors (MK and the LMC) to manipulate CT images, in terms of their utility, usability, speed, accuracy and user acceptance. | Two-strand sequential observational study. Qualitative and quantitative descriptive field study using a semi-structured questionnaire. | Manipulation of CT images. | 42 participants: radiologists, surgeons and interventional radiologists. | Marginal to average acceptability of the 2 devices. MK was found to be more useful and easier to use, but the LMC was more accurate. Further research is required to establish the design specifications, installation guidelines and user training requirements to ensure successful implementation in clinical areas. |
| [ | To evaluate a new method for image manipulation using a motion sensor. | Observational study. User testing and proof-of-concept. | Manipulation of radiological images in dentistry. | 14 students. 6 images. | Using the system, several processes can be performed quickly with finger movements. Using gestures was significantly superior to using a mouse in terms of time. |
| [ | To develop a new system for manipulating images using a motion sensor. | Observational study. | Manipulation of radiological images in dentistry. | 14 students. 25 images. | The operation time with the LMC was significantly shorter than with the conventional method using a mouse. |
| [ | To design a virtual 3D online environment for motor skills learning in MISf using exercises from the MISR-VR. The environment is designed in Unity, and the LMC is used as the device for interaction with the MIS forceps. | Letter to the editor. | None. | Not described. | If it can be shown that 3D online environments mediated by natural user interfaces enable motor skills learning in MIS, a new field of research and development in the area of surgical simulation will be opened up. |
| [ | Patent for accurate 3D instrument positioning. | Patent. | None. | Not described. | Representing, on an output display, 3D positions and orientations of an instrument while medical procedures are being performed. |
| [ | To describe the configuration for using the LMC in neurosurgery for image manipulation during a surgical procedure. | User testing. | Manipulation of images during a surgical procedure. | Resection of a meningioma and sarcoma surgery. | The learning curve only took 30 min. Although the main disadvantage was the lack of standardization of the gestures, the LMC is a low-cost, reliable and easily personalized device for controlling images in the surgical environment. |
| [ | To develop skills in students and professionals using computer simulation technologies based on hand gesture capture systems. | User testing. | Description of the virtual environment. | Not described. | Simulation and new gesture recognition technologies open up new possibilities for the generation of computer-mediated procedures for medical training. |
| [ | To present a gesture-controlled projection display that enables a direct and natural physician-machine interaction during CT-based interventions. | User testing (pilot and main). | 8 tasks manipulating CT images. | 12 participants (biomedical engineers, medical students and radiologists). | Gesture recognition is robust, although there is potential for improvement. The gesture training times are less than 10 min, but vary considerably between study participants. |
| [ | To develop an anatomy learning system using the LMC. | User testing. | Manipulation of 220 anatomical images. | 30 students and lecturers from an anatomy department. | The anatomy learning system using the LMC was successfully developed and it is suitable and acceptable as a support tool in an anatomy learning system. |
| [ | To study the possibility of tracking laparoscopic instruments using the LMC in a box trainer. | Experiment. | 3 static experiments and 1 dynamic experiment. | 1 user. | The LMC had acceptable precision for tracking laparoscopic instruments in a box trainer. |
| [ | To assess the potential of the LMC to track the movement of hands using MIS instruments. | Construct validity, concurrent validity. Comparative study with the InsTrac. | Passing a thread through pegs using the eoSim simulator. | 3 experts and 10 novices. | The LMC is able to track the movement of hands using instruments in a MIS box simulator. Construct validity was demonstrated. Concurrent validity was only demonstrated for time and instrument path distance. A number of limitations to the tracking method used by LMC have been identified. |
| [ | To explore the use of the LMC in endonasal pituitary surgery and to compare it with the Phantom Omni. | Comparative study between the LMC and the Phantom Omni. | 16 resections of simulated pituitary gland tumors using a robot manipulated by the Phantom Omni and by the LMC. | 3 neurosurgeons. | Users were able to achieve a very similar percentage of resection and procedure duration using the LMC. |
| [ | To try to interact with medical images via a web browser using the LMC. | Prototype user testing. | Rotation, panning, scaling and selection of slices of a reconstructed 3D model based on CT or MRI. | 1 user. | It is feasible to build this system and interaction can be carried out in real time. |
| [ | To analyze the value of 2 gesture input modalities (the Myo armband and the LMC) versus 2 clinically established methods (task delegation and joystick control). | User study. Comparative study. | Simulating a diagnostic neuroradiological vascular treatment with 2 frequently used interaction tasks in an experimental operating room. | 10 neuroradiologists. | Novel input modalities have the potential to carry out single tasks more efficiently than clinically established methods. |
| [ | To investigate the potential of a virtual reality simulator for the assessment of basic laparoscopic skills, based on the LMC | Face and construct validity. | 3 basic tasks: camera navigation, instrument navigation, and two-handed operation. | 2 groups of surgeons (28 experts and 21 novices). | This study provides evidence of the potential use of the LMC for assessing basic laparoscopic skills. The proposed system allows the dexterity of hand movements to be evaluated. |
| [ | To evaluate the feasibility of using 3 different gesture control sensors (MK, the LMC and the Myo armband) to interact in a sterile manner with preoperative data as well as in settings of an integrated operating room during MIS. | Pilot user study. | 2 hepatectomies and 2 partial nephrectomies on an experimental porcine model. | 3 surgeons. | Natural user interfaces are feasible for directly interacting, in a more intuitive and sterile manner, with preoperative images and integrated operating room functionalities during MIS. The combination of the Myo armband and voice commands provided the most intuitive and accurate natural user interface. |
| [ | To evaluate the LMC as a tool for the objective measurement and assessment of surgical dexterity among users at different experience levels. | Construct validity study. | Surgical knot tying and manual transfer of objects. | 11 participants. | The study showed 100% accuracy in discriminating between expert and novice performances. |
| [ | To design an affordable and easily accessible endoscopic third ventriculostomy simulator based on the LMC, and to compare it with the NeuroTouch for its usability and training effectiveness. | Concurrent and construct validity study. | 4 ellipsoid practice targeting tasks and 36 ventricle targeting tasks. | 16 novice users and 2 expert neurosurgeons. | An easy-access simulator was created, which has the potential to become a training tool and a surgical training assessment tool. This system can be used for planning procedures using patient datasets. |
| [ | To present the LMC as a novel control device to manipulate the RAVEN-II robot. | Comparative study between the LMC and the electro-mechanical Sigma.7. | Comparison of peg manipulations during a training task with a contact-based device (Sigma.7). | 3 operators. | With contactless control, manipulability is not as good as it is with contact-based control. Complete control of the surgical instruments is feasible. This work is promising for the development of future human-machine interfaces dedicated to robotic surgical training systems. |
| [ | To evaluate the effect of using virtual reality surgery on the self-confidence and knowledge of surgical residents (the LMC and Oculus Rift). | Multisite, single-blind, parallel, randomized controlled trial. | The study group used the virtual reality surgery application. The control group used similar content in a standard presentation. | 95 residents from 7 dental schools. | Immersive virtual reality experiences improve the knowledge and self-confidence of the surgical residents. |
| [ | To develop and validate a novel training tool for Le Fort I osteotomy based on immersive virtual reality (the LMC and Oculus Rift). | Face and content validity. | A pre-intervention questionnaire to understand training needs and a postintervention feedback questionnaire. | 7 consultant oral and maxillofacial surgeons. | The results confirmed the clinical applicability of virtual reality for delivering training in orthognathic surgery. |
| [ | To investigate the feasibility and practicability of a low-cost multimodal head-mounted display system in neuroendoscopic surgery (the LMC and Oculus Rift). | Proof-of-concept in the operating room. | Ventriculocystocisternostomy. Ventriculostomy. Tumoral biopsy. | 21 patients with ventricular diseases. 1 neurosurgeon. | The head-mounted display system is feasible, practical, helpful, and relatively cost efficient in neuroendoscopic surgery. |
aLMC: Leap Motion Controller.
bCT: Computed Tomography.
cMRI: magnetic resonance imaging.
dMK: Microsoft Kinect.
e3D: 3-dimensional.
fMIS: minimally invasive surgery.
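Many of the LMC studies above (notably those manipulating CT, MRI, or angiographic images) rely on translating a continuous gesture, such as moving two hands apart, into a continuous viewer parameter such as zoom. As an illustrative sketch only (none of the cited studies' actual code; the millimeter units, clamp limits, and class design are assumptions), the following shows how per-frame hand-span measurements of the kind an LMC-style sensor reports could drive a zoom control:

```python
# Illustrative sketch: converting the change in distance between two
# tracked palm positions (in mm, as an LMC-style sensor might report)
# into a zoom factor for radiological image review. The clamp limits
# are assumptions, not values from any cited study.

def zoom_factor(prev_span_mm, curr_span_mm, min_zoom=0.25, max_zoom=4.0):
    """Relative zoom from the ratio of successive hand-span measurements."""
    if prev_span_mm <= 0:
        return 1.0  # no valid previous measurement: leave zoom unchanged
    factor = curr_span_mm / prev_span_mm
    return max(min_zoom, min(max_zoom, factor))

class ZoomState:
    """Accumulates per-frame zoom factors into the viewer's total zoom."""
    def __init__(self):
        self.total = 1.0

    def update(self, prev_span_mm, curr_span_mm):
        self.total *= zoom_factor(prev_span_mm, curr_span_mm)
        return self.total

state = ZoomState()
state.update(100.0, 150.0)                   # hands move apart: zoom in 1.5x
print(round(state.update(150.0, 150.0), 2))  # prints 1.5 (hands held still)
```

Clamping the per-frame factor is one plausible way to address a limitation several of these studies report: sensor dropouts and occlusion can produce implausible single-frame jumps that would otherwise make the view lurch.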