| Literature DB >> 24980159 |
Pengfei Shao, Houzhu Ding, Jinkun Wang, Peng Liu, Qiang Ling, Jiayu Chen, Junbin Xu, Shiwu Zhang, Ronald Xu.
Abstract
A wearable surgical navigation system is developed for intraoperative imaging of surgical margins in cancer resection surgery. The system consists of an excitation light source, a monochromatic CCD camera, a host computer, and a wearable headset unit operating in one of two modes: head-mounted display (HMD) and Google Glass. In the HMD mode, a CMOS camera is installed on a personal cinema system to capture the surgical scene in real time and transmit the image to the host computer through a USB port. In the Google Glass mode, a wireless connection is established between the glass and the host computer for image acquisition and data transport. A software program written in Python calls OpenCV functions for image calibration, co-registration, fusion, and augmented-reality display. The imaging performance of the surgical navigation system is characterized in a tumor-simulating phantom. Image-guided surgical resection is demonstrated in an ex vivo tissue model. Surgical margins identified by the wearable navigation system coincide with those acquired by a standard small-animal imaging system, indicating the technical feasibility of intraoperative surgical margin detection. The proposed surgical navigation system combines the sensitivity and specificity of a fluorescence imaging system with the mobility of a wearable goggle. It can potentially be used by a surgeon to identify residual tumor foci and reduce the risk of recurrent disease without interfering with the regular resection procedure.
Year: 2014 PMID: 24980159 PMCID: PMC4332818 DOI: 10.1007/s10439-014-1062-0
Source DB: PubMed Journal: Ann Biomed Eng ISSN: 0090-6964 Impact factor: 3.934