
VR-Caps: A Virtual Environment for Capsule Endoscopy.

Kağan İncetan1, Ibrahim Omer Celik2, Abdulhamid Obeid1, Guliz Irem Gokceler1, Kutsev Bengisu Ozyoruk1, Yasin Almalioglu3, Richard J Chen4, Faisal Mahmood5, Hunter Gilbert6, Nicholas J Durr7, Mehmet Turan8.   

Abstract

Current capsule endoscopes and next-generation robotic capsules for the diagnosis and treatment of gastrointestinal diseases are complex cyber-physical platforms that must orchestrate sophisticated software and hardware functions. The desired tasks for these systems include visual localization, depth estimation, 3D mapping, disease detection and segmentation, automated navigation, active control, path realization, and optional therapeutic modules such as targeted drug delivery and biopsy sampling. Data-driven algorithms promise to enable many advanced functionalities for capsule endoscopes, but real-world data is challenging to obtain. Physically realistic simulations that provide synthetic data have emerged as a solution for developing data-driven algorithms. In this work, we present a comprehensive simulation platform for capsule endoscopy operations and introduce VR-Caps, a virtual active capsule environment that simulates a range of normal and abnormal tissue conditions (e.g., inflated, dry, wet) and varied organ types, capsule endoscope designs (e.g., mono, stereo, dual, and 360° camera), and the type, number, strength, and placement of internal and external magnetic sources that enable active locomotion. VR-Caps makes it possible to develop, optimize, and test medical imaging and analysis software, either independently or jointly, for current and next-generation endoscopic capsule systems. To validate this approach, we train state-of-the-art deep neural networks to accomplish various medical image analysis tasks using simulated data from VR-Caps and evaluate the performance of these models on real medical data. Results demonstrate the usefulness and effectiveness of the proposed virtual platform in developing algorithms that quantify fractional coverage, camera trajectory, 3D map reconstruction, and disease classification.
All of the code, pre-trained weights, and the 3D organ models of the virtual environment, with detailed instructions on how to set up and use the environment, are made publicly available at https://github.com/CapsuleEndoscope/VirtualCapsuleEndoscopy, and a video demonstration can be seen in the supplementary videos (Video-I).
Copyright © 2021 Elsevier B.V. All rights reserved.

Keywords:  Area coverage; Capsule endoscopy; Deep reinforcement learning; Disease classification; Synthetic data generation

Year:  2021        PMID: 33609920     DOI: 10.1016/j.media.2021.101990

Source DB:  PubMed          Journal:  Med Image Anal        ISSN: 1361-8415            Impact factor:   8.545


Related articles: 5 in total

1.  Joint estimation of depth and motion from a monocular endoscopy image sequence using a multi-loss rebalancing network.

Authors:  Shiyuan Liu; Jingfan Fan; Dengpan Song; Tianyu Fu; Yucong Lin; Deqiang Xiao; Hong Song; Yongtian Wang; Jian Yang
Journal:  Biomed Opt Express       Date:  2022-04-11       Impact factor: 3.562

2.  CLTS-GAN: Color-Lighting-Texture-Specular Reflection Augmentation for Colonoscopy.

Authors:  Shawn Mathew; Saad Nadeem; Arie Kaufman
Journal:  Med Image Comput Comput Assist Interv       Date:  2022-09-17

3.  FoldIt: Haustral Folds Detection and Segmentation in Colonoscopy Videos.

Authors:  Shawn Mathew; Saad Nadeem; Arie Kaufman
Journal:  Med Image Comput Comput Assist Interv       Date:  2021-09-21

4.  Gastrointestinal Tract Disease Classification from Wireless Endoscopy Images Using Pretrained Deep Learning Model.

Authors:  J Yogapriya; Venkatesan Chandran; M G Sumithra; P Anitha; P Jenopaul; C Suresh Gnana Dhas
Journal:  Comput Math Methods Med       Date:  2021-09-11       Impact factor: 2.238

5.  Deep learning for gastroscopic images: computer-aided techniques for clinicians. (Review)

Authors:  Ziyi Jin; Tianyuan Gan; Peng Wang; Zuoming Fu; Chongan Zhang; Qinglai Yan; Xueyong Zheng; Xiao Liang; Xuesong Ye
Journal:  Biomed Eng Online       Date:  2022-02-11       Impact factor: 2.819

