
Automatic detection of informative frames from wireless capsule endoscopy images.

M K Bashar, T Kitasaka, Y Suenaga, Y Mekada, K Mori.

Abstract

Wireless capsule endoscopy (WCE) is a new clinical technology permitting visualization of the small bowel, the most difficult segment of the digestive tract to examine. The major drawback of this technology is the excessive amount of time required for video diagnosis. We therefore propose a method for generating smaller videos by detecting informative frames from original WCE videos. This method isolates useless frames that are highly contaminated by turbid fluids, faecal materials and/or residual foods. These materials and fluids appear in a wide range of colors, from brown to yellow, and/or have bubble-like texture patterns. The detection scheme therefore consists of two steps: isolating (Step-1) highly contaminated non-bubbled (HCN) frames and (Step-2) significantly bubbled (SB) frames. Two color representations, viz., local color moments in Ohta space and the HSV color histogram, are evaluated for characterizing HCN frames, which are isolated by a support vector machine (SVM) classifier in Step-1. The remaining frames go to Step-2, where a Gauss-Laguerre transform (GLT) based multiresolution texture feature is used to characterize the bubble structures in WCE frames. The GLT uses Laguerre Gauss circular harmonic functions (LG-CHFs) to decompose WCE images into multiresolution components. An automatic segmentation method was designed to extract bubbled regions from grayscale versions of the color images based on the local absolute energies of their CHF responses. The final informative frames were detected by applying a threshold to the segmented regions. An automatic procedure for selecting features based on analyzing the consistency of the energy-contrast map is also proposed. Three experiments, two of which use 14,841 and 37,100 frames from three videos and the third of which uses 66,582 frames from six videos, were conducted to validate the proposed method.
The two combinations of the proposed color and texture features showed excellent average detection accuracies (86.42% and 84.45%) in the final experiment, compared with the same color features followed by conventional Gabor-based (78.18% and 76.29%) and discrete wavelet-based (65.43% and 63.83%) texture features. Although intra-video training-testing cases are the typical choice for supervised classification in Step-1, combining a suitable number of training sets drawn from a subset of the input videos was shown to be possible. This mixing not only reduced computation costs but also produced better detection accuracies by minimizing visual-selection errors, especially when processing large numbers of WCE videos. Copyright (c) 2010 Elsevier B.V. All rights reserved.
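As a rough illustration of the two-step pipeline described in the abstract, the sketch below computes Step-1 style local color moments in Ohta's I1I2I3 space and a Step-2 style bubble-area decision over a precomputed filter-response energy map. This is a minimal sketch, not the authors' implementation: the block grid, window size, both thresholds, and the use of mean/std as the color moments are assumptions, and the LG-CHF filter bank is replaced by a caller-supplied response image.

```python
import numpy as np

def ohta_color_moments(rgb, grid=4):
    """Step-1 feature sketch: mean and std of Ohta's I1, I2, I3
    channels over a grid x grid block layout (6 * grid**2 values).
    rgb: H x W x 3 float array in [0, 1]."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    i1 = (r + g + b) / 3.0          # intensity
    i2 = (r - b) / 2.0              # roughly red-blue opponent
    i3 = (2.0 * g - r - b) / 4.0    # roughly green-magenta opponent
    h, w = i1.shape
    feats = []
    for ch in (i1, i2, i3):
        for y in range(grid):
            for x in range(grid):
                block = ch[y * h // grid:(y + 1) * h // grid,
                           x * w // grid:(x + 1) * w // grid]
                feats.extend([block.mean(), block.std()])
    return np.asarray(feats)  # would feed the Step-1 SVM classifier

def local_energy(response, win=8):
    """Mean absolute filter response over non-overlapping win x win
    windows; `response` stands in for one LG-CHF channel output."""
    h, w = response.shape
    h, w = h - h % win, w - w % win
    blocks = np.abs(response[:h, :w]).reshape(h // win, win, w // win, win)
    return blocks.mean(axis=(1, 3))

def is_informative(energy_map, energy_thresh, area_frac_thresh=0.3):
    """Step-2 decision sketch: a frame passes if high-energy (bubbled)
    blocks cover less than area_frac_thresh of the frame; both
    thresholds here are illustrative, not values from the paper."""
    bubbled = energy_map > energy_thresh
    return bool(bubbled.mean() < area_frac_thresh)
```

In the full pipeline, Step-1 would train an SVM (e.g. scikit-learn's `sklearn.svm.SVC`) on `ohta_color_moments` vectors labelled HCN vs. non-HCN, and only frames passing Step-1 would reach the Step-2 bubble test.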


Year:  2010        PMID: 20137998     DOI: 10.1016/j.media.2009.12.001

Source DB:  PubMed          Journal:  Med Image Anal        ISSN: 1361-8415            Impact factor:   8.545


  8 in total

1.  Quantitative estimates of motility from videocapsule endoscopy are useful to discern celiac patients from controls.

Authors:  Edward J Ciaccio; Christina A Tennyson; Govind Bhagat; Suzanne K Lewis; Peter H Green
Journal:  Dig Dis Sci       Date:  2012-05-30       Impact factor: 3.199

2.  Video summarization based tele-endoscopy: a service to efficiently manage visual data generated during wireless capsule endoscopy procedure.

Authors:  Irfan Mehmood; Muhammad Sajjad; Sung Wook Baik
Journal:  J Med Syst       Date:  2014-07-19       Impact factor: 4.460

3.  Implementation of a polling protocol for predicting celiac disease in videocapsule analysis.

Authors:  Edward J Ciaccio; Christina A Tennyson; Govind Bhagat; Suzanne K Lewis; Peter H Green
Journal:  World J Gastrointest Endosc       Date:  2013-07-16

Review 4.  Review: capsule colonoscopy-a concise clinical overview of current status.

Authors:  Diana E Yung; Emanuele Rondonotti; Anastasios Koulaouzidis
Journal:  Ann Transl Med       Date:  2016-10

5.  Efficient Bronchoscopic Video Summarization.

Authors:  Patrick D Byrnes; William Evan Higgins
Journal:  IEEE Trans Biomed Eng       Date:  2018-07-24       Impact factor: 4.538

6.  Training and deploying a deep learning model for endoscopic severity grading in ulcerative colitis using multicenter clinical trial data.

Authors:  Benjamin Gutierrez Becker; Filippo Arcadu; Andreas Thalhammer; Citlalli Gamez Serna; Owen Feehan; Faye Drawnel; Young S Oh; Marco Prunotto
Journal:  Ther Adv Gastrointest Endosc       Date:  2021-02-25

Review 7.  Artificial intelligence in small intestinal diseases: Application and prospects.

Authors:  Yu Yang; Yu-Xuan Li; Ren-Qi Yao; Xiao-Hui Du; Chao Ren
Journal:  World J Gastroenterol       Date:  2021-07-07       Impact factor: 5.742

8.  Machine Learning-Based Classification of the Health State of Mice Colon in Cancer Study from Confocal Laser Endomicroscopy.

Authors:  Pejman Rasti; Christian Wolf; Hugo Dorez; Raphael Sablong; Driffa Moussata; Salma Samiei; David Rousseau
Journal:  Sci Rep       Date:  2019-12-27       Impact factor: 4.379

