Qiaoliang Li1, Shiyu Li1, Xinyu Liu1, Zhuoying He1, Tao Wang1, Ying Xu1, Huimin Guan1, Runmin Chen1, Suwen Qi1, Feng Wang2. 1. National-Regional Key Technology Engineering Laboratory for Medical Ultrasound, Guangdong Key Laboratory for Biomedical Measurements and Ultrasound Imaging, Department of Biomedical Engineering, School of Medicine, Shenzhen University, Xueyuan Avenue, Nanshan District, Shenzhen, 518071, China. 2. Department of Clinical Laboratory, Shenzhen Baoan Women's and Children's Hospital, Shenzhen, 518071, China.
Abstract
PURPOSE: To automate the detection and identification of visible components in feces for the early diagnosis of gastrointestinal diseases, we propose FecalNet, a method based on multiple deep neural networks. METHODS: FecalNet uses the ResNet152 residual network to extract and learn the features of visible components in fecal microscopic images, builds feature maps in combination with a feature pyramid network, applies a fully convolutional network to classify and locate the fecal components, and employs an improved focal loss function to reoptimize the classification results. This allows the complete automation of the detection and identification of visible components in feces. RESULTS: We validated the method on a fecal database of 1,122 patients. The results indicated a mean average precision (mAP) of 92.16% and an average recall (AR) of 93.56%. The average precision (AP) and AR were 92.82% and 93.38% for erythrocytes, 93.99% and 96.11% for leukocytes, 90.71% and 92.41% for intestinal mucosal epithelial cells, 89.95% and 93.88% for hookworm eggs, 96.90% and 91.21% for ascarid eggs, and 88.61% and 94.37% for whipworm eggs, respectively. The average times required by the GPU and the CPU to analyze a fecal microscopic image were approximately 0.14 s and 1.02 s, respectively. CONCLUSION: FecalNet can automate the detection and identification of visible components in feces. It also provides a detection and identification framework for detecting several other types of cells in clinical practice.
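The abstract does not specify how the focal loss was improved. As a point of reference, a minimal sketch of the standard focal loss (Lin et al., 2017) that such improvements build on is shown below; the function name and parameter defaults are illustrative, not taken from FecalNet:

```python
import math

def focal_loss(p, y, alpha=0.25, gamma=2.0):
    """Standard focal loss for a single binary prediction.

    p: predicted probability of the positive (foreground) class.
    y: ground-truth label, 1 for foreground or 0 for background.
    alpha: class-balancing weight; gamma: focusing parameter that
    down-weights easy, well-classified examples.
    """
    p_t = p if y == 1 else 1.0 - p          # probability of the true class
    alpha_t = alpha if y == 1 else 1.0 - alpha
    return -alpha_t * (1.0 - p_t) ** gamma * math.log(p_t)
```

With alpha = 1 and gamma = 0 this reduces to the ordinary cross-entropy; increasing gamma shifts the training signal toward hard, misclassified components, which is the property that makes the loss attractive for detecting rare fecal components against abundant background.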