
A collaborative computer aided diagnosis (C-CAD) system with eye-tracking, sparse attentional model, and deep learning.

Naji Khosravan1, Haydar Celik2, Baris Turkbey2, Elizabeth C Jones2, Bradford Wood2, Ulas Bagci3.   

Abstract

Computer aided diagnosis (CAD) tools help radiologists reduce diagnostic errors such as missed tumors and misdiagnoses. Vision researchers have analyzed radiologists' behavior during screening to understand how and why they miss tumors or misdiagnose, and eye-trackers have been instrumental in studying radiologists' visual search processes. However, most such studies have not been conducted in settings compatible with realistic radiology reading rooms. In this study, we aim to develop a paradigm-shifting CAD system, called collaborative CAD (C-CAD), that unifies CAD and eye-tracking systems in a realistic radiology room setting. We first developed an eye-tracking interface that provides radiologists with a real radiology reading room experience. Second, we propose a novel algorithm that unifies eye-tracking data and a CAD system. Specifically, we present a new graph-based clustering and sparsification algorithm that transforms eye-tracking data (gaze) into a graph model to interpret gaze patterns quantitatively and qualitatively. The proposed C-CAD collaborates with radiologists via eye-tracking technology and helps them improve their diagnostic decisions. The C-CAD improves radiologists' search efficiency by processing their gaze patterns. Furthermore, the C-CAD incorporates a deep learning algorithm in a newly designed multi-task learning platform to segment and diagnose suspicious areas simultaneously. The proposed C-CAD system was tested in a lung cancer screening experiment with multiple radiologists reading low-dose chest CTs. Promising results support the efficiency, accuracy, and applicability of the proposed C-CAD system in a real radiology room setting. We also show that our framework generalizes to more complex applications such as prostate cancer screening with multi-parametric magnetic resonance imaging (mp-MRI).
Copyright © 2018 Elsevier B.V. All rights reserved.
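To illustrate the kind of gaze-to-graph transformation the abstract describes, here is a minimal sketch (not the authors' actual algorithm): hypothetical 2D gaze fixations are connected into a graph sparsified by a distance threshold, and fixation clusters are recovered as connected components. All point values, the `radius` parameter, and function names are illustrative assumptions.

```python
import math

# Hypothetical 2D gaze fixation points (x, y) in image coordinates.
gaze = [(10, 12), (11, 13), (12, 11), (80, 82), (81, 80), (200, 5)]

def build_graph(points, radius):
    """Connect every pair of fixations closer than `radius`.
    The distance threshold acts as a simple sparsification step,
    standing in for the paper's graph sparsification algorithm."""
    edges = []
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            if math.dist(points[i], points[j]) <= radius:
                edges.append((i, j))
    return edges

def connected_components(n, edges):
    """Cluster fixations as connected components, via union-find."""
    parent = list(range(n))

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a

    for i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())

edges = build_graph(gaze, radius=5.0)
clusters = connected_components(len(gaze), edges)
print(clusters)  # two dense fixation regions plus one isolated fixation
```

In a C-CAD-style pipeline, dense clusters would mark regions the radiologist dwelled on, which could then be passed to a segmentation/diagnosis network; the actual system's graph construction and sparsification are more sophisticated than this threshold rule.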

Keywords:  Attention; Eye-tracking; Graph sparsification; Lung cancer screening; Multi-task deep learning; Prostate cancer screening

Year:  2018        PMID: 30399507      PMCID: PMC6407631          DOI: 10.1016/j.media.2018.10.010

Source DB:  PubMed          Journal:  Med Image Anal        ISSN: 1361-8415            Impact factor:   8.545


References:  29 in total

1.  Random walks for image segmentation.

Authors:  Leo Grady
Journal:  IEEE Trans Pattern Anal Mach Intell       Date:  2006-11       Impact factor: 6.226

2.  Investigating the link between radiologists' gaze, diagnostic decision, and image content.

Authors:  Georgia Tourassi; Sophie Voisin; Vincent Paquit; Elizabeth Krupinski
Journal:  J Am Med Inform Assoc       Date:  2013-06-20       Impact factor: 4.497

3.  Scanners and drillers: characterizing expert visual search through volumetric images.

Authors:  Trafton Drew; Melissa Le-Hoa Vo; Alex Olwal; Francine Jacobson; Steven E Seltzer; Jeremy M Wolfe
Journal:  J Vis       Date:  2013-08-06       Impact factor: 2.240

4.  Validation, comparison, and combination of algorithms for automatic detection of pulmonary nodules in computed tomography images: The LUNA16 challenge.

Authors:  Arnaud Arindra Adiyoso Setio; Alberto Traverso; Thomas de Bel; Moira S N Berens; Cas van den Bogaard; Piergiorgio Cerello; Hao Chen; Qi Dou; Maria Evelina Fantacci; Bram Geurts; Robbert van der Gugten; Pheng Ann Heng; Bart Jansen; Michael M J de Kaste; Valentin Kotov; Jack Yu-Hung Lin; Jeroen T M C Manders; Alexander Sóñora-Mengana; Juan Carlos García-Naranjo; Evgenia Papavasileiou; Mathias Prokop; Marco Saletta; Cornelia M Schaefer-Prokop; Ernst T Scholten; Luuk Scholten; Miranda M Snoeren; Ernesto Lopez Torres; Jef Vandemeulebroucke; Nicole Walasek; Guido C A Zuidhof; Bram van Ginneken; Colin Jacobs
Journal:  Med Image Anal       Date:  2017-07-13       Impact factor: 8.545

5.  SegNet: A Deep Convolutional Encoder-Decoder Architecture for Image Segmentation.

Authors:  Vijay Badrinarayanan; Alex Kendall; Roberto Cipolla
Journal:  IEEE Trans Pattern Anal Mach Intell       Date:  2017-01-02       Impact factor: 6.226

6.  (Review) A review of lung cancer screening and the role of computer-aided detection.

Authors:  B Al Mohammad; P C Brennan; C Mello-Thoms
Journal:  Clin Radiol       Date:  2017-02-06       Impact factor: 2.350

7.  (Review) Review of prospects and challenges of eye tracking in volumetric imaging.

Authors:  Antje C Venjakob; Claudia R Mello-Thoms
Journal:  J Med Imaging (Bellingham)       Date:  2015-09-29

8.  Pulmonary Nodule Detection in CT Images: False Positive Reduction Using Multi-View Convolutional Networks.

Authors:  Arnaud Arindra Adiyoso Setio; Francesco Ciompi; Geert Litjens; Paul Gerke; Colin Jacobs; Sarah J van Riel; Mathilde Marie Winkler Wille; Matiullah Naqibullah; Clara I Sanchez; Bram van Ginneken
Journal:  IEEE Trans Med Imaging       Date:  2016-03-01       Impact factor: 10.048

9.  Does Expectation of Abnormality Affect the Search Pattern of Radiologists When Looking for Pulmonary Nodules?

Authors:  Stephen Littlefair; Patrick Brennan; Warren Reed; Claudia Mello-Thoms
Journal:  J Digit Imaging       Date:  2017-02       Impact factor: 4.056

10.  Deep Convolutional Neural Networks for Computer-Aided Detection: CNN Architectures, Dataset Characteristics and Transfer Learning.

Authors:  Hoo-Chang Shin; Holger R Roth; Mingchen Gao; Le Lu; Ziyue Xu; Isabella Nogues; Jianhua Yao; Daniel Mollura; Ronald M Summers
Journal:  IEEE Trans Med Imaging       Date:  2016-02-11       Impact factor: 10.048

Cited by:  6 in total

1.  REFLACX, a dataset of reports and eye-tracking data for localization of abnormalities in chest x-rays.

Authors:  Ricardo Bigolin Lanfredi; Mingyuan Zhang; William F Auffermann; Jessica Chan; Phuong-Anh T Duong; Vivek Srikumar; Trafton Drew; Joyce D Schroeder; Tolga Tasdizen
Journal:  Sci Data       Date:  2022-06-18       Impact factor: 8.501

2.  Development and multicenter validation of chest X-ray radiography interpretations based on natural language processing.

Authors:  Yaping Zhang; Mingqian Liu; Shundong Hu; Yao Shen; Jun Lan; Beibei Jiang; Geertruida H de Bock; Rozemarijn Vliegenthart; Xu Chen; Xueqian Xie
Journal:  Commun Med (Lond)       Date:  2021-10-28

3.  Integrating Eye Tracking and Speech Recognition Accurately Annotates MR Brain Images for Deep Learning: Proof of Principle.

Authors:  Joseph N Stember; Haydar Celik; David Gutman; Nathaniel Swinburne; Robert Young; Sarah Eskreis-Winkler; Andrei Holodny; Sachin Jambawalikar; Bradford J Wood; Peter D Chang; Elizabeth Krupinski; Ulas Bagci
Journal:  Radiol Artif Intell       Date:  2020-11-11

4.  Eye Tracking for Deep Learning Segmentation Using Convolutional Neural Networks.

Authors:  J N Stember; H Celik; E Krupinski; P D Chang; S Mutasa; B J Wood; A Lignelli; G Moonis; L H Schwartz; S Jambawalikar; U Bagci
Journal:  J Digit Imaging       Date:  2019-08       Impact factor: 4.056

5.  Creation and validation of a chest X-ray dataset with eye-tracking and report dictation for AI development.

Authors:  Satyananda Kashyap; Ismini Lourentzou; Joy T Wu; Alexandros Karargyris; Arjun Sharma; Matthew Tong; Shafiq Abedin; David Beymer; Vandana Mukherjee; Elizabeth A Krupinski; Mehdi Moradi
Journal:  Sci Data       Date:  2021-03-25       Impact factor: 6.444

6.  (Review) Machine learning in patient flow: a review.

Authors:  Rasheed El-Bouri; Thomas Taylor; Alexey Youssef; Tingting Zhu; David A Clifton
Journal:  Prog Biomed Eng (Bristol)       Date:  2021-02-22
