| Literature DB >> 33767191 |
Satyananda Kashyap1, Ismini Lourentzou2,3, Joy T Wu2, Alexandros Karargyris4, Arjun Sharma2, Matthew Tong2, Shafiq Abedin2, David Beymer2, Vandana Mukherjee2, Elizabeth A Krupinski5, Mehdi Moradi6.
Abstract
We developed a rich dataset of Chest X-Ray (CXR) images to assist investigators in artificial intelligence research. The data were collected using an eye-tracking system while a radiologist reviewed and reported on 1,083 CXR images. The dataset contains the following aligned data: CXR images, transcribed radiology report text, the radiologist's dictation audio, and eye-gaze coordinate data. We hope this dataset can contribute to various areas of research, particularly explainable and multimodal deep learning/machine learning methods. Furthermore, investigators in disease classification and localization, automated radiology report generation, and human-machine interaction can benefit from these data. We report deep learning experiments that utilize the attention maps produced from the eye-gaze data to demonstrate the potential utility of this dataset.
Year: 2021 PMID: 33767191 PMCID: PMC7994908 DOI: 10.1038/s41597-021-00863-5
Source DB: PubMed Journal: Sci Data ISSN: 2052-4463 Impact factor: 6.444