| Literature DB >> 33712598 |
Xiaodong Wang [1], Ying Chen [2], Yunshu Gao [3], Huiqing Zhang [4], Zehui Guan [5], Zhou Dong [5], Yuxuan Zheng [1], Jiarui Jiang [1], Haoqing Yang [1], Liming Wang [1], Xianming Huang [4], Lirong Ai [5], Wenlong Yu [6], Hongwei Li [7], Changsheng Dong [7], Zhou Zhou [7], Xiyang Liu [8], Guanzhen Yu [9,10].
Abstract
N-staging is a determining factor in prognostic assessment and in the choice of stage-based cancer therapeutic strategies. Visual inspection of whole slides of intact lymph nodes is currently the main method pathologists use to count metastatic lymph nodes (MLNs). Moreover, even at the same N stage, patient outcomes vary dramatically. Here, we propose a deep-learning framework that analyzes lymph node whole-slide images (WSIs) to identify lymph nodes and tumor regions, and then to compute the tumor-area-to-MLN-area ratio (T/MLN). After training, our model's tumor detection performance was comparable to that of experienced pathologists, and it achieved similar performance on two independent gastric cancer validation cohorts. Further, we demonstrate that T/MLN is an interpretable, independent prognostic factor. These findings indicate that deep-learning models could assist not only pathologists in detecting lymph nodes with metastases but also oncologists in exploring new prognostic factors, especially those that are difficult to calculate manually.
Year: 2021 PMID: 33712598 PMCID: PMC7954798 DOI: 10.1038/s41467-021-21674-7
Source DB: PubMed Journal: Nat Commun ISSN: 2041-1723 Impact factor: 14.919
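The abstract's T/MLN factor is a simple area ratio once segmentation masks are available. The paper does not publish its implementation details here, so the following is only an illustrative sketch, assuming binary pixel masks for the detected tumor region and the metastatic lymph node region on the same WSI grid; the function name `tmln_ratio` is hypothetical.

```python
import numpy as np

def tmln_ratio(tumor_mask: np.ndarray, node_mask: np.ndarray) -> float:
    """Ratio of tumor pixel area to metastatic-lymph-node pixel area.

    Both inputs are boolean masks over the same WSI grid; the tumor
    region is assumed to lie within the metastatic lymph node region,
    so tumor pixels are intersected with the node mask before counting.
    """
    node_area = int(node_mask.sum())
    if node_area == 0:
        raise ValueError("no metastatic lymph node area in mask")
    tumor_area = int((tumor_mask & node_mask).sum())
    return tumor_area / node_area

# Toy 4x4 example: a 2x2 tumor inside a fully metastatic 4x4 node.
node = np.ones((4, 4), dtype=bool)
tumor = np.zeros((4, 4), dtype=bool)
tumor[1:3, 1:3] = True
print(tmln_ratio(tumor, node))  # 4 tumor pixels / 16 node pixels = 0.25
```

In practice the masks would come from the framework's two detection stages (lymph node identification, then tumor-region segmentation), and areas could be weighted by pixel spacing rather than raw pixel counts.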