Matthew G Hanna1, Victor E Reuter2,3, Orly Ardon2, David Kim4, Sahussapont Joseph Sirintrapun2,3, Peter J Schüffler2,3, Klaus J Busam2, Jennifer L Sauter2, Edi Brogi2, Lee K Tan2, Bin Xu2, Tejus Bale2, Narasimhan P Agaram2, Laura H Tang2, Lora H Ellenson2, John Philip2,5, Lorraine Corsale2, Evangelos Stamelos2, Maria A Friedlander2, Peter Ntiamoah2, Marc Labasin2, Christine England2, David S Klimstra2,3, Meera Hameed2,3.
Abstract
Remote digital pathology allows healthcare systems to maintain pathology operations during public health emergencies. Existing Clinical Laboratory Improvement Amendments (CLIA) regulations require pathologists to electronically verify patient reports from a certified facility. During the COVID-19 pandemic, caused by the SARS-CoV-2 virus, this requirement potentially exposes pathologists, their colleagues, and household members to the risk of infection. Relaxation of government enforcement of this regulation allows pathologists to review and report pathology specimens from a remote, non-CLIA-certified facility. The availability of digital pathology systems can facilitate remote microscopic diagnosis, although formal comprehensive (case-based) validation of remote digital diagnosis has not been reported. All glass slides representing the routine clinical signout workload in surgical pathology subspecialties at Memorial Sloan Kettering Cancer Center were scanned on an Aperio GT450 at ×40-equivalent resolution (0.26 µm/pixel). Twelve pathologists from nine surgical pathology subspecialties remotely reviewed and reported complete pathology cases using a digital pathology system from a non-CLIA-certified facility through a secure connection. Whole slide images were integrated with the laboratory information system and launched from within it in a custom, vendor-agnostic whole slide image viewer. Remote signouts utilized consumer-grade computers and monitors (monitor size, 13.3-42 in.; resolution, 1280 × 800-3840 × 2160 pixels) connecting to an institutional clinical workstation via a secure virtual private network. Pathologists subsequently reviewed all corresponding glass slides using a light microscope within the CLIA-certified department. Intraobserver concordance metrics included the reporting elements of top-line diagnosis, margin status, lymphovascular and/or perineural invasion, pathology stage, and ancillary testing. The median whole slide image file size was 1.3 GB; scan time per slide averaged 90 s; and scanned tissue area averaged 612 mm². Signout sessions included a total of 108 cases, comprising 254 individual parts and 1196 slides. Major diagnostic equivalency between digital and glass slide diagnoses was 100%, and overall concordance was 98.8% (251/254). This study reports validation of primary diagnostic review and reporting of complete pathology cases from a remote site during a public health emergency. Our experience shows high (100%) intraobserver digital-to-glass-slide major diagnostic concordance when reporting from a remote site. This randomized, prospective study successfully validated remote use of a digital pathology system, demonstrated the operational feasibility of remote review and reporting of pathology specimens, and evaluated remote access performance and usability for remote signout.
Year: 2020 PMID: 32572154 PMCID: PMC7306935 DOI: 10.1038/s41379-020-0601-5
Source DB: PubMed Journal: Mod Pathol ISSN: 0893-3952 Impact factor: 7.842
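The abstract's throughput figures make it possible to gauge the scan volume behind this validation. Below is a back-of-the-envelope sketch in Python using only the reported values; because file size is a median and scan time is an average, the totals are rough estimates, not reported results.

```python
# Rough scan-volume estimate from the figures reported in the abstract.
# These are medians/averages, so the totals below are approximations only.
N_SLIDES = 1196      # slides scanned across all signout sessions
FILE_SIZE_GB = 1.3   # median whole slide image file size (GB)
SCAN_TIME_S = 90     # average scan time per slide (seconds)

total_storage_gb = N_SLIDES * FILE_SIZE_GB
total_scan_hours = N_SLIDES * SCAN_TIME_S / 3600

print(f"Approx. storage: {total_storage_gb:,.0f} GB "
      f"(~{total_storage_gb / 1024:.1f} TB)")
print(f"Approx. cumulative scan time: {total_scan_hours:.1f} h")
```

At the reported rates, the study's slide set works out to roughly 1.5 TB of images and about 30 cumulative hours of scanner time.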
Fig. 1 Operational workflow for digital pathology accessioning, staining, scanning, and slide distribution.
The modified operational workflow illustrates the process and roles involved from the point of accessioning to slide distribution.
Fig. 2 Pathologist whole slide image viewer feedback and communication tool.
Users have a drop-down list (shown above) available to report whole slide image quality problems. Once a problem is selected, a notification is delivered to the Digital Scanning Team to rescan the affected slide.
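The paper describes the feedback tool's behavior but not its implementation. The following is a purely hypothetical sketch of how such a quality-problem report might be represented and dispatched to a scanning-team queue; every class, field, and function name here is an illustrative assumption, not the viewer's actual API.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum

class QualityProblem(Enum):
    # Failure modes mirroring the rescan categories tabulated below
    OUT_OF_FOCUS = "no macro focus"
    TISSUE_NOT_DETECTED = "tissue detection failure"
    BARCODE_FAILURE = "barcode failure"
    IMAGE_QUALITY = "image quality"

@dataclass
class RescanRequest:
    slide_barcode: str            # hypothetical identifier fields
    case_accession: str
    problem: QualityProblem
    reported_by: str
    reported_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

def notify_scanning_team(request: RescanRequest) -> None:
    """Stand-in for the notification delivered to the Digital Scanning Team."""
    print(f"[rescan queue] {request.slide_barcode} "
          f"({request.problem.value}) reported by {request.reported_by}")

notify_scanning_team(RescanRequest(
    "S20-12345-A1-1", "S20-12345",
    QualityProblem.OUT_OF_FOCUS, "pathologist_a"))
```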
Technical evaluation of Aperio GT450 glass slide rescans.
| Source | Total slides scanned | Barcode failure | No tissue detected | Tissue detection failure | No macro focus | Image quality |
|---|---|---|---|---|---|---|
| In house | 326 | 1 | | | | |
| Consults | 392 | 16 | 1 | 1 | 1 | 1 |
| In house | 119 | | | | | |
| Consults | 55 | 2 | | | | |
| In house | 117 | | | | | |
| In house | 72 | 1 | | | | |
| Consults | 30 | | | | | |
| In house | 82 | | | | | |
| In house | 34 | | | | | |
| Consults | 19 | 1 | | | | |
| In house | 22 | | | | | |
| Consults | 34 | 2 | | | | |
| In house | 41 | | | | | |
| In house | 84 | 1 | 1 | | | |
| In house | 377 | 5 | 4 | | | |
| In house | 181 | | | | | |
| In house | 51 | | | | | |
| Consults | 13 | 1 | | | | |
| In house | 67 | | | | | |
| Consults | 3 | | | | | |
Counts of glass slides scanned on the Aperio GT450 whole slide scanner, with the respective errors that led to rescans. Rows are listed in the chronological scanning order of all specialties. After the genitourinary reader sessions, the vendor upgraded a software feature that significantly decreased barcode scanning errors.
GU genitourinary, Derm dermatopathology, Neuro neuropathology, GI gastrointestinal, H&N head and neck, BST bone & soft tissue, GYN gynecologic, All all surgical pathology specialties included in this study.
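Taking the table at face value, the effect of the barcode software upgrade can be tabulated. The sketch below assumes the first two rows (326 in-house and 392 consult slides) are the genitourinary sessions that preceded the upgrade, since the rows are chronological; the row-to-specialty mapping beyond that is not recoverable from this record.

```python
# (slides, barcode failures) per table row, split at the assumed upgrade point.
before = [(326, 1), (392, 16)]
after = [(119, 0), (55, 2), (117, 0), (72, 1), (30, 0), (82, 0), (34, 0),
         (19, 1), (22, 0), (34, 2), (41, 0), (84, 1), (377, 5), (181, 0),
         (51, 0), (13, 1), (67, 0), (3, 0)]

def failure_rate(rows):
    slides = sum(n for n, _ in rows)
    failures = sum(f for _, f in rows)
    return failures, slides, failures / slides

for label, rows in (("before upgrade", before), ("after upgrade", after)):
    f, n, rate = failure_rate(rows)
    print(f"{label}: {f}/{n} slides = {rate:.2%}")
# before upgrade: 17/718 slides = 2.37%
# after upgrade: 13/1401 slides = 0.93%
```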
Case, read, and slide distribution by pathologist (reader).
| Reader | Cases | Reads (D/G) | Slides |
|---|---|---|---|
| Pathologist A | 17 | 46/46 | 282 |
| Pathologist B | 15 | 51/51 | 300 |
| Pathologist C | 21 | 46/46 | 176 |
| Pathologist D | 7 | 8/8 | 40 |
| Pathologist E | 16 | 22/22 | 102 |
| Pathologist F | 6 | 13/13 | 77 |
| Pathologist G | 8 | 12/12 | 53 |
| Pathologist H | 5 | 29/29 | 56 |
| Pathologist I | 3 | 6/6 | 37 |
| Pathologist J | 4 | 7/7 | 31 |
| Pathologist K | 2 | 2/2 | 26 |
| Pathologist L | 4 | 8/8 | 16 |
Each case comprises one or more specimen parts; a read is defined as the diagnosis rendered for each specimen part. All specimen parts and slides were included for every case reviewed.
D Digital, G Glass.
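The per-reader rows can be cross-checked against the totals quoted in the abstract (108 cases, 1196 slides); a minimal sketch:

```python
# Per-reader (cases, slides) from the distribution table above.
readers = {
    "A": (17, 282), "B": (15, 300), "C": (21, 176), "D": (7, 40),
    "E": (16, 102), "F": (6, 77),  "G": (8, 53),  "H": (5, 56),
    "I": (3, 37),   "J": (4, 31),  "K": (2, 26),  "L": (4, 16),
}
total_cases = sum(c for c, _ in readers.values())
total_slides = sum(s for _, s in readers.values())
assert total_cases == 108 and total_slides == 1196  # matches the abstract
print(total_cases, total_slides)  # 108 1196
```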
List of specimens in each respective subspecialty.
| Subspecialty | Specimen | Count |
|---|---|---|
| Breast | Breast | 21 |
| Breast | Lymph node | 4 |
| Breast | Lymph nodes | 10 |
| Genitourinary | Prostate | 151 |
| Genitourinary | Bladder | 28 |
| Genitourinary | Kidney | 9 |
| Genitourinary | Urethra | 6 |
| Genitourinary | Testis | 3 |
| Genitourinary | Ureter | 1 |
| Genitourinary | Adrenal | 1 |
| Genitourinary | Other | 10 |
| Head & neck | Orbit | 13 |
| Head & neck | Larynx | 3 |
| Head & neck | Thyroid | 2 |
| Head & neck | Nasal bone | 2 |
| Head & neck | Canthus | 2 |
| Head & neck | Maxilla | 2 |
| Head & neck | Tonsil | 1 |
| Head & neck | Lymph node | 1 |
| Head & neck | Skull base | 1 |
| Head & neck | Eye globe | 1 |
| Head & neck | Other | 1 |
| Dermatopathology | Skin | 30 |
| Bone & soft tissue | Bone | 6 |
| Bone & soft tissue | Soft tissue | 3 |
| Gastrointestinal | Liver | 5 |
| Gastrointestinal | Stomach | 2 |
| Gastrointestinal | Colon | 2 |
| Gastrointestinal | Small bowel | 1 |
| Gastrointestinal | Rectum | 1 |
| Gastrointestinal | Gallbladder | 1 |
| Thoracic | Lung | 13 |
| Thoracic | Lymph node | 6 |
| Thoracic | Bone | 2 |
| Gynecologic | Ovary | 4 |
| Gynecologic | Cervix | 3 |
| Gynecologic | Vagina | 1 |
| Neuropathology | Brain | 3 |
| Neuropathology | Spine | 1 |
Discordant cases between whole slide image and glass slide reads.
| Specimen | Glass diagnosis | WSI diagnosis | Discordance |
|---|---|---|---|
| Breast, biopsy | Lobular, pleomorphic type with apocrine features and lymphocytic infiltrate | | Minor; tumor site discrepancy |
| Lung, biopsy | Adenocarcinoma, | Adenocarcinoma, acinar patterns | Minor; false negative |
| Thyroid, hemithyroidectomy | Papillary microcarcinoma, classic variant (0.46 cm) | Papillary microcarcinoma, classic variant (0.46 cm) | Minor; false negative |
Concordance between whole slide image and glass slide reads for all reader sessions (validation performance and equivalency).
| Total parts | Parts with minor discordance | Parts with major discordance |
|---|---|---|
| 46 | 0 | 0 |
| 51 | 0 | 0 |
| 19 | 0 | 0 |
| 8 | 1 | 0 |
| 5 | 1 | 0 |
| 4 | 0 | 0 |
| 27 | 0 | 0 |
| 5 | 0 | 0 |
| 12 | 0 | 0 |
| 12 | 0 | 0 |
| 29 | 1 | 0 |
| 6 | 0 | 0 |
| 13 | 0 | 0 |
| 2 | 0 | 0 |
| 7 | 0 | 0 |
| 8 | 0 | 0 |
Each row represents one reader session.
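The headline concordance figures in the abstract follow directly from this table; a short sketch reproducing the arithmetic:

```python
# Overall and major concordance across all 254 reported parts.
parts = [46, 51, 19, 8, 5, 4, 27, 5, 12, 12, 29, 6, 13, 2, 7, 8]  # per session
minor = [0, 0, 0, 1, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0]
major = [0] * len(parts)

total = sum(parts)                                   # 254
overall = (total - sum(minor) - sum(major)) / total  # 251/254
major_equiv = (total - sum(major)) / total           # 254/254
print(f"overall concordance: {overall:.1%}")                # 98.8%
print(f"major diagnostic equivalency: {major_equiv:.0%}")   # 100%
```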
Fig. 3 Remote digital pathology experience survey.
Responses to the digital pathology experience survey showed a wide range in years of pathology practice and years of digital pathology use. Of note, two pathologists had more years of experience using digital pathology (i.e., during residency) than years of practice as board-certified pathologists. The majority of respondents rated their experience positively. Left: (1) How many years have you been practicing pathology? (2) How many years of experience do you have using digital pathology (in any capacity)? Right: (3) Rate the digital pathology slide viewer. (4) Rate your satisfaction with the launching of slides from within the laboratory information system (CoPath). (5) Rate the quality of the digital slides. (6) Rate your satisfaction with the performance in navigating the digital slides. (7) How comfortable would you feel providing primary diagnosis using digital pathology, with retrieval of glass slides available upon request? (8) How comfortable would you feel providing primary diagnosis using digital pathology, without availability of glass slides?
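The figure's response distributions are not reproduced in this record. As a purely hypothetical illustration of how such 5-point survey responses can be summarized, the sketch below uses made-up placeholder ratings, not the study's data:

```python
from collections import Counter

# Hypothetical 5-point Likert responses for one survey question
# (1 = very dissatisfied ... 5 = very satisfied); placeholder values only.
responses_q3 = [5, 4, 5, 4, 4, 5, 3, 5, 4, 5, 4, 4]  # one rating per pathologist

counts = Counter(responses_q3)
mean = sum(responses_q3) / len(responses_q3)
positive = sum(1 for r in responses_q3 if r >= 4) / len(responses_q3)
print(dict(sorted(counts.items())), f"mean={mean:.1f}", f">=4: {positive:.0%}")
```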