Pei Yuan, Changyuan Guo, Lin Li, Lei Guo, Fanshuang Zhang, Jianming Ying.
Abstract
INTRODUCTION: Accurate results on the status of programmed cell death-ligand 1 (PD-L1) rely not only on the quality of immunohistochemistry testing but also on the accuracy of the pathologic assessments. We explored the intraobserver and interobserver reproducibility of the interpretations for the companion diagnostics, the Dako PD-L1 22C3 pharmDx kit (Dako North America, Inc, Carpinteria, CA) and the VENTANA PD-L1 (SP263) assay (Ventana Medical Systems, Inc, Tucson, AZ), and the consistency between microscopic and digital interpretations of PD-L1.
Keywords: 22C3; Assessment; PD-L1; Reproducibility; SP263
Year: 2020 PMID: 34589980 PMCID: PMC8474465 DOI: 10.1016/j.jtocrr.2020.100102
Source DB: PubMed Journal: JTO Clin Res Rep ISSN: 2666-3643
Figure 1. Flow diagram showing the study design. IP, interpretation pathologist; TPS, tumor proportion score.
Intraobserver and Interobserver Reproducibility of the 22C3 Assay
| Measurements | Intraobserver (N = 400): 1% | Intraobserver (N = 400): 50% | Interobserver (N = 19,000): 1% | Interobserver (N = 19,000): 50% |
|---|---|---|---|---|
| CPs | 368 (92.0%) | 356 (89.0%) | 16,468 (86.7%) | 16,948 (89.2%) |
| Negative-negative | 35 (8.8%) | 281 (70.3%) | 3940 (20.7%) | 12,179 (64.1%) |
| Positive-positive | 333 (83.2%) | 75 (18.7%) | 12,528 (66.0%) | 4769 (25.1%) |
| DCPs | 32 (8.0%) | 44 (11.0%) | 2532 (13.3%) | 2052 (10.8%) |
| Measures of agreement (95% CI) | | | | |
| OPA (%) | 92.0 (89.3–94.7) | 89.0 (85.9–92.1) | 86.7 (86.2–87.1) | 89.2 (88.8–89.6) |
| NPA (%) | 68.6 (55.9–81.4) | 92.7 (89.8–95.7) | 75.7 (74.5–76.8) | 92.2 (91.5–93.0) |
| PPA (%) | 95.4 (93.2–97.6) | 77.3 (69.0–85.7) | 90.8 (90.3–91.3) | 82.3 (81.7–82.9) |
CI, confidence interval; CP, concordant pair; DCP, discordant pair; IP, interpretation pathologist; NPA, negative percentage agreement; OPA, overall percentage agreement; PPA, positive percentage agreement.
N = 20 (the number of IPs) × 100 (the number of cases).
N = (the number of comparison pairs of each case) × 100 (the number of cases).
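The OPA, NPA, and PPA values in these tables are consistent with the standard formulas for overall and positive/negative specific agreement between paired reads when no reference standard exists: OPA = CPs/N, NPA = 2·nn/(2·nn + d), and PPA = 2·pp/(2·pp + d), where nn and pp are the concordant negative and positive pairs and d the discordant pairs. A minimal sketch (not the authors' code) that reproduces the intraobserver 1% column above:

```python
def agreement(nn, pp, d):
    """Overall and specific agreement for paired reads with no gold standard.

    nn -- concordant negative-negative pairs
    pp -- concordant positive-positive pairs
    d  -- discordant pairs
    Returns (OPA, NPA, PPA) as percentages.
    """
    n = nn + pp + d
    opa = 100 * (nn + pp) / n
    npa = 100 * 2 * nn / (2 * nn + d)   # negative specific agreement
    ppa = 100 * 2 * pp / (2 * pp + d)   # positive specific agreement
    return opa, npa, ppa

# Intraobserver 22C3 at the 1% cutoff: 35 negative-negative,
# 333 positive-positive, 32 discordant pairs (N = 400).
opa, npa, ppa = agreement(nn=35, pp=333, d=32)
print(round(opa, 1), round(npa, 1), round(ppa, 1))  # 92.0 68.6 95.4
```

The same formulas reproduce the SP263 and microscopic/digital tables below, which suggests all agreement measures in this record were computed this way.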
Reliability Among Pathologists for Binary Tumor Evaluations at Specific Cutoff Points
| PD-L1 Clone/Cutoff, % | Fleiss’ κ | Interpretation |
|---|---|---|
| 22C3 | | |
| 1 | 0.67 | Substantial |
| 50 | 0.75 | Substantial |
| SP263 | | |
| 1 | 0.70 | Substantial |
| 25 | 0.46 | Moderate |
| 50 | 0.54 | Moderate |
PD-L1, programmed cell death-ligand 1.
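Fleiss' κ extends chance-corrected agreement to more than two raters, and the verbal labels in the table match the commonly used Landis–Koch bands (0.41–0.60 moderate, 0.61–0.80 substantial). A minimal sketch of both, assuming those conventions (this is not the authors' code):

```python
def fleiss_kappa(counts):
    """Fleiss' kappa from a table of counts: counts[i][j] is the number of
    raters assigning item i to category j; every row sums to the same
    number of raters."""
    n = sum(counts[0])                      # raters per item
    N = len(counts)                         # number of items
    total = n * N
    # Marginal proportion of each category across all ratings.
    p = [sum(row[j] for row in counts) / total for j in range(len(counts[0]))]
    # Mean per-item observed agreement.
    P_bar = sum(
        (sum(c * c for c in row) - n) / (n * (n - 1)) for row in counts
    ) / N
    P_e = sum(pj * pj for pj in p)          # expected agreement by chance
    return (P_bar - P_e) / (1 - P_e)

def landis_koch(k):
    """Conventional Landis-Koch verbal interpretation of a kappa value."""
    if k > 0.80: return "Almost perfect"
    if k > 0.60: return "Substantial"
    if k > 0.40: return "Moderate"
    if k > 0.20: return "Fair"
    if k > 0.00: return "Slight"
    return "Poor"

# 3 raters, 2 items, unanimous on each item -> perfect agreement.
print(fleiss_kappa([[3, 0], [0, 3]]))         # 1.0
print(landis_koch(0.67), landis_koch(0.46))   # Substantial Moderate
```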
Interobserver Reproducibility of the SP263 Assay
| Measurements (N = 8360) | 1% | 25% | 50% |
|---|---|---|---|
| CPs | 7708 (92.2%) | 6105 (73.0%) | 6849 (81.9%) |
| Negative-negative | 947 (11.3%) | 3157 (37.8%) | 5353 (64.0%) |
| Positive-positive | 6761 (80.9%) | 2948 (35.2%) | 1496 (17.9%) |
| DCPs | 652 (7.8%) | 2255 (27.0%) | 1511 (19.1%) |
| Measures of agreement (95% CI) | | | |
| OPA (%) | 92.2 (91.6–92.8) | 73.0 (72.1–74.0) | 81.9 (81.1–82.8) |
| NPA (%) | 74.4 (72.0–76.8) | 73.7 (72.4–75.0) | 87.6 (86.8–88.5) |
| PPA (%) | 95.4 (94.9–95.9) | 72.3 (71.0–73.7) | 66.4 (64.5–68.4) |
CI, confidence interval; CP, concordant pair; DCP, discordant pair; NPA, negative percentage agreement; OPA, overall percentage agreement; PPA, positive percentage agreement.
N = (the number of comparison pairs of each case) × 44 (the number of cases).
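The 95% CIs in these tables are consistent with a simple Wald (normal-approximation) interval for a proportion, p ± 1.96·√(p(1−p)/N). As a check (my assumption, not stated in the record), the SP263 OPA at the 1% cutoff (7708 concordant of 8360 pairs) reproduces 92.2 (91.6–92.8):

```python
import math

def wald_ci(successes, n, z=1.96):
    """Normal-approximation (Wald) 95% CI for a proportion, in percent."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return 100 * p, 100 * (p - half), 100 * (p + half)

# SP263 OPA at the 1% cutoff: 7708 concordant pairs out of N = 8360.
p, lo, hi = wald_ci(7708, 8360)
print(f"{p:.1f} ({lo:.1f}-{hi:.1f})")  # 92.2 (91.6-92.8)
```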
Interobserver Reproducibility of Assessment of the 22C3 Assay in Microscopic and Digital Interpretations
| Measurements (N = 3300) | Microscopic: 1% | Microscopic: 50% | Digital: 1% | Digital: 50% |
|---|---|---|---|---|
| CPs | 2957 (83.5%) | 2997 (90.8%) | 3050 (92.4%) | 3005 (91.1%) |
| Negative-negative | 681 (20.6%) | 1823 (55.2%) | 667 (20.2%) | 1794 (54.4%) |
| Positive-positive | 2276 (62.9%) | 1174 (35.6%) | 2383 (72.2%) | 1211 (36.7%) |
| DCPs | 343 (16.5%) | 303 (9.2%) | 250 (7.6%) | 295 (8.9%) |
| Measures of agreement (95% CI) | | | | |
| OPA (%) | 83.5 (82.2–84.7) | 90.8 (89.8–91.8) | 92.4 (91.5–93.3) | 91.1 (90.1–92.0) |
| NPA (%) | 79.9 (77.2–82.6) | 92.3 (91.2–93.5) | 84.2 (81.7–86.8) | 92.4 (91.2–93.6) |
| PPA (%) | 93.0 (92.0–94.0) | 88.6 (86.9–90.3) | 95.0 (94.2–95.9) | 89.1 (87.5–90.8) |
| Kappa | 0.73 (0.70–0.75) | 0.81 (0.79–0.83) | 0.79 (0.77–0.82) | 0.82 (0.79–0.83) |
CI, confidence interval; CP, concordant pair; DCP, discordant pair; NPA, negative percentage agreement; OPA, overall percentage agreement; PPA, positive percentage agreement.
N = (the number of comparison pairs of each case) × 50 (the number of cases).
The Consistency Between Microscopic and Digital Interpretations
| Measurements (N = 600) | 1% | 50% |
|---|---|---|
| CPs | 561 (93.5%) | 552 (92.0%) |
| Negative-negative | 130 (21.7%) | 332 (55.3%) |
| Positive-positive | 431 (71.8%) | 220 (36.7%) |
| DCPs | 39 (6.5%) | 48 (8.0%) |
| Measures of agreement (95% CI) | | |
| OPA (%) | 93.5 (91.5–95.5) | 92.0 (89.8–94.2) |
| Kappa | 0.83 (0.77–0.88) | 0.83 (0.77–0.88) |
CI, confidence interval; CP, concordant pair; DCP, discordant pair; OPA, overall percentage agreement.
N = 12 (the number of IPs) × 50 (the number of cases).
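The 0.83 coefficient above is consistent with Cohen's κ for the paired microscopic-versus-digital calls. The table does not report how the 39 discordant pairs at the 1% cutoff split between the two directions, but for any split the chance-corrected agreement rounds to the reported 0.83. A sketch under that caveat (the 20/19 split below is hypothetical):

```python
def cohens_kappa(nn, np_, pn, pp):
    """Cohen's kappa from a 2x2 table of paired calls.
    nn/pp: both readers negative/positive; np_/pn: the two discordant cells."""
    n = nn + np_ + pn + pp
    po = (nn + pp) / n                                    # observed agreement
    pe = ((nn + np_) * (nn + pn) + (pn + pp) * (np_ + pp)) / n**2  # by chance
    return (po - pe) / (1 - pe)

# 1% cutoff: 130 negative-negative, 431 positive-positive, 39 discordant
# pairs (the split between discordant cells is NOT reported; 20/19 is a
# hypothetical illustration).
print(round(cohens_kappa(nn=130, np_=20, pn=19, pp=431), 2))  # 0.83
```

With these marginals the result is insensitive to the assumed split: even the extreme 39/0 split still rounds to 0.83.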