Andrew T Fairchild1, Jarred P Tanksley1, Jessica D Tenenbaum2, Manisha Palta1, Julian C Hong3. 1. Department of Radiation Oncology, Duke University, Durham, North Carolina. 2. Department of Biostatistics and Bioinformatics, Duke University, Durham, North Carolina. 3. Department of Radiation Oncology, Duke University, Durham, North Carolina; Department of Radiation Oncology, University of California, San Francisco, San Francisco, California; Bakar Computational Health Sciences Institute, University of California, San Francisco, San Francisco, California. Electronic address: julian.hong@ucsf.edu.
Abstract
PURPOSE: The National Cancer Institute Common Terminology Criteria for Adverse Events (CTCAE) v5.0 is the standard for oncology toxicity encoding and grading, despite limited validation. We assessed interrater reliability (IRR) in multireviewer toxicity identification. METHODS AND MATERIALS: Two reviewers independently reviewed 100 randomly selected notes for weekly on-treatment visits during radiation therapy from the electronic health record. Discrepancies were adjudicated by a third reviewer for consensus. Term harmonization was performed to account for overlapping symptoms in CTCAE. IRR was assessed based on unweighted and weighted Cohen's kappa coefficients. RESULTS: Between reviewers, the unweighted kappa was 0.68 (95% confidence interval, 0.65-0.71) and the weighted kappa was 0.59 (0.22-1.00). IRR was consistent between symptoms noted as present and those noted as absent, with kappas of 0.6 (0.66-0.71) and 0.6 (0.65-0.69), respectively. CONCLUSIONS: Significant discordance suggests toxicity identification, particularly retrospectively, is a complex and error-prone task. Strategies to improve IRR, including training and simplification of the CTCAE criteria, should be considered in trial design and future terminologies.
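The abstract's IRR statistic, unweighted Cohen's kappa, measures agreement between two raters beyond what chance alone would produce. A minimal sketch in Python (the toy ratings below are hypothetical, not data from this study):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Unweighted Cohen's kappa: (observed - expected) / (1 - expected),
    where 'expected' is the chance agreement implied by each rater's
    marginal label frequencies."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Proportion of items on which the raters agree.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from marginal frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical example: two reviewers coding 10 notes as
# toxicity present (1) or absent (0).
a = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
b = [1, 0, 0, 1, 0, 1, 1, 1, 0, 1]
print(round(cohens_kappa(a, b), 2))  # → 0.58
```

Here the raters agree on 8 of 10 notes (0.80 observed agreement), but because both label most notes "present", chance agreement is 0.52, yielding kappa 0.58. This gap between raw agreement and kappa is why the abstract reports kappa rather than percent agreement.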