| Literature DB >> 29735447 |
Jing-Wen Zhang, Jun Xu, Sheng-Li An.
Abstract
OBJECTIVE: Medical studies use various methods for assessing agreement among different raters or measurement methods. Many of these coefficients have limitations, of which the paradoxes of kappa are the best known. To achieve higher accuracy and reliability, we propose an alternative statistical method based on AC1, termed CEA, which adjusts the chance-agreement probability. We explored the influences of the prevalence rate and the chance-agreement probability on total agreement, and compared the accuracy and stability of the kappa, AC1, and CEA coefficients through simulations and real-data analysis. The proposed method offers a stable and reliable option for assessing agreement on binary data.
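The kappa paradox and the chance-agreement adjustment discussed in the abstract can be illustrated with the standard formulas for Cohen's kappa and Gwet's AC1 (CEA is the paper's own proposal and is not reproduced here). A minimal sketch for two raters and binary (0/1) ratings, assuming equal-length rating lists with no missing values:

```python
def agreement_coefficients(r1, r2):
    """Cohen's kappa and Gwet's AC1 for two raters, binary (0/1) ratings.

    Both coefficients share the form (po - pe) / (1 - pe); they differ
    only in how the chance-agreement probability pe is estimated.
    """
    n = len(r1)
    # Observed agreement: proportion of subjects both raters classify alike
    po = sum(a == b for a, b in zip(r1, r2)) / n
    # Marginal proportions of category 1 for each rater
    p1, p2 = sum(r1) / n, sum(r2) / n
    # Kappa's chance agreement: product of marginals, summed over categories
    pe_kappa = p1 * p2 + (1 - p1) * (1 - p2)
    # AC1's chance agreement: 2*pi*(1-pi), with pi the mean marginal of category 1
    pi = (p1 + p2) / 2
    pe_ac1 = 2 * pi * (1 - pi)
    kappa = (po - pe_kappa) / (1 - pe_kappa)
    ac1 = (po - pe_ac1) / (1 - pe_ac1)
    return kappa, ac1


# Illustrative high-prevalence data (hypothetical): 80 cases both rate 1,
# 10 cases each of the two disagreement patterns, none rated 0 by both.
r1 = [1] * 90 + [0] * 10
r2 = [1] * 80 + [0] * 10 + [1] * 10
kappa, ac1 = agreement_coefficients(r1, r2)
# Despite 80% observed agreement, kappa is negative (about -0.11) while
# AC1 stays high (about 0.76) -- the prevalence paradox the abstract refers to.
```

With skewed marginals, kappa's chance-agreement term pe approaches the observed agreement po, driving the coefficient toward zero even when the raters mostly agree; AC1's adjusted pe avoids this, which is the behavior the simulations in the paper compare.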
Year: 2018 PMID: 29735447 PMCID: PMC6765663
Source DB: PubMed Journal: Nan Fang Yi Ke Da Xue Xue Bao ISSN: 1673-4254