[A new method for agreement evaluation based on AC1].

Jing-Wen Zhang, Jun Xu, Sheng-Li An.

Abstract

OBJECTIVE: Medical studies use a variety of coefficients to assess agreement among different raters or measurement methods. Many of these coefficients have limitations, the best known being the kappa paradoxes. To achieve higher accuracy and reliability, we propose an alternative statistic based on AC1, termed CEA, which adjusts for chance agreement. We explored the influence of the prevalence rate and the chance agreement probability on total agreement, and compared the accuracy and stability of the kappa, AC1, and CEA coefficients through simulations and real data analysis. The proposed method offers a stable and reliable option for assessing agreement on binary data.
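
Illustration (not taken from the paper): the following minimal Python sketch computes Cohen's kappa and Gwet's AC1 for two raters labeling the same subjects on a binary scale, to show the chance-agreement adjustment the abstract contrasts. The authors' CEA statistic is not reproduced here because its formula is not given in the abstract; the example data below are hypothetical.

def observed_agreement(r1, r2):
    # Proportion of subjects on which the two raters give the same label.
    return sum(a == b for a, b in zip(r1, r2)) / len(r1)

def cohens_kappa(r1, r2):
    # Cohen's kappa: chance agreement estimated from each rater's marginal rate.
    n = len(r1)
    p1 = sum(r1) / n                      # rater 1's rate of positive labels
    p2 = sum(r2) / n                      # rater 2's rate of positive labels
    pe = p1 * p2 + (1 - p1) * (1 - p2)    # expected agreement by chance
    pa = observed_agreement(r1, r2)
    return (pa - pe) / (1 - pe)

def gwet_ac1(r1, r2):
    # Gwet's AC1: chance agreement estimated from the pooled prevalence.
    n = len(r1)
    pi = (sum(r1) / n + sum(r2) / n) / 2  # average positive rate across raters
    pe = 2 * pi * (1 - pi)
    pa = observed_agreement(r1, r2)
    return (pa - pe) / (1 - pe)

# Hypothetical data with a high prevalence of positives: 46 of 50 subjects
# receive the same label from both raters (observed agreement 0.92).
rater1 = [1] * 45 + [1, 0, 0, 1, 0]
rater2 = [1] * 45 + [0, 1, 0, 0, 1]
print(round(cohens_kappa(rater1, rater2), 3), round(gwet_ac1(rater1, rater2), 3))

On these data the observed agreement is 92%, yet kappa is about 0.291 while AC1 is about 0.910; this is one face of the kappa paradoxes the abstract refers to, where a skewed prevalence inflates kappa's chance-agreement term.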

Entities:  

Mesh:

Year:  2018        PMID: 29735447      PMCID: PMC6765663     

Source DB:  PubMed          Journal:  Nan Fang Yi Ke Da Xue Xue Bao        ISSN: 1673-4254


References: 17 in total

1.  Relationships between statistical measures of agreement: sensitivity, specificity and kappa.

Authors:  Martin Feuerman; Allen R Miller
Journal:  J Eval Clin Pract       Date:  2008-10       Impact factor: 2.431

2.  Normalization of mean squared differences to measure agreement for continuous data.

Authors:  Rashid Almehrizi
Journal:  Stat Methods Med Res       Date:  2013-11-06       Impact factor: 3.021

3.  Computing inter-rater reliability and its variance in the presence of high agreement.

Authors:  Kilem Li Gwet
Journal:  Br J Math Stat Psychol       Date:  2008-05       Impact factor: 3.380

4.  Comparison of performance-based and self-rated functional capacity in Spanish elderly.

Authors:  M Ferrer; R Lamarca; F Orfila; J Alonso
Journal:  Am J Epidemiol       Date:  1999-02-01       Impact factor: 4.897

5.  Measuring interobserver variation in a pathology EQA scheme using weighted κ for multiple readers.

Authors:  Karen C Wright; Jane Melia; Sue Moss; Dan M Berney; Derek Coleman; Patricia Harnden
Journal:  J Clin Pathol       Date:  2011-08-11       Impact factor: 3.411

6.  High agreement but low kappa: II. Resolving the paradoxes.

Authors:  D V Cicchetti; A R Feinstein
Journal:  J Clin Epidemiol       Date:  1990       Impact factor: 6.437

7.  A New Interpretation of the Weighted Kappa Coefficients.

Authors:  Sophie Vanbelle
Journal:  Psychometrika       Date:  2014-12-17       Impact factor: 2.500

8.  Assessing the inter-rater agreement for ordinal data through weighted indexes.

Authors:  Donata Marasini; Piero Quatto; Enrico Ripamonti
Journal:  Stat Methods Med Res       Date:  2014-04-16       Impact factor: 3.021

9.  Assessing interrater agreement on binary measurements via intraclass odds ratio.

Authors:  Isabella Locatelli; Valentin Rousson
Journal:  Biom J       Date:  2016-03-14       Impact factor: 2.207

10.  Kappa coefficient: a popular measure of rater agreement.

Authors:  Wan Tang; Jun Hu; Hui Zhang; Pan Wu; Hua He
Journal:  Shanghai Arch Psychiatry       Date:  2015-02-25