
A Ratio Test of Interrater Agreement With High Specificity.

Denis Cousineau, Louis Laurencelle.

Abstract

Existing tests of interrater agreement have high statistical power; however, they lack specificity. If the ratings of the two raters do not show agreement but are not random, the current tests, some of which are based on Cohen's kappa, will often reject the null hypothesis, leading to the wrong conclusion that agreement is present. A new test of interrater agreement, applicable to nominal or ordinal categories, is presented. The test statistic can be expressed as a ratio (labeled QA, ranging from 0 to infinity) or as a proportion (labeled PA, ranging from 0 to 1). This test weighs information supporting agreement against information supporting disagreement. The new test's effectiveness (power and specificity) is compared with that of five other tests of interrater agreement in a series of Monte Carlo simulations. The new test, although slightly less powerful than the other tests reviewed, is the only one sensitive to agreement only. We also introduce confidence intervals on the proportion of agreement.
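The abstract contrasts the new QA/PA statistics with kappa-based tests. The paper's own QA formula is not given here, so the sketch below shows only two standard, well-known pieces: Cohen's kappa computed from a two-rater contingency table, and the generic ratio-to-proportion mapping p = q / (1 + q) that takes any ratio on [0, infinity) to a proportion on [0, 1), consistent with the ranges stated for QA and PA. This is an illustrative sketch, not the authors' test.

```python
def cohens_kappa(table):
    """Cohen's kappa from a square contingency table.

    table[i][j] = number of items rater A assigned to category i
    and rater B assigned to category j.
    """
    k = len(table)
    n = sum(sum(row) for row in table)
    # Observed agreement: proportion of items on the diagonal.
    p_obs = sum(table[i][i] for i in range(k)) / n
    # Chance agreement: product of the raters' marginal proportions.
    row_tot = [sum(table[i]) for i in range(k)]
    col_tot = [sum(table[i][j] for i in range(k)) for j in range(k)]
    p_exp = sum(row_tot[i] * col_tot[i] for i in range(k)) / n ** 2
    return (p_obs - p_exp) / (1 - p_exp)


def ratio_to_proportion(q):
    """Map a ratio q in [0, inf) to a proportion in [0, 1)."""
    return q / (1.0 + q)


# Perfect agreement gives kappa = 1; mixed ratings give intermediate values.
print(cohens_kappa([[5, 0], [0, 5]]))   # 1.0
print(cohens_kappa([[3, 1], [1, 3]]))   # 0.5
print(ratio_to_proportion(1.0))         # 0.5
```

A ratio of 1 (equal evidence for agreement and disagreement) maps to a proportion of 0.5, matching the midpoints of the two ranges quoted in the abstract.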

Keywords:  agreement test; interrater; kappa

Year:  2015        PMID: 29795849      PMCID: PMC5965602          DOI: 10.1177/0013164415574086

Source DB:  PubMed          Journal:  Educ Psychol Meas        ISSN: 0013-1644            Impact factor:   2.821


Reviews:  2 in total

1.  Kappa coefficients in medical research.

Authors:  Helena Chmura Kraemer; Vyjeyanthi S Periyakoil; Art Noda
Journal:  Stat Med       Date:  2002-07-30       Impact factor: 2.373

2.  The measurement of observer agreement for categorical data.

Authors:  J R Landis; G G Koch
Journal:  Biometrics       Date:  1977-03       Impact factor: 2.571

  2 in total

1.  An Evaluation of Interrater Reliability Measures on Binary Tasks Using d-Prime.

Authors:  Malcolm J Grant; Cathryn M Button; Brent Snook
Journal:  Appl Psychol Meas       Date:  2016-12-29

2.  Interrater reliability estimators tested against true interrater reliabilities.

Authors:  Xinshu Zhao; Guangchao Charles Feng; Song Harris Ao; Piper Liping Liu
Journal:  BMC Med Res Methodol       Date:  2022-08-29       Impact factor: 4.612

