The kappa coefficient of agreement for multiple observers when the number of subjects is small.

S T Gross.   

Abstract

Published results on the use of the kappa coefficient of agreement have traditionally been concerned with situations where a large number of subjects is classified by a small group of raters. The coefficient is then used to assess the degree of agreement among the raters through hypothesis testing or confidence intervals. A modified kappa coefficient of agreement for multiple categories is proposed and a parameter-free distribution for testing null agreement is provided, for use when the number of raters is large relative to the number of categories and subjects. The large-sample distribution of kappa is shown to be normal in the nonnull case, and confidence intervals for kappa are provided. The results are extended to allow for an unequal number of raters per subject.
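To illustrate the multi-rater agreement setting the abstract describes, here is a minimal sketch computing Fleiss' kappa, the standard chance-corrected agreement statistic for multiple raters and multiple categories. Note this is the classical statistic, not the modified coefficient proposed in the paper; the function name and the toy data are illustrative assumptions.

```python
def fleiss_kappa(counts):
    """Fleiss' kappa for an N-subjects x k-categories count matrix.

    counts[i][j] = number of raters assigning subject i to category j;
    every row must sum to the same number of raters n.
    """
    N = len(counts)
    n = sum(counts[0])  # raters per subject (assumed equal across subjects)
    k = len(counts[0])
    # Per-subject observed agreement: proportion of agreeing rater pairs.
    P_i = [(sum(c * c for c in row) - n) / (n * (n - 1)) for row in counts]
    P_bar = sum(P_i) / N
    # Marginal category proportions, then chance-expected agreement.
    p_j = [sum(row[j] for row in counts) / (N * n) for j in range(k)]
    P_e = sum(p * p for p in p_j)
    return (P_bar - P_e) / (1 - P_e)

# Toy example: 4 subjects, 3 raters each, 2 categories.
counts = [[3, 0], [0, 3], [2, 1], [1, 2]]
print(round(fleiss_kappa(counts), 4))  # 0.3333
```

The paper's contribution concerns the opposite regime (many raters, few subjects) and unequal raters per subject, which this equal-raters sketch does not handle.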

Year:  1986        PMID: 3814729

Source DB:  PubMed          Journal:  Biometrics        ISSN: 0006-341X            Impact factor:   2.571


Related articles — 5 in total

1.  Phylogenetic analysis and classification of the fungal bHLH domain.

Authors:  Joshua K Sailsbery; William R Atchley; Ralph A Dean
Journal:  Mol Biol Evol       Date:  2011-11-22       Impact factor: 16.240

2.  Clinical utility of predictors of return-to-work outcome following work-related musculoskeletal injury.

Authors:  Heidi Muenchberger; Elizabeth Kendall; Peter Grimbeek; Travis Gee
Journal:  J Occup Rehabil       Date:  2007-11-30

3.  Clinical interpretation and implications of whole-genome sequencing.

Authors:  Frederick E Dewey; Megan E Grove; Cuiping Pan; Benjamin A Goldstein; Jonathan A Bernstein; Hassan Chaib; Jason D Merker; Rachel L Goldfeder; Gregory M Enns; Sean P David; Neda Pakdaman; Kelly E Ormond; Colleen Caleshu; Kerry Kingham; Teri E Klein; Michelle Whirl-Carrillo; Kenneth Sakamoto; Matthew T Wheeler; Atul J Butte; James M Ford; Linda Boxer; John P A Ioannidis; Alan C Yeung; Russ B Altman; Themistocles L Assimes; Michael Snyder; Euan A Ashley; Thomas Quertermous
Journal:  JAMA       Date:  2014-03-12       Impact factor: 56.272

4.  Kappa statistic for clustered dichotomous responses from physicians and patients.

Authors:  Chaeryon Kang; Bahjat Qaqish; Jane Monaco; Stacey L Sheridan; Jianwen Cai
Journal:  Stat Med       Date:  2013-03-27       Impact factor: 2.373

5.  Computing Inter-Rater Reliability for Observational Data: An Overview and Tutorial.

Authors:  Kevin A Hallgren
Journal:  Tutor Quant Methods Psychol       Date:  2012
