
Assessing agreement with multiple raters on correlated kappa statistics.

Hongyuan Cao, Pranab K Sen, Anne F Peery, Evan S Dellon.

Abstract

In clinical studies, it is often of interest to assess diagnostic agreement among clinicians on certain symptoms. Previous work has focused on agreement between two clinicians under two different conditions, or on agreement among multiple clinicians under a single condition. Few studies have addressed designs in which multiple clinicians examine the same group of patients under two different conditions. In this paper, we use the intraclass kappa statistic to assess nominal-scale agreement under such a design. We derive an explicit variance formula for the difference of correlated kappa statistics and use it to test the equality of the kappa statistics. Simulation studies show that the method performs well at realistic sample sizes and can be superior to a method that does not take the dependence structure of the measurements into account. The practical utility of the method is illustrated with data from an eosinophilic esophagitis (EoE) study.
© 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
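
For orientation, the intraclass kappa takes the usual chance-corrected form κ = (p_o − p_e)/(1 − p_e), and equality of two kappas is tested with a Wald-type statistic z = (κ̂₁ − κ̂₂)/√Var(κ̂₁ − κ̂₂); the paper's contribution is an explicit formula for that variance when the same patients are rated under both conditions. The sketch below is a minimal illustration under stated assumptions, not the authors' estimator: it computes a Fleiss-type intraclass kappa for binary ratings under each condition and substitutes a paired subject-level bootstrap for the paper's explicit variance of the difference. All function names are hypothetical.

```python
import numpy as np
from scipy.stats import norm

def intraclass_kappa(ratings):
    """Intraclass kappa for a (subjects x raters) array of 0/1 ratings.

    Fleiss-type estimator for two categories: all raters are assumed
    to share a common marginal probability. Sketch only; it degenerates
    when every rating is identical, since then 1 - p_exp = 0.
    """
    ratings = np.asarray(ratings)
    n, r = ratings.shape
    p = ratings.mean()                          # pooled marginal probability
    s = ratings.sum(axis=1)                     # positive ratings per subject
    # observed agreement: proportion of concordant rater pairs per subject
    p_obs = ((s * (s - 1) + (r - s) * (r - s - 1)) / (r * (r - 1))).mean()
    p_exp = p ** 2 + (1 - p) ** 2               # chance agreement
    return (p_obs - p_exp) / (1 - p_exp)

def kappa_difference_test(cond1, cond2, n_boot=2000, seed=0):
    """Wald-type test of kappa1 == kappa2 when the SAME subjects are
    rated under two conditions; a paired bootstrap over subjects stands
    in for the paper's explicit variance of the difference."""
    cond1, cond2 = np.asarray(cond1), np.asarray(cond2)
    rng = np.random.default_rng(seed)
    n = cond1.shape[0]
    diff = intraclass_kappa(cond1) - intraclass_kappa(cond2)
    boot = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)        # resample subjects jointly,
        boot[b] = (intraclass_kappa(cond1[idx]) # preserving the dependence
                   - intraclass_kappa(cond2[idx]))
    z = diff / boot.std(ddof=1)
    return diff, z, 2 * norm.sf(abs(z))         # two-sided p-value
```

Resampling whole subjects (the same row indices in both arrays) keeps the within-patient correlation between the two conditions intact, which is precisely the dependence the abstract says a naive comparison ignores.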

Keywords:  Contingency table; Dependent kappa statistics; Multinomial distribution

Year: 2016    PMID: 26890370    DOI: 10.1002/bimj.201500029

Source DB: PubMed    Journal: Biom J    ISSN: 0323-3847    Impact factor: 2.207


Related articles: 7 in total

1.  Measuring intrarater association between correlated ordinal ratings.

Authors:  Kerrie P Nelson; Thomas J Zhou; Don Edwards
Journal:  Biom J       Date:  2020-06-11       Impact factor: 2.207

2.  [A new method for agreement evaluation based on AC1].

Authors:  Jing-Wen Zhang; Jun Xu; Sheng-Li An
Journal:  Nan Fang Yi Ke Da Xue Xue Bao       Date:  2018-04-20

3.  Utility of a chemotherapy toxicity prediction tool for older patients in a community setting.

Authors:  C Mariano; R Jamal; P Bains; S Hejazi; L Chao; J Wan; J Ho
Journal:  Curr Oncol       Date:  2019-08-01       Impact factor: 3.677

4.  Methods of assessing categorical agreement between correlated screening tests in clinical studies.

Authors:  Thomas J Zhou; Sughra Raza; Kerrie P Nelson
Journal:  J Appl Stat       Date:  2020-06-09       Impact factor: 1.404

5.  Rating experiments in forestry: How much agreement is there in tree marking?

Authors:  Arne Pommerening; Carlos Pallarés Ramos; Wojciech Kędziora; Jens Haufe; Dietrich Stoyan
Journal:  PLoS One       Date:  2018-03-22       Impact factor: 3.240

6.  What About Their Performance Do Free Jazz Improvisers Agree Upon? A Case Study.

Authors:  Amandine Pras; Michael F Schober; Neta Spiro
Journal:  Front Psychol       Date:  2017-06-26

7.  Cohen's Kappa Coefficient as a Measure to Assess Classification Improvement following the Addition of a New Marker to a Regression Model.

Authors:  Barbara Więckowska; Katarzyna B Kubiak; Paulina Jóźwiak; Wacław Moryson; Barbara Stawińska-Witoszyńska
Journal:  Int J Environ Res Public Health       Date:  2022-08-17       Impact factor: 4.614
