
[Analyzing interrater agreement for categorical data using Cohen's kappa and alternative coefficients].

M Wirtz, M Kutschmann.

Abstract

Within rehabilitation research, ratings are among the most frequently used assessment procedures. Therapists, for example, frequently make categorical judgements to determine whether a specific patient characteristic is present or not (dichotomous rating format) or which of several alternatives holds for a patient (polytomous rating format). Interrater agreement is an important prerequisite for ensuring that reliable and meaningful information about a patient's state can be inferred from the data obtained. Cohen's kappa (κ) is the most frequently used measure for quantifying interrater agreement. The properties of Cohen's kappa are characterized and the conditions for its appropriate application are clarified. Because specific properties of kappa are sometimes not appropriately taken into account, misleading interpretations of this measure can easily arise: the value of Cohen's kappa is affected by aspects of the data that are independent of the quality of the rating process. To avoid such misconceptions, alternative evaluation strategies for dichotomous rating formats are described which enhance agreement analysis and thus ensure a more valid interpretation. In addition, it is shown how weighted Cohen's kappa (κw) may be used to analyze polytomous rating formats.
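As a minimal sketch of the measures the abstract discusses, the following Python snippet computes Cohen's kappa for a dichotomous rating format and a weighted kappa (with quadratic disagreement weights) for ordinal polytomous formats. The rater data are hypothetical and purely illustrative; the formulas are the standard definitions κ = (p_o − p_e) / (1 − p_e) and κw = 1 − (observed weighted disagreement / expected weighted disagreement), not a reconstruction of this article's own analyses.

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Chance-corrected agreement between two raters on categorical data."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    categories = set(ratings_a) | set(ratings_b)
    # Observed agreement: proportion of cases where both raters agree.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Expected chance agreement under independence (product of marginals).
    count_a, count_b = Counter(ratings_a), Counter(ratings_b)
    p_e = sum((count_a[c] / n) * (count_b[c] / n) for c in categories)
    return (p_o - p_e) / (1 - p_e)

def weighted_kappa(ratings_a, ratings_b, weight=lambda i, j: (i - j) ** 2):
    """Weighted kappa for ordinal categories; default: quadratic weights."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    categories = set(ratings_a) | set(ratings_b)
    count_a, count_b = Counter(ratings_a), Counter(ratings_b)
    # Observed vs. chance-expected weighted disagreement.
    observed = sum(weight(a, b) for a, b in zip(ratings_a, ratings_b)) / n
    expected = sum(weight(i, j) * (count_a[i] / n) * (count_b[j] / n)
                   for i in categories for j in categories)
    return 1 - observed / expected

# Hypothetical dichotomous ratings (1 = characteristic present, 0 = absent).
rater1 = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
rater2 = [1, 0, 0, 1, 0, 1, 1, 1, 0, 1]
print(round(cohens_kappa(rater1, rater2), 3))  # → 0.583
```

For dichotomous data, weighted kappa with these weights reduces to the unweighted kappa, which is a convenient sanity check when extending an analysis from two to several ordered categories.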


Year:  2007        PMID: 18188809     DOI: 10.1055/s-2007-976535

Source DB:  PubMed          Journal:  Rehabilitation (Stuttg)        ISSN: 0034-3536            Impact factor:   1.113


  3 in total

1.  Inter- and intraobserver reliabilities and critical analysis of the osteoporotic fracture classification of osteoporotic vertebral body fractures.

Authors:  Maria Schönrogge; Vadzim Lahodski; Ronny Otto; Daniela Adolf; Robert Damm; Albrecht Sitte-Zöllner; Stefan Piatek
Journal:  Eur Spine J       Date:  2022-04-05       Impact factor: 2.721

2.  An instrument for quality assurance in work capacity evaluation: development, evaluation, and inter-rater reliability.

Authors:  André Strahl; Christian Gerlich; Georg W Alpers; Jörg Gehrke; Annette Müller-Garnn; Heiner Vogel
Journal:  BMC Health Serv Res       Date:  2019-08-09       Impact factor: 2.655

3.  Development and evaluation of a standardized peer-training in the context of peer review for quality assurance in work capacity evaluation.

Authors:  André Strahl; Christian Gerlich; Georg W Alpers; Katja Ehrmann; Jörg Gehrke; Annette Müller-Garnn; Heiner Vogel
Journal:  BMC Med Educ       Date:  2018-06-13       Impact factor: 2.463

