Pitfalls in the use of kappa when interpreting agreement between multiple raters in reliability studies.

Shaun O'Leary, Marte Lund, Tore Johan Ytre-Hauge, Sigrid Reiersen Holm, Kaja Naess, Lars Nagelstad Dalland, Steven M McPhail.

Abstract

OBJECTIVE: To compare different reliability coefficients (exact agreement and three variations of the kappa statistic: generalised kappa, Cohen's kappa, and the prevalence-adjusted and bias-adjusted kappa (PABAK)) for four physiotherapists conducting visual assessments of scapulae.
DESIGN: Inter-therapist reliability study.
SETTING: Research laboratory.
PARTICIPANTS: 30 individuals with no history of neck or shoulder pain and no obvious postural abnormalities.
MAIN OUTCOME MEASURES: Ratings of scapular posture recorded by four physiotherapists in multiple biomechanical planes under four test conditions (at rest, and during three isometric conditions).
RESULTS: The magnitude of discrepancy between the two therapist pairs ranged from 0.04 to 0.76 for Cohen's kappa and from 0.00 to 0.86 for PABAK. In comparison, the generalised kappa provided a score between the two paired kappa coefficients. The differences between the mean generalised kappa and the mean Cohen's kappa (0.02), and between the mean generalised kappa and the mean PABAK (0.02), were negligible, but the magnitude of difference between the generalised kappa and the paired kappa coefficients within each plane and condition was substantial: 0.02 to 0.57 for Cohen's kappa and 0.02 to 0.63 for PABAK.
CONCLUSIONS: Calculating coefficients for therapist pairs alone may result in inconsistent findings. In contrast, the generalised kappa provided a coefficient close to the mean of the paired kappa coefficients. These findings support the assertion that generalised kappa may better represent reliability between three or more raters, and that reliability studies calculating agreement between only two raters should be interpreted with caution. However, generalised kappa may mask more extreme cases of agreement (or disagreement) that paired comparisons can reveal.
Copyright © 2013 Chartered Society of Physiotherapy. Published by Elsevier Ltd. All rights reserved.
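
All of the coefficients compared in this abstract share the chance-corrected form kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement and p_e is expected chance agreement; they differ in how p_e is estimated (from each rater's marginal category proportions for Cohen's kappa, from pooled category proportions across all raters for the generalised (Fleiss') kappa, and fixed at 1/k for k categories for PABAK, which reduces it to (k*p_o - 1) / (k - 1)). The following Python sketch, using hypothetical ratings rather than the study's data, illustrates how the paired and generalised coefficients can be computed and how they may diverge:

from collections import Counter

def cohens_kappa(r1, r2):
    # Cohen's kappa for two raters: chance agreement estimated from
    # each rater's own marginal category proportions.
    n = len(r1)
    p_o = sum(a == b for a, b in zip(r1, r2)) / n
    c1, c2 = Counter(r1), Counter(r2)
    p_e = sum((c1[c] / n) * (c2[c] / n) for c in set(r1) | set(r2))
    return (p_o - p_e) / (1 - p_e)

def pabak(r1, r2, k=2):
    # Prevalence- and bias-adjusted kappa: chance agreement fixed at
    # 1/k for k categories, giving PABAK = (k*p_o - 1) / (k - 1).
    p_o = sum(a == b for a, b in zip(r1, r2)) / len(r1)
    return (k * p_o - 1) / (k - 1)

def fleiss_kappa(ratings):
    # Generalised (Fleiss') kappa: `ratings` holds one list of category
    # labels per subject, one label per rater (equal rater counts).
    cats = sorted({c for subject in ratings for c in subject})
    n, m = len(ratings), len(ratings[0])
    counts = [[subject.count(c) for c in cats] for subject in ratings]
    p_i = [(sum(x * x for x in row) - m) / (m * (m - 1)) for row in counts]
    p_bar = sum(p_i) / n                            # mean per-subject agreement
    p_j = [sum(row[j] for row in counts) / (n * m) for j in range(len(cats))]
    p_e = sum(p * p for p in p_j)                   # pooled chance agreement
    return (p_bar - p_e) / (1 - p_e)

# Hypothetical ratings (illustrative only, not study data): four raters
# classify five subjects' scapular posture as "neutral" or "abnormal".
ratings = [
    ["neutral",  "neutral",  "neutral",  "abnormal"],
    ["abnormal", "abnormal", "abnormal", "abnormal"],
    ["neutral",  "neutral",  "abnormal", "abnormal"],
    ["neutral",  "neutral",  "neutral",  "neutral"],
    ["abnormal", "neutral",  "abnormal", "abnormal"],
]
r1 = [row[0] for row in ratings]
r2 = [row[1] for row in ratings]
print(f"Cohen's kappa (raters 1 vs 2): {cohens_kappa(r1, r2):.2f}")
print(f"PABAK (raters 1 vs 2):         {pabak(r1, r2):.2f}")
print(f"Generalised kappa (all four):  {fleiss_kappa(ratings):.2f}")

Because the generalised kappa pools information across all raters, any single pairwise coefficient (Cohen's kappa or PABAK) can sit well above or below it; this is the discrepancy the study quantifies.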

Keywords:  Agreement; Inter-therapist; Kappa; Posture; Reliability; Scapular

Year:  2013        PMID: 24262334     DOI: 10.1016/j.physio.2013.08.002

Source DB:  PubMed          Journal:  Physiotherapy        ISSN: 0031-9406            Impact factor:   3.358


Related records (5 in total)

1.  Inter-rater reliability of the McKenzie System of Mechanical Diagnosis and Therapy in the examination of the knee.

Authors:  Sean Willis; Richard Rosedale; Ravi Rastogi; Shawn M Robbins
Journal:  J Man Manip Ther       Date:  2016-09-07

2.  A Tool to Assess the Signs and Symptoms of Catheter-Associated Urinary Tract Infection: Development and Reliability.

Authors:  Tom J Blodgett; Sue E Gardner; Nicole P Blodgett; Lisa V Peterson; Melissa Pietraszak
Journal:  Clin Nurs Res       Date:  2014-09-22       Impact factor: 2.075

3.  Physiotherapists have accurate expectations of their patients' future health-related quality of life after first assessment in a subacute rehabilitation setting.

Authors:  Steven M McPhail; Emily Nalder; Anne-Marie Hill; Terry P Haines
Journal:  Biomed Res Int       Date:  2013-11-20       Impact factor: 3.411

4.  Development and validation of the Myasthenia Gravis Impairment Index.

Authors:  Carolina Barnett; Vera Bril; Moira Kapral; Abhaya Kulkarni; Aileen M Davis
Journal:  Neurology       Date:  2016-07-08       Impact factor: 9.910

5.  Development and Evaluation of the Supportive Needs Assessment Tool for Cirrhosis (SNAC).

Authors:  Patricia C Valery; Christina M Bernardes; Katherine A Stuart; Gunter F Hartel; Steven M McPhail; Richard Skoien; Tony Rahman; Paul J Clark; Leigh U Horsfall; Kelly L Hayward; Rohit Gupta; Elizabeth E Powell
Journal:  Patient Prefer Adherence       Date:  2020-03-18       Impact factor: 2.711
