
A Monte Carlo-Based Bayesian Approach for Measuring Agreement in a Qualitative Scale.

Fernando Calle-Alonso; Carlos Javier Pérez Sánchez

Abstract

Agreement analysis has been an active research area whose techniques have been widely applied in psychology and other fields. However, statistical agreement among raters has mainly been considered from a classical statistics point of view. Bayesian methodology is a viable alternative that allows the inclusion of subjective initial information coming from expert opinions, personal judgments, or historical data. A Bayesian approach is proposed that provides a unified Monte Carlo-based framework to estimate all types of measures of agreement on a qualitative response scale. The approach is conceptually simple and has a low computational cost. Both informative and non-informative scenarios are considered. When no initial information is available, the results are in line with those of the classical methodology, while providing more information on the measures of agreement. For the informative case, guidelines are presented to elicit the prior distribution. The approach has been applied to two problems related to schizophrenia diagnosis and sensory analysis.
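The kind of Monte Carlo-based Bayesian estimation the abstract describes can be illustrated with a minimal sketch. Assuming a Dirichlet-multinomial model on the cells of a two-rater agreement table (a standard conjugate setup, not necessarily the authors' exact formulation), one draws cell probabilities from the Dirichlet posterior and evaluates an agreement measure — here Cohen's kappa — on each draw, yielding a full posterior distribution rather than a single point estimate. The counts below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical 2x2 agreement table for two raters
# (rows: categories assigned by rater A, columns: by rater B).
counts = np.array([[40.0, 5.0],
                   [8.0, 47.0]])

# Non-informative Dirichlet(1, ..., 1) prior on the cell probabilities;
# by conjugacy the posterior is Dirichlet(counts + 1).
alpha_post = counts.flatten() + 1.0

def kappa(p):
    """Cohen's kappa computed from a KxK matrix of cell probabilities."""
    po = np.trace(p)                       # observed agreement
    pe = p.sum(axis=1) @ p.sum(axis=0)     # agreement expected by chance
    return (po - pe) / (1.0 - pe)

# Monte Carlo step: sample cell-probability matrices from the posterior
# and evaluate kappa on each draw.
draws = rng.dirichlet(alpha_post, size=10_000).reshape(-1, 2, 2)
kappas = np.array([kappa(p) for p in draws])

print(f"posterior mean of kappa: {kappas.mean():.3f}")
print(f"95% credible interval: ({np.quantile(kappas, 0.025):.3f}, "
      f"{np.quantile(kappas, 0.975):.3f})")
```

With an informative prior, the uniform Dirichlet parameters would simply be replaced by pseudo-counts elicited from experts or historical data; the Monte Carlo step is unchanged, which is what makes the framework unified across different measures of agreement.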

Keywords:  Bayesian methodology; Monte Carlo methods; measures of agreement; multiple raters; prior elicitation

Year:  2014        PMID: 29881002      PMCID: PMC5978535          DOI: 10.1177/0146621614554080

Source DB:  PubMed          Journal:  Appl Psychol Meas        ISSN: 0146-6216


References:  22 in total

1.  Modelling patterns of agreement and disagreement.

Authors:  A Agresti
Journal:  Stat Methods Med Res       Date:  1992       Impact factor: 3.021

2.  On scales of measurement in Autism Spectrum Disorders (ASD) and beyond: where Smitty went wrong.

Authors:  Domenic V Cicchetti
Journal:  J Autism Dev Disord       Date:  2014-02

3.  Using relative improvement over chance (RIOC) to examine agreement between tests: three case examples using studies of developmental coordination disorder (DCD) in children.

Authors:  John Cairney; David L Streiner
Journal:  Res Dev Disabil       Date:  2010-10-18

4.  2 x 2 kappa coefficients: measures of agreement or association.

Authors:  D A Bloch; H C Kraemer
Journal:  Biometrics       Date:  1989-03       Impact factor: 2.571

5.  Bayesian random effects for interrater and test-retest reliability with nested clinical observations.

Authors:  Chuhsing K Hsiao; Pei-Chun Chen; Wen-Hsin Kao
Journal:  J Clin Epidemiol       Date:  2011-02-02       Impact factor: 6.437

6.  A reappraisal of the kappa coefficient.

Authors:  W D Thompson; S D Walter
Journal:  J Clin Epidemiol       Date:  1988       Impact factor: 6.437

7.  Coefficients of agreement between observers and their interpretation.

Authors:  A E Maxwell
Journal:  Br J Psychiatry       Date:  1977-01       Impact factor: 9.319

8.  How reliable are chance-corrected measures of agreement?

Authors:  I Guggenmoos-Holzmann
Journal:  Stat Med       Date:  1993-12-15       Impact factor: 2.373

9.  Operational definitions of schizophrenia: what do they identify?

Authors:  M A Young; M A Tanner; H Y Meltzer
Journal:  J Nerv Ment Dis       Date:  1982-08       Impact factor: 2.254

10.  Research diagnostic criteria: rationale and reliability.

Authors:  R L Spitzer; J Endicott; E Robins
Journal:  Arch Gen Psychiatry       Date:  1978-06
