
A vector space method to quantify agreement in qualitative data.

Delano J McFarlane, Jessica S Ancker, Rita Kukafka.

Abstract

Interrater agreement in qualitative research is rarely quantified. We present a new method for assessing interrater agreement in the coding of focus group transcripts, based on vector space methods. We also demonstrate similarities between this vector method and two previously published interrater agreement methods. Using these methods, we showed that interrater agreement for the qualitative data was quite low, attributable in part to the subjective nature of the codes and in part to the very large number of possible codes. These methods of assessing interrater agreement have the potential to be useful in determining and improving reliability of qualitative codings.

MeSH:

Year:  2008        PMID: 18999026      PMCID: PMC2655941     

Source DB:  PubMed          Journal:  AMIA Annu Symp Proc        ISSN: 1559-4076


  4 in total

1.  Agreement, the f-measure, and reliability in information retrieval.

Authors:  George Hripcsak; Adam S Rothschild
Journal:  J Am Med Inform Assoc       Date:  2005-01-31       Impact factor: 4.497

2.  Health information seeking and technology use in Harlem - a pilot study using community-based participatory research.

Authors:  Yalini Senathirajah; Rita Kukafka; Mark Guptarak; Alwyn Cohall
Journal:  AMIA Annu Symp Proc       Date:  2006

3.  Digital partnerships for health: steps to develop a community-specific health portal aimed at promoting health and well-being.

Authors:  Rita Kukafka; Sharib A Khan; Carly Hutchinson; Delano J McFarlane; Jianhua Li; Jessica S Ancker; Alwyn Cohall
Journal:  AMIA Annu Symp Proc       Date:  2007-10-11

4.  On assessing interrater agreement for multiple attribute responses.

Authors:  L L Kupper; K B Hafner
Journal:  Biometrics       Date:  1989-09       Impact factor: 2.571

