| Literature DB >> 12955968 |
Christine A Wynd, Bruce Schmidt, Michelle Atkins Schaefer.
Abstract
Instrument content validity is often established through qualitative expert reviews, yet quantitative analysis of reviewer agreements is also advocated in the literature. Two quantitative approaches to content validity estimations were compared and contrasted using a newly developed instrument called the Osteoporosis Risk Assessment Tool (ORAT). Data obtained from a panel of eight expert judges were analyzed. A Content Validity Index (CVI) initially determined that only one item lacked interrater proportion agreement about its relevance to the instrument as a whole (CVI = 0.57). Concern that higher proportion agreement ratings might be due to random chance stimulated further analysis using a multirater kappa coefficient of agreement. An additional seven items had low kappas, ranging from 0.29 to 0.48 and indicating poor agreement among the experts. The findings supported the elimination or revision of eight items. Pros and cons of using both proportion agreement and kappa coefficient analysis are examined.
Year: 2003 PMID: 12955968 DOI: 10.1177/0193945903252998
Source DB: PubMed Journal: West J Nurs Res ISSN: 0193-9459 Impact factor: 1.967
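The two agreement statistics discussed in the abstract can be sketched as follows: an item-level CVI (the proportion of experts rating an item relevant) and a chance-corrected multirater agreement coefficient, here Fleiss' kappa as one common multirater formulation. The ratings, the 4-point relevance scale, and the cut-off (3 or 4 counted as "relevant") are illustrative assumptions, not the study's actual data or necessarily the exact kappa variant the authors used.

```python
# Illustrative sketch: item-level CVI and a multirater (Fleiss') kappa.
# The rating scale and relevance cut-off below are assumptions for
# demonstration, not data from the ORAT study.

def item_cvi(ratings, relevant=(3, 4)):
    """CVI = proportion of experts rating the item as relevant
    (assumed 4-point scale; ratings of 3 or 4 count as relevant)."""
    return sum(r in relevant for r in ratings) / len(ratings)

def fleiss_kappa(table):
    """Fleiss' multirater kappa.

    table[i][j] = number of raters assigning item i to category j;
    every row must sum to the same number of raters n.
    """
    N = len(table)            # number of items
    n = sum(table[0])         # raters per item
    k = len(table[0])         # number of categories
    # Observed per-item agreement, then its mean across items.
    P_i = [(sum(c * c for c in row) - n) / (n * (n - 1)) for row in table]
    P_bar = sum(P_i) / N
    # Chance agreement from the marginal category proportions.
    p_j = [sum(row[j] for row in table) / (N * n) for j in range(k)]
    P_e = sum(p * p for p in p_j)
    return (P_bar - P_e) / (1 - P_e)

# Hypothetical panel of eight judges rating one item on a 4-point scale.
ratings = [4, 4, 3, 4, 2, 3, 4, 1]
print(item_cvi(ratings))  # 6 of 8 judges rated it relevant -> 0.75

# Two items dichotomized (relevant / not relevant), eight raters each.
table = [[8, 0],   # all 8 raters agree: relevant
         [0, 8]]   # all 8 raters agree: not relevant
print(fleiss_kappa(table))  # perfect agreement -> 1.0
```

The contrast the abstract draws is visible here: a high CVI can coexist with a low kappa, because kappa discounts the agreement expected by chance while the raw proportion does not.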