James E Andrews (1), Rachel L Richesson, Jeffrey Krischer. (1) School of Library and Information Science, University of South Florida, 4202 E. Fowler Ave., CIS 1040, Tampa, FL 33620, USA. jandrews@cas.usf.edu
Abstract
OBJECTIVE: To compare consistency of coding among professional SNOMED CT coders representing three commercial providers of coding services when coding clinical research concepts with SNOMED CT.
DESIGN: A sample of clinical research questions from case report forms (CRFs) generated by the NIH-funded Rare Diseases Clinical Research Network (RDCRN) was sent to three coding companies with instructions to code the core concepts using SNOMED CT. The sample consisted of 319 question/answer pairs from 15 separate studies. The companies were asked to select SNOMED CT concepts (in any form, including post-coordinated) that capture the core concept(s) reflected in each question. They were also asked to state their level of certainty and how precise they felt their coding was.
MEASUREMENTS: Basic frequencies were calculated to determine raw agreement among the companies and other descriptive information. Krippendorff's alpha was used as a statistical measure of agreement among the coding companies on several measures (semantic, certainty, and precision).
RESULTS: No significant level of agreement among the experts was found.
CONCLUSION: There is little semantic agreement in the coding of clinical research data items across coders from three professional coding services, even under a very liberal definition of agreement.
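The MEASUREMENTS section refers to Krippendorff's alpha, a chance-corrected agreement coefficient that handles any number of coders and missing ratings. As an illustration only (the paper does not publish its computation, and the category labels below are hypothetical, not the study's SNOMED CT codes), a minimal nominal-scale implementation might look like this:

```python
from collections import Counter
from itertools import permutations

def krippendorff_alpha_nominal(units):
    """Krippendorff's alpha for nominal data.

    units: list of units (e.g. CRF questions); each unit is a list of
    category labels, one per coder, with None for a missing rating.
    Returns alpha in (-inf, 1], where 1.0 is perfect agreement.
    """
    coincidence = Counter()  # coincidence matrix o[(c, k)]
    marginals = Counter()    # per-category totals n_c
    n = 0                    # total number of pairable values
    for unit in units:
        ratings = [r for r in unit if r is not None]
        m = len(ratings)
        if m < 2:
            continue  # a unit rated by fewer than 2 coders is not pairable
        n += m
        for c in ratings:
            marginals[c] += 1
        # Each ordered pair within a unit adds 1/(m - 1) coincidences.
        for c, k in permutations(ratings, 2):
            coincidence[(c, k)] += 1.0 / (m - 1)
    # Observed disagreement: off-diagonal coincidences (nominal delta).
    observed = sum(v for (c, k), v in coincidence.items() if c != k)
    # Expected disagreement from the marginals: n^2 - sum(n_c^2).
    expected = n * n - sum(v * v for v in marginals.values())
    if expected == 0:
        return 1.0  # only one category ever used: no possible disagreement
    return 1.0 - (n - 1) * observed / expected

# Hypothetical example: three coders assigning codes to four questions.
agree = [["a", "a", "a"], ["b", "b", "b"], ["a", "a", "a"], ["b", "b", "b"]]
print(krippendorff_alpha_nominal(agree))  # perfect agreement -> 1.0
```

With systematic disagreement (e.g. two coders always choosing opposite labels), alpha goes negative, which is why a near-zero or negative value supports the paper's conclusion of little semantic agreement.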