
Representation of ophthalmology concepts by electronic systems: intercoder agreement among physicians using controlled terminologies.

John C Hwang, Alexander C Yu, Daniel S Casper, Justin Starren, James J Cimino, Michael F Chiang.

Abstract

OBJECTIVE: To assess intercoder agreement for ophthalmology concepts by 3 physician coders using 5 controlled terminologies (International Classification of Diseases, 9th Revision, Clinical Modification [ICD9CM]; Current Procedural Terminology, fourth edition; Logical Observation Identifiers Names and Codes [LOINC]; Systematized Nomenclature of Medicine, Clinical Terms [SNOMED-CT]; and Medical Entities Dictionary).
DESIGN: Noncomparative case series.
PARTICIPANTS: Five complete ophthalmology case presentations selected from a publicly available journal.
METHODS: Each case was parsed into discrete concepts. Three physician coders independently used electronic or paper browsers to assign a code for every concept in each terminology. A match score representing adequacy of assignment for each concept was given on a 3-point scale (0, no match; 1, partial match; 2, complete match). For every concept, the level of intercoder agreement was determined by 2 methods: (1) exact code matching, with complete agreement when all 3 coders assigned the same code, partial agreement when 2 coders assigned the same code, and no agreement when all coders assigned different codes; and (2) manual review of all assigned codes for semantic equivalence by an independent ophthalmologist, who classified intercoder agreement for each concept as complete, partial, or none. Intercoder agreement was then calculated in the same manner for the subset of concepts judged to have adequate coverage by each terminology, defined as receiving a match score of 2 from at least 2 of the 3 coders.
MAIN OUTCOME MEASURES: Intercoder agreement in each controlled terminology: complete, partial, or none.
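The exact-code-matching and adequacy rules described in METHODS can be sketched in Python. This is an illustrative reconstruction only; the function and variable names are assumptions, not artifacts of the study:

```python
from collections import Counter

def intercoder_agreement(codes):
    """Classify exact-code agreement among 3 coders for one concept.

    Per the exact-matching rule: 'complete' if all 3 coders assigned the
    same code, 'partial' if exactly 2 did, 'none' if all differ.
    """
    top_count = Counter(codes).most_common(1)[0][1]
    if top_count == 3:
        return "complete"
    if top_count == 2:
        return "partial"
    return "none"

def adequately_covered(match_scores):
    """A concept is judged adequately covered by a terminology when at
    least 2 of the 3 coders gave it a match score of 2 (complete match)."""
    return sum(s == 2 for s in match_scores) >= 2
```

For example, `intercoder_agreement(["H40.9", "H40.9", "365.9"])` returns `"partial"`, and `adequately_covered([2, 2, 1])` returns `True`.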
RESULTS: Cases were parsed into 242 unique concepts. When all concepts were analyzed by manual review, the proportion of complete intercoder agreement ranged from 12% (LOINC) to 44% (SNOMED-CT), and the difference in intercoder agreement between LOINC and every other terminology was statistically significant (P<0.004). When only concepts with adequate terminology coverage were analyzed by manual review, the proportion of complete intercoder agreement ranged from 33% (LOINC) to 64% (ICD9CM), and there were no statistically significant differences in intercoder agreement between any pair of terminologies.
CONCLUSIONS: The level of intercoder agreement for ophthalmic concepts in existing controlled medical terminologies is imperfect. Intercoder reproducibility is essential for accurate and consistent electronic representation of medical data.


Year:  2006        PMID: 16488013     DOI: 10.1016/j.ophtha.2006.01.017

Source DB:  PubMed          Journal:  Ophthalmology        ISSN: 0161-6420            Impact factor:   12.079


Related articles: 9 in total

1.  Reliability of SNOMED-CT coding by three physicians using two terminology browsers.

Authors:  Michael F Chiang; John C Hwang; Alexander C Yu; Daniel S Casper; James J Cimino; Justin B Starren
Journal:  AMIA Annu Symp Proc       Date:  2006

2.  I-Maculaweb: A Tool to Support Data Reuse in Ophthalmology.

Authors:  Monica Bonetto; Massimo Nicolò; Roberta Gazzarata; Paolo Fraccaro; Raffaella Rosa; Donatella Musetti; Maria Musolino; Carlo E Traverso; Mauro Giacomini
Journal:  IEEE J Transl Eng Health Med       Date:  2015-12-28       Impact factor: 3.316

3.  Accuracy of the International Classification of Diseases, 9th Revision for Identifying Infantile Eye Disease.

Authors:  Timothy T Xu; Cole E Bothun; Tina M Hendricks; Sasha A Mansukhani; Erick D Bothun; Launia J White; Brian G Mohney
Journal:  Ophthalmic Epidemiol       Date:  2021-11-25

4.  Evaluation of electronic health record implementation in ophthalmology at an academic medical center (an American Ophthalmological Society thesis).

Authors:  Michael F Chiang; Sarah Read-Brown; Daniel C Tu; Dongseok Choi; David S Sanders; Thomas S Hwang; Steven Bailey; Daniel J Karr; Elizabeth Cottle; John C Morrison; David J Wilson; Thomas R Yackel
Journal:  Trans Am Ophthalmol Soc       Date:  2013-09

5.  Pseudostrabismus in the First Year of Life and the Subsequent Diagnosis of Strabismus.

Authors:  Timothy T Xu; Cole E Bothun; Tina M Hendricks; Sasha A Mansukhani; Erick D Bothun; David O Hodge; Brian G Mohney
Journal:  Am J Ophthalmol       Date:  2020-06-10       Impact factor: 5.258

6.  Qualitative analysis of manual annotations of clinical text with SNOMED CT.

Authors:  Jose Antonio Miñarro-Giménez; Catalina Martínez-Costa; Daniel Karlsson; Stefan Schulz; Kirstine Rosenbeck Gøeg
Journal:  PLoS One       Date:  2018-12-27       Impact factor: 3.240

7.  Risks and rewards of increasing patient access to medical records in clinical ophthalmology using OpenNotes.

Authors:  Jake E Radell; Jasmine N Tatum; Chen-Tan Lin; Richard S Davidson; Jonathan Pell; Amber Sieja; Albert Y Wu
Journal:  Eye (Lond)       Date:  2021-10-05       Impact factor: 4.456

8.  Automated UMLS-based comparison of medical forms.

Authors:  Martin Dugas; Fleur Fritz; Rainer Krumm; Bernhard Breil
Journal:  PLoS One       Date:  2013-07-04       Impact factor: 3.240

9.  Mapping the categories of the Swedish primary health care version of ICD-10 to SNOMED CT concepts: rule development and intercoder reliability in a mapping trial.

Authors:  Anna Vikström; Ylva Skånér; Lars-Erik Strender; Gunnar H Nilsson
Journal:  BMC Med Inform Decis Mak       Date:  2007-05-02       Impact factor: 2.796

