
Lessons Learned for Identifying and Annotating Permissions in Clinical Consent Forms.

Elizabeth E Umberfield, Yun Jiang, Susan H Fenton, Cooper Stansbury, Kathleen Ford, Kaycee Crist, Sharon L R Kardia, Andrea K Thomer, Marcelline R Harris.

Abstract

BACKGROUND: The lack of machine-interpretable representations of consent permissions precludes the development of tools that act on permissions across information ecosystems at scale.
OBJECTIVES: To report the process, results, and lessons learned while annotating permissions in clinical consent forms.
METHODS: We conducted a retrospective analysis of clinical consent forms. We developed an annotation scheme following the MAMA (Model-Annotate-Model-Annotate) cycle and evaluated interannotator agreement (IAA) using observed agreement (Ao), weighted kappa (κw), and Krippendorff's α.
RESULTS: The final dataset included 6,399 sentences from 134 clinical consent forms. Complete agreement was achieved for 5,871 sentences, including 211 positively identified and 5,660 negatively identified as permission-sentences across all three annotators (Ao = 0.944, Krippendorff's α = 0.599). These values reflect moderate to substantial IAA. Although permission-sentences contain a set of common words and structure, disagreements between annotators are largely explained by lexical variability and ambiguity in sentence meaning.
CONCLUSION: Our findings point to the complexity of identifying permission-sentences within clinical consent forms. We present our results in light of lessons learned, which may serve as a launching point for developing tools for automated permission extraction.
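The abstract reports IAA with Krippendorff's α over binary permission / non-permission labels from three annotators. A minimal sketch of how this metric could be computed is below; the function name and data layout are illustrative, not the authors' actual pipeline, and it assumes complete (no missing) nominal annotations.

```python
from collections import Counter
from itertools import permutations

def krippendorff_alpha_nominal(units):
    """Krippendorff's alpha for nominal labels with complete data.

    `units` is a list of tuples; each tuple holds the labels that all
    annotators assigned to one unit (e.g. one sentence).
    """
    # Build the coincidence matrix: each ordered pair of labels from
    # different annotators within a unit contributes 1/(m-1).
    o = Counter()
    for labels in units:
        m = len(labels)
        if m < 2:
            continue
        for c, k in permutations(labels, 2):
            o[(c, k)] += 1 / (m - 1)
    n_total = sum(o.values())
    # Marginal totals per label value.
    marg = Counter()
    for (c, _), w in o.items():
        marg[c] += w
    # Observed and expected disagreement (nominal distance: 0/1).
    d_o = sum(w for (c, k), w in o.items() if c != k) / n_total
    d_e = sum(marg[c] * marg[k]
              for c in marg for k in marg if c != k) / (n_total * (n_total - 1))
    return 1 - d_o / d_e

# Toy data: 1 = permission-sentence, 0 = not. Three annotators per sentence.
labels = [(1, 1, 1), (0, 0, 0), (0, 0, 1), (0, 0, 0), (1, 0, 1)]
print(krippendorff_alpha_nominal(labels))
```

As the paper's numbers illustrate (high Ao but α ≈ 0.6), α corrects for chance agreement, so a heavily imbalanced label distribution (few permission-sentences among many non-permissions) can yield high raw agreement yet only moderate α.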


Year:  2021        PMID: 34161986      PMCID: PMC8221844          DOI: 10.1055/s-0041-1730032

Source DB:  PubMed          Journal:  Appl Clin Inform        ISSN: 1869-0327            Impact factor:   2.762


  8 in total

1.  Inductive creation of an annotation schema for manually indexing clinical conditions from emergency department reports.

Authors:  Wendy W Chapman; John N Dowling
Journal:  J Biomed Inform       Date:  2005-08-22       Impact factor: 6.317

2.  Association of Electronic Surgical Consent Forms With Entry Error Rates.

Authors:  J Jeffery Reeves; Kristin L Mekeel; Ruth S Waterman; Lisa R Rhodes; Brian J Clay; Bryan M Clary; Christopher A Longhurst
Journal:  JAMA Surg       Date:  2020-08-01       Impact factor: 14.766

3.  Replacing Paper Informed Consent with Electronic Informed Consent for Research in Academic Medical Centers: A Scoping Review.

Authors:  Cindy Chen; Pou-I Lee; Kevin J Pain; Diana Delgado; Curtis L Cole; Thomas R Campion
Journal:  AMIA Jt Summits Transl Sci Proc       Date:  2020-05-30

4.  The measurement of observer agreement for categorical data.

Authors:  J R Landis; G G Koch
Journal:  Biometrics       Date:  1977-03       Impact factor: 2.571

5.  A Biomedical Research Permissions Ontology: Cognitive and Knowledge Representation Considerations.

Authors:  Jihad Obeid; Davera Gabriel; Iain Sanderson
Journal:  Proc Gov Technol Inf Policies (2010)       Date:  2010-12

6.  An investigation of the efficacy of electronic consenting interfaces of research permissions management system in a hospital setting.

Authors:  Kapil Chalil Madathil; Reshmi Koikkara; Jihad Obeid; Joel S Greenstein; Iain C Sanderson; Katrina Fryar; Jay Moskowitz; Anand K Gramopadhye
Journal:  Int J Med Inform       Date:  2013-06-10       Impact factor: 4.046

7.  Readability of Invasive Procedure Consent Forms.

Authors:  Adam E M Eltorai; Syed S Naqvi; Soha Ghanian; Craig P Eberson; Arnold-Peter C Weiss; Christopher T Born; Alan H Daniels
Journal:  Clin Transl Sci       Date:  2015-12-17       Impact factor: 4.689

8.  Electronic Health Records in Danish Home Care and Nursing Homes: Inadequate Documentation of Care, Medication, and Consent.

Authors:  Morten Hertzum
Journal:  Appl Clin Inform       Date:  2021-01-13       Impact factor: 2.342

