Literature DB >> 34320964

Intra- and inter-rater reliability of an electronic health record audit used in a chiropractic teaching clinic system: an observational study.

H Stephen Injeyan1, Sheilah Hogg-Johnson2, Sean Abdulla3, Ngai Chow3, Jocelyn Cox3, Anthony Ridding3, Craig Jacobs3.   

Abstract

BACKGROUND: There is a dearth of information about health education clinical file audits in the context of completeness of records and demonstrating program-wide competency achievement. We report on the reliability of an audit instrument used for electronic health record (EHR) audits in the clinics of a chiropractic college in Canada.
METHODS: The instrument is a checklist built within an electronic software application designed to pull data automatically from the EHR. It consists of a combination of 61 objective (n = 20) and subjective (n = 41) elements, representing domains of standards of practice, accreditation, and in-house educational standards. Trained auditors provide responses to the elements, and the software yields scores indicating the quality of the clinical record in each file. A convenience sample of 24 files, drawn randomly from the roster of 22 clinicians, was divided into three groups of eight, each to be completed by one of three auditors in the span of 1 week, at the end of which the files were transferred to another auditor. There were four audit cycles; audits from cycles 1 and 4 were used to assess intra-rater (test-retest) reliability, and audits from cycles 1, 2 and 3 were used to assess inter-rater reliability. Percent agreement (PA) and Kappa statistics (K) were used as outcomes. Scatter plots and intraclass correlation (ICC) coefficients were used to assess standards of practice, accreditation, and overall audit scores.
RESULTS: Across all three auditors, test-retest reliability for objective items was PA 89% and K 0.75, and for subjective items PA 82% and K 0.63. In contrast, inter-rater reliability was moderate, at PA 82% and K 0.59 for objective items and PA 70% and K 0.44 for subjective items. Element-level analysis indicated a wide range of PA and K values, with the inter-rater reliability of many elements rated as poor. ICC calculations indicated moderate reliability for the domains of standards of practice, accreditation, and overall file scores.
CONCLUSION: The file audit process has substantial test-retest reliability and moderate inter-rater reliability. Recommendations are made to improve reliability outcomes. These include modifying the audit checklist with a view to improving the clarity of elements, and enhancing the uniformity of auditor responses through increased training aided by the preparation of an audit guidebook.
© 2021. The Author(s).
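The two agreement statistics reported in the abstract can be illustrated with a short sketch. This is not code from the study; the two raters' checklist responses below are invented for illustration. Percent agreement (PA) is the fraction of identical responses, and Cohen's kappa (K) discounts that agreement by the level expected from chance alone.

```python
def percent_agreement(a, b):
    """Fraction of elements on which two raters gave the same response."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    categories = set(a) | set(b)
    n = len(a)
    po = percent_agreement(a, b)  # observed agreement
    # Chance agreement: product of each rater's marginal proportions, summed
    pe = sum((a.count(c) / n) * (b.count(c) / n) for c in categories)
    return (po - pe) / (1 - pe)

# Hypothetical yes/no (1/0) responses from two auditors on ten checklist elements
rater1 = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
rater2 = [1, 0, 0, 1, 0, 1, 1, 1, 1, 1]

print(round(percent_agreement(rater1, rater2), 2))  # 0.8
print(round(cohens_kappa(rater1, rater2), 2))       # 0.52
```

The gap between the two numbers shows why the paper reports both: the raters agree 80% of the time, but because both mark "yes" frequently, much of that agreement is expected by chance, leaving a kappa in the moderate range.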

Keywords:  Chiropractic; Electronic health record (EHR); File audit; Inter-rater; Reliability; Standards

Year:  2021        PMID: 34320964     DOI: 10.1186/s12913-021-06745-1

Source DB:  PubMed          Journal:  BMC Health Serv Res        ISSN: 1472-6963            Impact factor:   2.655


