
Development of a student grading rubric and testing for interrater agreement in a doctor of chiropractic competency program.

Krista Ward, Kathy Kinney, Rhina Patania, Linda Savage, Jamie Motley, Monica Smith.   

Abstract

OBJECTIVE: Clinical competency is integral to the doctor of chiropractic program and is mandated by the Council on Chiropractic Education accreditation standards. Because these meta-competencies are assessed through open-ended tasks, achieving interrater agreement among multiple graders can be challenging. We developed a new analytic rubric for a clinical case-based education program and tested its interrater agreement.
METHODS: Clinical educators and research staff collaborated on rubric development and testing over four phases. Phase 1 tailored existing institutional rubrics to the new clinical case-based program using a 4-level scale of proficiency. Phase 2 tested the performance of the pilot rubric using 16 senior intern assessments graded by four instructors using pre-established grading keys. Phases 3 and 4 refined and retested rubric versions 1 and 2 on 16 and 14 assessments, respectively.
RESULTS: Exact, adjacent, and pass/fail agreements between six pairs of graders were reported. The pilot rubric achieved 46% average exact, 80% average adjacent, and 63% pass/fail agreements. Rubric version 1 yielded 49% average exact, 86% average adjacent, and 70% pass/fail agreements. Rubric version 2 yielded 60% average exact, 93% average adjacent, and 81% pass/fail agreements.
CONCLUSION: Our results are similar to those of other rubric interrater reliability studies. Interrater reliability improved with later versions of the rubric, likely attributable to rater learning and rubric refinement. Future studies should focus on concurrent validity and on comparing student performance with grade point averages and national board scores.
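The three agreement measures reported in the RESULTS can be illustrated with a short computation. This is a minimal sketch, not code from the study: the grader scores and the passing threshold of level 3 below are hypothetical, assuming exact agreement means identical rubric levels, adjacent agreement means levels within one step of each other, and pass/fail agreement means both graders fall on the same side of the passing threshold.

```python
# Sketch of the three interrater agreement measures described above.
# Scores and the pass threshold (level 3) are hypothetical examples,
# not data from the study.

def agreement_rates(rater_a, rater_b, pass_level=3):
    """Return (exact, adjacent, pass/fail) agreement proportions
    for two graders' scores on the same set of assessments."""
    n = len(rater_a)
    # Exact: both graders assigned the identical rubric level.
    exact = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Adjacent: levels differ by at most one step.
    adjacent = sum(abs(a - b) <= 1 for a, b in zip(rater_a, rater_b)) / n
    # Pass/fail: both scores fall on the same side of the threshold.
    pass_fail = sum((a >= pass_level) == (b >= pass_level)
                    for a, b in zip(rater_a, rater_b)) / n
    return exact, adjacent, pass_fail

# Hypothetical scores from two graders on a 4-level proficiency scale (1-4).
grader_1 = [4, 3, 2, 4, 1, 3, 3, 2]
grader_2 = [2, 2, 2, 3, 1, 3, 4, 1]
exact, adjacent, pass_fail = agreement_rates(grader_1, grader_2)
print(f"exact: {exact:.0%}, adjacent: {adjacent:.0%}, pass/fail: {pass_fail:.0%}")
```

In the study, these rates were averaged across six pairs of graders; with more than two raters the same pairwise computation is repeated for each pair and the results averaged.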

Keywords:  Chiropractic; Education; Validation Studies as Topic

Year:  2019        PMID: 30916993      PMCID: PMC6759009          DOI: 10.7899/JCE-18-9

Source DB:  PubMed          Journal:  J Chiropr Educ        ISSN: 1042-5055


Related articles: 6 in total

1.  The feasibility, reliability, and validity of a post-encounter form for evaluating clinical reasoning.

Authors:  Steven J Durning; Anthony Artino; John Boulet; Jeffrey La Rochelle; Cees Van der Vleuten; Bonnie Arze; Lambert Schuwirth
Journal:  Med Teach       Date:  2012       Impact factor: 3.650

2.  Rubrics for clinical evaluation: objectifying the subjective experience.

Authors:  Julie J Isaacson; Annette S Stacy
Journal:  Nurse Educ Pract       Date:  2008-12-10       Impact factor: 2.281

3.  Rubrics 101: a primer for rubric development in dental education.

Authors:  Jean A O'Donnell; Marnie Oakley; Stephan Haney; Paula N O'Neill; David Taylor
Journal:  J Dent Educ       Date:  2011-09       Impact factor: 2.264

4.  Psychometric characteristics of a write-up assessment form in a medicine core clerkship.

Authors:  Jennifer R Kogan; Judy A Shea
Journal:  Teach Learn Med       Date:  2005       Impact factor: 2.414

5.  Developing a viva exam to assess clinical reasoning in pre-registration osteopathy students.

Authors:  Paul Orrock; Sandra Grace; Brett Vaughan; Rosanne Coutts
Journal:  BMC Med Educ       Date:  2014-09-19       Impact factor: 2.463

6.  Interrater reliability: the kappa statistic.

Authors:  Mary L McHugh
Journal:  Biochem Med (Zagreb)       Date:  2012       Impact factor: 2.313

