Literature DB >> 28716383

Nonspecialist Raters Can Provide Reliable Assessments of Procedural Skills.

Oria Mahmood1, Julia Dagnæs2, Sarah Bube3, Malene Rohrsted4, Lars Konge5.   

Abstract

BACKGROUND: Competency-based learning has become a crucial component of medical education, but despite its advantages, challenges remain. The common perception is that specialist raters are needed to evaluate procedural skills, which is difficult owing to limited faculty time. The aim of this study was to explore the validity of assessments of video-recorded procedures performed by nonspecialist raters.
METHODS: This study was a blinded observational trial. Twenty-three novices (senior medical students) and 9 experienced doctors were video-recorded while each performed 2 flexible cystoscopies on patients. The recordings were anonymized, placed in random order, and then rated by 2 experienced cystoscopists (specialist raters) and 2 medical students (nonspecialist raters). Flexible cystoscopy was chosen because it is a simple procedural skill that is crucial to master in a urology residency program.
RESULTS: The internal consistency of assessments was high, Cronbach's α = 0.93 and 0.95 for nonspecialist and specialist raters, respectively (p < 0.001 for both correlations). The interrater reliability was significant (p < 0.001) with a Pearson's correlation of 0.77 for the nonspecialists and 0.75 for the specialists. The test-retest reliability showed the biggest difference between the 2 groups, 0.59 and 0.38 for the nonspecialist raters and the specialist raters, respectively (p < 0.001).
CONCLUSION: Our study suggests that nonspecialist raters can provide reliable and valid assessments of video-recorded cystoscopies. This could make mastery learning and competency-based education more feasible.
Copyright © 2017 Association of Program Directors in Surgery. All rights reserved.
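For readers unfamiliar with the reliability statistics reported in the abstract, a minimal sketch of how Cronbach's α (internal consistency) and Pearson's r (interrater reliability) are computed. All data values below are hypothetical illustrations, not taken from the study:

```python
from statistics import pvariance, mean

def cronbach_alpha(scores):
    """Cronbach's alpha: scores is a list of score lists, one per rater (item).
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    k = len(scores)
    totals = [sum(col) for col in zip(*scores)]       # total score per recording
    item_var = sum(pvariance(s) for s in scores)      # sum of per-rater variances
    return k / (k - 1) * (1 - item_var / pvariance(totals))

def pearson_r(x, y):
    """Pearson product-moment correlation between two raters' scores."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical ratings of 6 recordings by two raters (not study data)
rater1 = [12, 15, 9, 18, 14, 11]
rater2 = [13, 14, 10, 17, 15, 10]
print(round(cronbach_alpha([rater1, rater2]), 2))
print(round(pearson_r(rater1, rater2), 2))
```

Note that for two raters who score similarly, both statistics approach 1; values near the study's reported 0.93-0.95 (α) and 0.75-0.77 (r) indicate high agreement.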

Keywords:  Medical Knowledge; Practice-Based Learning and Improvement; competency-based learning; nonspecialist raters; procedural skills; rater competencies; rater training; reliable assessments

Year:  2017        PMID: 28716383     DOI: 10.1016/j.jsurg.2017.07.003

Source DB:  PubMed          Journal:  J Surg Educ        ISSN: 1878-7452            Impact factor:   2.891


  3 in total

1.  Evaluating competency in video-assisted thoracoscopic surgery (VATS) lobectomy performance using a novel assessment tool and virtual reality simulation.

Authors:  Katrine Jensen; Henrik Jessen Hansen; René Horsleben Petersen; Kirsten Neckelmann; Henrik Vad; Lars Borgbjerg Møller; Jesper Holst Pedersen; Lars Konge
Journal:  Surg Endosc       Date:  2018-09-17       Impact factor: 4.584

2.  Are we generating more assessments without added value? Surgical trainees' perceptions of and receptiveness to cross-specialty assessment.

Authors:  Sarah Burm; Stefanie S Sebok-Syer; Julie Ann Van Koughnett; Christopher J Watling
Journal:  Perspect Med Educ       Date:  2020-08

3.  Neurosurgical Operative Videos: An Analysis of an Increasingly Popular Educational Resource.

Authors:  Joshua D Knopf; Rahul Kumar; Michael Barats; Paul Klimo; Frederick A Boop; L Madison Michael; Jonathan E Martin; Markus Bookland; David S Hersh
Journal:  World Neurosurg       Date:  2020-09-02       Impact factor: 2.104
