Niklas Kahr Rasmussen1,2, Jonathan Frederik Carlsen3,4, Beth Hærstedt Olsen5, Dorte Stærk3, Trine-Lise Lambine3, Birthe Henriksen3, Maja Rasmussen6, Mattis Jørgensen7, Elisabeth Albrecht-Beste4,8, Lars Konge4,9, Michael Bachmann Nielsen3,4, Leizl Joy Nayahangan9. 1. Department of Radiology, Copenhagen University Hospital, Rigshospitalet, Blegdamsvej 9, DK-2100, Copenhagen, Denmark. niklaskahr@gmail.com. 2. Department of Clinical Medicine, University of Copenhagen, Copenhagen, Denmark. niklaskahr@gmail.com. 3. Department of Radiology, Copenhagen University Hospital, Rigshospitalet, Blegdamsvej 9, DK-2100, Copenhagen, Denmark. 4. Department of Clinical Medicine, University of Copenhagen, Copenhagen, Denmark. 5. Ultrasound Section, Department of Nuclear Medicine and Functional Imaging, Copenhagen University Hospital Hvidovre, Hvidovre, Denmark. 6. Department of Radiology, Copenhagen University Hospital, Bispebjerg and Frederiksberg Hospital, Copenhagen, Denmark. 7. Department of Diagnostic Imaging, Copenhagen University Hospital, North Zealand Hospital, Hillerød, Denmark. 8. Department of Clinical Physiology, Nuclear Medicine and PET, Copenhagen University Hospital, Rigshospitalet, Copenhagen, Denmark. 9. Copenhagen Academy for Medical Education and Simulation, Center for HR and Education, The Capital Region of Denmark, Copenhagen, Denmark.
Abstract
OBJECTIVES: To investigate the validity of the Interventional Ultrasound Skills Evaluation (IUSE) tool for the assessment of procedural competence in ultrasound-guided procedures in a clinical environment, including the establishment of a pass/fail score. METHODS: Novices and experienced radiologists were recruited from four hospitals and were observed and assessed while performing ultrasound-guided procedures. Performances were assessed with the IUSE tool by two independent raters. Validity evidence was gathered in accordance with Messick's framework: response process was ensured by standardising written rater instructions. Internal structure was explored using Cronbach's alpha for internal consistency reliability; inter-rater reliability was calculated as Pearson's r across all ratings, and test-retest reliability was reported using Cronbach's alpha. Relationship to other variables was investigated by comparing the performances of the two groups. Consequences evidence was explored by establishing a pass/fail standard using the contrasting groups method. RESULTS: Six novices and twelve experienced radiologists were enrolled. The IUSE tool had high internal consistency (Cronbach's alpha = 0.96), high inter-rater reliability (Pearson's r = 0.95), and high test-retest reliability (Cronbach's alpha = 0.98). The mean score was 33.28 for novices and 59.25 for experienced radiologists, a highly significant difference (p < 0.001). The pass/fail score was set at 55, resulting in no false positives or false negatives. CONCLUSIONS: Validity evidence from multiple sources supports the use of the IUSE tool for assessing competence in ultrasound-guided procedures in a clinical environment, including in high-stakes assessment such as certification. A credible pass/fail criterion was established to inform decision-making.
KEY POINTS: • A multi-site validity investigation established that the Interventional Ultrasound Skills Evaluation (IUSE) tool can be used to assess procedural competence in ultrasound-guided procedures. • Validity evidence was gathered according to Messick's validity framework from the following sources: response process, internal structure, relationship to other variables, and consequences. • The IUSE tool can be used for both formative and summative assessment, and a credible pass/fail score was established to help inform decision-making such as certification.
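The contrasting groups method referenced in the abstract derives a cut score from the point where the score distributions of the non-competent (novice) and competent (experienced) groups intersect. The sketch below illustrates the idea under a common simplifying assumption, namely that each group's scores are modelled as a normal distribution; the group means are taken from the abstract (33.28 and 59.25), but the standard deviations used in the example are hypothetical, since the abstract does not report them, and the computed cutoff is therefore illustrative only (the study itself set the pass/fail score at 55).

```python
import math

def contrasting_groups_cutoff(mu_nov, sd_nov, mu_exp, sd_exp):
    """Illustrative contrasting groups cut score: the intersection of two
    normal densities fitted to the novice and experienced score
    distributions (a common parametric variant of the method)."""
    if math.isclose(sd_nov, sd_exp):
        # Equal spreads: the densities cross exactly midway between the means.
        return (mu_nov + mu_exp) / 2
    # Otherwise, setting the two normal pdfs equal yields a quadratic in x.
    a = 1 / sd_nov**2 - 1 / sd_exp**2
    b = 2 * (mu_exp / sd_exp**2 - mu_nov / sd_nov**2)
    c = (mu_nov**2 / sd_nov**2 - mu_exp**2 / sd_exp**2
         - 2 * math.log(sd_exp / sd_nov))
    disc = math.sqrt(b * b - 4 * a * c)
    roots = [(-b + disc) / (2 * a), (-b - disc) / (2 * a)]
    # Keep the intersection that lies between the two group means.
    lo, hi = sorted((mu_nov, mu_exp))
    return next(x for x in roots if lo <= x <= hi)

# Group means from the abstract; standard deviations are hypothetical.
cutoff = contrasting_groups_cutoff(33.28, 8.0, 59.25, 8.0)
```

With equal (hypothetical) standard deviations the cutoff reduces to the midpoint of the two means; a smaller spread in the experienced group would pull the cutoff upward, toward the value the study adopted. In practice the method is often applied non-parametrically to the observed score histograms rather than fitted normals.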