Jesper Kørup Jensen1, Liv Dyre2,3, Mattis Enggaard Jørgensen4, Lisbeth Anita Andreasen3, Martin Grønnebaek Tolsgaard3. 1. Department of Anesthesiology and Intensive Care, Odense University Hospital, Odense, Denmark. 2. Center of Fetal Medicine, Department of Obstetrics, Copenhagen University Hospital Rigshospitalet, Copenhagen, Denmark. 3. Copenhagen Academy for Medical Education and Simulation, Rigshospitalet, Copenhagen, Denmark. 4. Department of Radiology, Naestved, Slagelse, Ringsted (NSR) Hospitals, Denmark.
Abstract
OBJECTIVES: The aim of this study was to examine the validity of a simulator test designed to evaluate focused assessment with sonography for trauma (FAST) skills. METHODS: Participants included a group of ultrasound novices (n = 25) and ultrasound experts (n = 10). All participants had their FAST skills assessed using a virtual reality ultrasound simulator. Procedural performance on the 4 FAST windows was assessed by automated simulator metrics, which received a passing or failing score. The validity evidence for these simulator metrics was examined by a stepwise approach according to the Standards for Educational and Psychological Testing. Metrics with validity evidence were included in a simulator test, and the reliability of test scores was determined. Finally, a pass/fail level for procedural performance was established. RESULTS: Of the initial 55 metrics, 34 (61.8%) had validity evidence (P < .01). A simulator test was constructed based on the 34 metrics with established validity evidence, and test scores were calculated as percentages of the maximum score. The median simulator test scores were 14.7% (range, 0%-47.1%) and 94.1% (range, 94.1%-100%) for novices and experts, respectively (P < .001). The pass/fail level was determined to be 79.7%. CONCLUSIONS: The performance of FAST examinations can be assessed in a simulated setting using defensible performance standards, which have both good reliability and validity.
Authors: Mia Louise Østergaard; Lars Konge; Niklas Kahr; Elisabeth Albrecht-Beste; Michael Bachmann Nielsen; Kristina Rue Nielsen Journal: Diagnostics (Basel) Date: 2019-05-06
Authors: Elaine Situ-LaCasse; Josie Acuña; Dang Huynh; Richard Amini; Steven Irving; Kara Samsel; Asad E Patanwala; David E Biffar; Srikar Adhikari Journal: BMC Med Educ Date: 2021-03-20 Impact factor: 2.463