PURPOSE: To summarize the tool characteristics, sources of validity evidence, methodological quality, and reporting quality for studies of technology-enhanced simulation-based assessments for health professions learners. METHOD: The authors conducted a systematic review, searching MEDLINE, EMBASE, CINAHL, ERIC, PsycINFO, Scopus, key journals, and previous reviews through May 2011. They selected original research in any language evaluating simulation-based assessment of practicing and student physicians, nurses, and other health professionals. Reviewers working in duplicate evaluated validity evidence using Messick's five-source framework; methodological quality using the Medical Education Research Study Quality Instrument and the revised Quality Assessment of Diagnostic Accuracy Studies; and reporting quality using the Standards for Reporting Diagnostic Accuracy and the Guidelines for Reporting Reliability and Agreement Studies. RESULTS: Of 417 studies, 350 (84%) involved physicians at some stage of training. Most focused on procedural skills, including minimally invasive surgery (N=142), open surgery (N=81), and endoscopy (N=67). Common elements of validity evidence included relations with trainee experience (N=306), content (N=142), relations with other measures (N=128), and interrater reliability (N=124). Of the 217 studies reporting more than one element of evidence, most were judged as having high or unclear risk of bias due to selective sampling (N=192) or test procedures (N=132). Only 64% proposed a plan for interpreting the evidence to be presented (a validity argument). CONCLUSIONS: Validity evidence for simulation-based assessments is sparse and is concentrated within specific specialties, tools, and sources of validity evidence. The methodological and reporting quality of assessment studies leaves much room for improvement.