Kristina Khanduja(1), M Dylan Bould, Viren N Naik, Emily Hladkowicz, Sylvain Boet. Affiliations: 1. Department of Anesthesia, Mount Sinai Hospital, Toronto, Ontario, Canada. 2. Department of Anesthesia, The Children's Hospital of Eastern Ontario, Ottawa, Ontario, Canada. 3. Department of Anesthesiology, The Ottawa Hospital, Ottawa, Ontario, Canada. 4. Clinician Educator, Royal College of Physicians and Surgeons of Canada, Ottawa, Ontario, Canada. 5. Department of Anesthesiology, The Ottawa Hospital Research Institute, Ottawa, Ontario, Canada. 6. The Academy for Innovation in Medical Education, Faculty of Medicine, University of Ottawa, Ottawa, Ontario, Canada.
Abstract
OBJECTIVES: We systematically reviewed the effectiveness of simulation-based education targeting independently practicing qualified physicians in acute care specialties. We also describe how simulation is used for performance assessment in this population. DATA SOURCES: Data sources included MEDLINE, Embase, the Cochrane Database of Systematic Reviews, the Cochrane CENTRAL Database of Controlled Trials, and the National Health Service Economic Evaluation Database. The last date of search was January 31, 2013. STUDY SELECTION: All original research describing simulation-based education for independently practicing physicians in anesthesiology, critical care, and emergency medicine was reviewed. DATA EXTRACTION: Data analysis was performed in duplicate, with further review by a third author in cases of disagreement until consensus was reached. Data extraction focused on effectiveness according to Kirkpatrick's model. For simulation-based performance assessment, tool characteristics and sources of validity evidence were also collated. DATA SYNTHESIS: Of 39 studies identified, 30 focused on the effectiveness of simulation-based education and nine evaluated the validity of simulation-based assessment. Thirteen studies (30%) targeted the lower levels of Kirkpatrick's hierarchy, relying on self-reporting. Simulation was unanimously described as a positive learning experience with perceived impact on clinical practice. Of the 17 remaining studies, 10 used a single-group or "no intervention comparison group" design. The majority (n = 17; 44%) demonstrated both immediate and sustained improvements in educational outcomes. Nine studies reported the psychometric properties of simulation-based performance assessment as their sole objective. These predominantly recruited independent practitioners as a convenience sample to establish whether the tool could discriminate between experienced and inexperienced operators, and concentrated on a single aspect of validity evidence. CONCLUSIONS: Simulation is perceived as a positive learning experience, but evidence to support improved learning is limited. Future research should focus on the optimal modality and frequency of exposure, the quality of assessment tools, and the impact of simulation-based education beyond the individual toward improved patient care.