Elizabeth A. Henneman, Helene Cunningham, Donald L. Fisher, Karen Plotkin, Brian H. Nathanson, Joan P. Roche, Jenna L. Marquard, Cheryl A. Reilly, Philip L. Henneman.

Elizabeth A. Henneman, PhD, RN, is an associate professor of nursing at the University of Massachusetts Amherst. Her research focuses on the nurse's role in the recovery of error and the early recognition of adverse events. She has developed numerous simulation scenarios and evaluation instruments that have been used in both research and educational settings with nursing students, novice nurses, and experienced nurses.

Helene Cunningham, MS, RN, is a clinical assistant professor of nursing at the University of Massachusetts Amherst and serves as the director of the state-of-the-art Nursing Clinical Simulation Laboratory on the Amherst campus. She is an expert in training nurse educators, staff nurses, and students in using simulation technology.

Donald L. Fisher, PhD, is the department head and professor of mechanical and industrial engineering at the University of Massachusetts Amherst. His research is directed at uncovering why human operators make errors in high-stress environments and at evaluating existing interfaces designed both to reduce errors and to improve the performance of human operators.

Karen Plotkin, PhD, RN, is a clinical assistant professor of nursing at the University of Massachusetts. She has expertise in the development and evaluation of simulation scenarios for undergraduate nursing students.

Brian H. Nathanson, PhD, is the cofounder and chief executive officer of OptiStatim, LLC. His research focuses on critical care medicine, sepsis, benchmarking, and patient safety.

Joan P. Roche, PhD, RN, GCNS-BC, is an associate clinical professor of nursing at the University of Massachusetts Amherst. Her program of research is focused on the relationship between the healthcare system and patient outcomes. She is also involved in studies examining the use of human patient simulation in nursing education.

Jenna L. Marquard, PhD, is an associate professor of industrial engineering at the University of Massachusetts Amherst. Her research interests include developing beha…
Abstract
INTRODUCTION: Human patient simulation has been widely adopted in healthcare education despite little research supporting its efficacy. The debriefing process is central to simulation education, yet alternative evaluation methods that could support giving optimal feedback to students have not been well explored. Eye tracking technology is an innovative method for providing objective evaluative feedback to students after a simulation experience. The purpose of this study was to compare 3 forms of simulation-based student feedback (verbal debrief only, eye tracking only, and combined verbal debrief and eye tracking) to determine the most effective method for improving student knowledge and performance. METHODS: An experimental study using a pretest-posttest design was used to compare the effectiveness of the 3 types of feedback. The subjects were senior baccalaureate nursing students in their final semester enrolled at a large university in the northeast United States. Students were randomly assigned to 1 of the 3 intervention groups. RESULTS: All groups performed better in the posttest evaluation than in the pretest. Certain safety practices improved significantly in the eye-tracking-only group. These criteria were those that required an auditory and visual comparison of 2 artifacts, such as "Compares patient-stated name with name on ID band." CONCLUSIONS: Eye tracking offers a unique opportunity to provide students with objective data about their behaviors during simulation experiences, particularly related to safety practices that involve the comparison of patient-stated data to an artifact such as an ID band. Despite the limitations of current eye tracking technology, there is significant potential for the use of this technology as a method for the study and evaluation of patient safety practices.