Literature DB >> 32754701

Smartphone application supplements laparoscopic training through simulation by reducing the need for feedback from expert tutors.

Jose Quezada1, Pablo Achurra1, Domenech Asbun2, Karol Polom3,4, Franco Roviello3, Erwin Buckel1, Martin Inzunza1, Gabriel Escalona1, Nicolas Jarufe1, Julian Varas1.   

Abstract

BACKGROUND: Simulation training is a validated, highly effective tool for learning laparoscopy. Feedback plays a crucial role in motor skills training. We present an app to guide students during advanced laparoscopy simulation training and evaluate its effect on training.
METHODS: A smartphone(iOS)-app was developed. A group of trainees were randomized to use the app (YAPP) or not use the app (NAPP). We used blinded analysis with validated rating scales to assess their performance before and after the training. The number of requests for tutor feedback per session was recorded. Finally, the participants in the YAPP group completed a survey about their experience with the app.
RESULTS: Fifteen YAPP and 10 NAPP completed the training program. There were no statistically significant differences between their skills performance scores (P = .338). The number of tutor feedback requests in the YAPP and NAPP groups was 4 (3-6) and 13 (10-14), respectively (P < .001). All participants in the YAPP group found the app useful.
CONCLUSION: The use of a smartphone app reduces the need for expert tutor feedback without decreasing the degree of skills acquisition.
© 2019 The Authors.

Year:  2019        PMID: 32754701      PMCID: PMC7391878          DOI: 10.1016/j.sopen.2019.05.006

Source DB:  PubMed          Journal:  Surg Open Sci        ISSN: 2589-8450


BACKGROUND

Simulation training through multiple approaches, including virtual reality, bench models, and ex vivo/in vivo labs, has been demonstrated to reduce the learning curve and costs of training in laparoscopic procedures [1]. Expert feedback plays a crucial role in simulation training. However, securing minimally invasive surgeons as expert mentors to provide feedback can be demanding on available time and resources [2], [3]. A low-cost simulation-based training program was previously developed at our institution to teach advanced laparoscopic skills through a 14-session program of ex vivo intracorporeal suturing and hand-sewn anastomosis [4], [5]. The program demonstrated skills acquisition and a high degree of skills transfer to the operating room (OR) [6]. Although this type of training program is successful, its efficiency is limited by the need for expert tutor feedback at every session.

Electronic learning (e-learning) increases knowledge acquisition through a more interactive multimedia experience and reduces the costs of learning [7], [8], [9], [10]. Students can organize their training around personal schedules and learning speeds. Over 85% of residents and medical students report using a mobile computing device to obtain knowledge and to study, and the use of this technology in medical education and clinical practice is growing exponentially [11], [12], [13], [14], [15], [16]. In surgical education, videos have been used efficiently for training, supervising, and self-learning [9], [10]. However, the number of available smartphone applications (apps) is small compared with other medical areas [17], and information about how an app may affect the need for feedback during skills training is scarce.

This study presents the use of a smartphone (iOS-based) app for learning technical aspects of advanced laparoscopy.
Furthermore, the study measures how undergoing a validated training program with additional use of the app affects the need for expert tutor feedback when compared to training without the app.

METHODS

App Development. An Apple iOS smartphone app was developed at our institution to supplement learning during a validated advanced laparoscopy simulation training program (ALSP) [4]. It uses streamed, high-definition videos developed to teach advanced laparoscopic techniques such as intracorporeal suturing, the use of surgical energy devices, and ex vivo small bowel manipulation. During app development, the technical aspects were discussed and standardized through expert consensus. All videos recorded the performance of an expert laparoscopist; experts were defined as having performed at least 50 laparoscopic gastric bypasses using laparoscopic hand-sewn anastomoses in the preceding 6 months. Technique among the experts was standardized both for video production and for subsequent in-person teaching. Postproduction video editing was done with Final Cut Pro 7.0 and Adobe After Effects CS2 for subtitle rendering in English and Spanish.

With a multi-touch layout, the app allows users to navigate two main sections. The first explains the essential techniques needed for intracorporeal suturing and the use of ultrasonic energy devices in small bowel anastomoses. Each video demonstrates in detail the individual components necessary to perform the techniques, such as proper needle positioning, gentle tissue handling, and effective formation of a suture loop for intracorporeal knot tying. All maneuvers are detailed graphically and with written statements explaining each step. The second section offers a series of training sessions that comprise the validated training program (Fig 1). It incorporates the techniques explained in the first section into sequential practice sessions that progressively build upon previously mastered skills. At the completion of all sessions, the trainee is able to successfully perform the task in question: in this case, a laparoscopic hand-sewn jejunojejunostomy (JJO).
Fig 1

A, Two main sections of the app (left), essential techniques needed for intracorporeal suturing (middle) and complete walkthrough of the validated training program (right). B, Steps explained graphically and with written statements.

Resident Inclusion, Training, and Mentoring With and Without the App. Design: A quasi-experimental study was performed over an entire year (March 2016–March 2017). We recruited and trained a group of general surgery residents in our ALSP [4] (Table 1).
Table 1

Advanced laparoscopy simulation program

Module 1 (3 sessions): Bowel selection and implementation of intracorporeal stay sutures.
Module 2 (3 sessions): Repeat module 1 adding the construction of symmetrical enterotomies.
Module 3 (3 sessions): Repeat modules 1 and 2 adding the closing of the posterior layer with continuous suture.
Module 4 (3 sessions): Repeat modules 1, 2 and 3 adding the closing of the anterior layer of the anastomosis.
Module 5 (2 sessions): Perform 3 full anastomoses.
Before initiating the ALSP, all residents were brought to the same baseline technical level by completing a uniform basic laparoscopic training course. This training consisted of skills similar to those in the Fundamentals of Laparoscopic Surgery curriculum, as well as virtual reality-based training of basic laparoscopic skills. We excluded from this experiment residents with previous advanced laparoscopic training or clinical experience in advanced procedures, defined in this study as procedures that regularly require intracorporeal suturing. The trainees underwent an assessment of their ability to perform a JJO at the beginning and at the end of training. As mentioned above, all participants were shown tutorial videos with detailed instructions on how to perform the procedure before the first assessment.

After the basic training, a computer-generated sequence was used to randomize the trainees into two groups. The first group underwent the training curriculum with supplemental use of the app (YAPP), while the second group did not use the app (NAPP). If a novice in the YAPP group did not have an iPhone, the app's videos were always available at the simulation center and online through a website platform. The YAPP group could consult the app freely, and both groups of trainees could ask for expert feedback and instructions whenever they needed. A blinded expert tutor was available at all times in the simulation lab to give feedback when it was explicitly requested. This tutor also volunteered feedback when deemed necessary based on failure to achieve minimum cutoff scores on a global rating scale [4], [6]. The same tutor recorded the total number of times feedback was given to each trainee.
The initial and final assessments were video recorded and assessed by two blinded expert observers using the validated Objective Structured Assessment of Technical Skills (OSATS) rating scale [18]. The cutoff score to pass the program was set at 20 points (of a maximum of 25). When the two observers' scores differed, a third blinded expert determined the final score. The effectiveness of the program in teaching advanced laparoscopic skills had previously been validated, and those results are published elsewhere [4], [6]. Both groups underwent the same validated 14-session training curriculum mentioned above (Table 1). Training sessions were directed by expert tutors and grouped into modules, which were undertaken in sequence. Each module taught new skills of increasing complexity that built on those previously learned. Beginning a new module required successfully completing the previous modules by demonstrating competence in the skills involved. All trainees were shown a video prior to the initial assessment that showcased the proper technique for performing a JJO. After this initial showing, the NAPP group no longer had access to the video; the YAPP group did, along with multiple other videos deconstructing the JJO into individual tasks, mirroring what was taught in person.

Statistical Analysis. Comparisons between initial and final assessments and between groups were done using IBM SPSS version 22, applying nonparametric tests for dependent or independent samples as appropriate (Wilcoxon and Mann-Whitney), with P < .05 considered statistically significant.

App Performance Survey. To explore the learners' appraisal of the app, the YAPP group answered a Likert-scale survey of five questions about the strengths and weaknesses of the app for learning advanced laparoscopic skills [19]. A score of 1 indicated strong disagreement, and 5, strong agreement. Total survey scores ranged from 5 to a maximum of 25 points (Table 2). This study was approved by our institutional review board.
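The Mann-Whitney comparison of independent groups described above can be sketched in a few lines. The analysis in this study was done in SPSS; the sketch below is a minimal illustration of the test's logic, and the feedback counts it uses are hypothetical placeholders chosen to be consistent with the reported medians and ranges (4 [3–6] for YAPP, 13 [10–14] for NAPP), not the study's raw data.

```python
import random

# Hypothetical per-trainee feedback counts (placeholders consistent with the
# reported medians/ranges; NOT the study's raw data).
yapp = [3, 3, 3, 4, 4, 4, 4, 4, 5, 5, 5, 5, 6, 6, 6]   # n = 15
napp = [10, 10, 11, 12, 13, 13, 13, 13, 14, 14]          # n = 10

def mann_whitney_u(a, b):
    """U statistic: number of (a_i, b_j) pairs with a_i < b_j; ties count 0.5."""
    u = 0.0
    for x in a:
        for y in b:
            if x < y:
                u += 1.0
            elif x == y:
                u += 0.5
    return u

def permutation_p(a, b, n_iter=20000, seed=42):
    """Two-sided Monte Carlo permutation p-value for the U statistic."""
    rng = random.Random(seed)
    observed = mann_whitney_u(a, b)
    expected = len(a) * len(b) / 2  # E[U] under the null hypothesis
    pooled = a + b
    n_a = len(a)
    extreme = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)
        u = mann_whitney_u(pooled[:n_a], pooled[n_a:])
        if abs(u - expected) >= abs(observed - expected):
            extreme += 1
    return extreme / n_iter

print(f"U = {mann_whitney_u(yapp, napp)}")
print(f"permutation p ~ {permutation_p(yapp, napp):.4f}")
```

With the placeholder data the two groups are completely separated (every YAPP count is below every NAPP count), so the permutation p-value is essentially zero, mirroring the P < .001 reported for the feedback comparison.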
Table 2

Agree/disagree survey questions

Scale: 1 = Strongly disagree; 2 = Disagree; 3 = Neither agree nor disagree; 4 = Agree; 5 = Strongly agree.

1. The app correctly describes each of the procedures to be performed.
2. The app allows me to correct common errors throughout the program.
3. The app grants greater autonomy during the program, requiring teaching support only if needed.
4. The app should be a permanent complementary educational resource of the program.
5. The app should be known and downloaded by any surgeon who wants to learn laparoscopy.

RESULTS

Twenty-five trainees completed the 14-session training program in an average of 11 weeks (range 4–16): 15 YAPP and 10 NAPP. Although they had variable experience in basic laparoscopic cases such as appendectomy and cholecystectomy, none had previous advanced laparoscopic expertise. There were no significant differences between the groups' baseline laparoscopic skill levels as evaluated during the pretraining assessment (Fig 2). The mean number of previous laparoscopic procedures for the NAPP versus YAPP group was 18 (SD 19) versus 23 (SD 20), P = .284. Only two trainees (2/15; 13%) in the YAPP group did not have an iPhone and watched the instructional videos on an iPad at the simulation center or on a computer at home.
Fig 2

Residents' laparoscopic experience.

Both groups were comparable in mean age and gender and finished their training with improved OSATS scores (P < .001), reflecting the known effectiveness of the program [4]. YAPP and NAPP showed no statistically significant difference in their final scores (P = .338) (Table 3). Moreover, all trainees obtained a score over 20.
Table 3

Results.

YAPP (n = 15) | NAPP (n = 10) | P
Age, mean (SD): 27.5 (3.7) | 26.3 (1.0) | .32
Female, n (%): 5 (33%) | 3 (30%) | .6
Months of residency, mean (SD): 12.3 (5.6) | 10.3 (6.3) | .4
OSATS pretest, median (range): 15 (14–17) | 15 (9–20) | .238
OSATS posttest, median (range): 23 (23–25) | 23 (23–25) | .338
Posttest time, mean (SD): 1358 (141) | 1317 (134) | .48
The YAPP group required less tutor feedback to complete their training. The number of tutor feedback instances needed to complete the training in the YAPP vs NAPP group was 4 (3–6) vs 13 (10–14), respectively (P < .001) (Fig 3).
Fig 3

Box plot comparing the number of tutor feedback instances needed to complete the training in the NAPP vs YAPP group.

Survey. With a mean global score of 23 points (20–25), the survey showed that all participants deemed the app a useful complement to learning advanced laparoscopy. Furthermore, they reported that the app allowed them greater autonomy in the learning process, requesting support only when they could not resolve a complicated situation on their own.

DISCUSSION

Simulation has emerged as one of the most important educational tools for surgical training. It shortens learning curves through deliberate practice, where residents can learn from their errors without compromising patient safety. Feedback from expert surgeons is a crucial factor in improving skill acquisition [20], [21]. However, expert feedback is often scarce, and tutors are frequently unavailable because of high opportunity costs and demanding daily schedules. Most experts have significant clinical commitments and are not primarily dedicated to simulation-based training. The end result is a training program limited in quality and duration, with few training hours and condensed lessons, leading to inefficient use of simulation labs [22], [23].

Technology has developed rapidly in medical education. E-learning videos and lectures decrease the need for live lectures, permitting flexibility and personalized learning [7]. Mobile apps are convenient and easy to access, optimizing the time a trainee can spend learning. The iOS app presented in this study is promising because it further maximizes the benefits of technology during training. This tool reduced the need for expert tutor feedback during a validated 14-session laparoscopic training program and may therefore contribute to more efficient training by decreasing reliance on human resources. Residents trained with the app needed one third as many feedback interventions from experts and achieved skills acquisition comparable to that of non-app users after finishing the simulation training. All trainees obtained passing scores as established in the original publication of the training program [4], with no measurable downside to using the app.
After reviewing our results, we believe a probable explanation for the decreased need for feedback is that most of the questions a novice has during technical training center on which specific maneuver comes next and how that maneuver is successfully performed. An app containing video tutorials such as the one presented in this manuscript may be able to answer these questions. We believe proper training cannot be achieved without effective feedback; therefore, the app is not meant to replace a tutor but instead to optimize training sessions by decreasing how often feedback is required. The app does not provide feedback; it offers guidance that supplements the instruction given by the trainers. Since the app was not designed to eliminate feedback, we were interested to find that it nonetheless affected the overall use of direct feedback from the instructors.

Weaknesses of the survey are that it was not previously validated and that its completion was linked to successful completion of the training program (not merely use of the app). However, the responses uniformly indicate a positive experience, suggesting that trainees find the app a beneficial educational tool. An important limitation of this study is the small group size. However, even with this limited cohort, the analysis showed statistically significant findings; errors associated with an underpowered study are more likely to be type II errors (false negatives), so we believe our findings remain valid. Furthermore, the app was used in a structured, simulation-based training environment. It still needs to be evaluated in a more liberal training environment using a variety of training modalities, including in vivo and ex vivo tissue, scenario-based simulations, and others.
The applicability outside a simulated setting has not yet been studied and should be the aim of future work. A more in-depth investigation is needed to understand exactly how the app complemented training to the point of reducing feedback; we are currently working on this. A new web-based and mobile (iOS and Android) app is being developed, and the results of its implementation will be available in a couple of years. To our knowledge, this is the first study to show that mobile phone apps with detailed tutorial videos can supplement skills acquisition and reduce the need for expert feedback in resident training. We believe that simulation and mobile technology should be further combined to improve training efficacy in future approaches to surgical education.

AUTHOR CONTRIBUTION

Design of Study: Julian Varas, Pablo Achurra. Data Analysis: Jose Quezada. Writing and revision: Jose Quezada, Pablo Achurra, Domenech Asbun, Karol Polom, Franco Roviello, Nicolas Jarufe, Julian Varas. Data Collection: Gabriel Escalona, Martin Inzunza, Erwin Buckel.

CONFLICT OF INTEREST

None of the authors have any conflicts of interest.

FUNDING SOURCES

This study was financed and supported by Chilean Research Grant FONDECYT INICIO 11170108 from CONICYT and by the Department of Digestive Surgery, Faculty of Medicine, Pontificia Universidad Católica de Chile.
References: 20 in total

Review 1.  Assessment and feedback in the skills laboratory and operating room.

Authors:  Colin Sugden; Rajesh Aggarwal
Journal:  Surg Clin North Am       Date:  2010-06       Impact factor: 2.741

2.  E-learning vs lecture: which is the best approach to surgical teaching?

Authors:  I Bhatti; K Jones; L Richardson; D Foreman; J Lund; G Tierney
Journal:  Colorectal Dis       Date:  2011-04       Impact factor: 3.788

3.  Teaching subcuticular suturing to medical students: video versus expert instructor feedback.

Authors:  Stuart H Shippey; Tiffany L Chen; Betty Chou; Leise R Knoepp; Craig W Bowen; Victoria L Handa
Journal:  J Surg Educ       Date:  2011-06-25       Impact factor: 2.891

4.  Surgical Learning Application (app) for Smartphones and Tablets: A Potential Tool for Laparoscopic Surgery Teaching Courses.

Authors:  Fabio Paiz; Eduardo A Bonin; Leandro Totti Cavazzola; Antônio Moris Cury; Christiano M P Claus; Danielson Dimbarre; Marcelo de Paula Loureiro
Journal:  Surg Innov       Date:  2015-07-17       Impact factor: 2.058

5.  Allowing New Opportunities in Advanced Laparoscopy Training Using a Full High-Definition Training Box.

Authors:  Pablo Achurra; Antonia Lagos; Ruben Avila; Rodrigo Tejos; Erwin Buckel; Juan Alvarado; Camilo Boza; Nicolas Jarufe; Julian Varas
Journal:  Surg Innov       Date:  2016-10-11       Impact factor: 2.058

Review 6.  Surgical smartphone applications across different platforms: their evolution, uses, and users.

Authors:  Myutan Kulendran; Marcus Lim; Georgia Laws; Andre Chow; Jean Nehme; Ara Darzi; Sanjay Purkayastha
Journal:  Surg Innov       Date:  2014-04-07       Impact factor: 2.058

7.  Simulation-trained junior residents perform better than general surgeons on advanced laparoscopic cases.

Authors:  Camilo Boza; Felipe León; Erwin Buckel; Arnoldo Riquelme; Fernando Crovari; Jorge Martínez; Rajesh Aggarwal; Teodor Grantcharov; Nicolás Jarufe; Julián Varas
Journal:  Surg Endosc       Date:  2016-05-02       Impact factor: 4.584

Review 8.  e-Learning in Surgical Education: A Systematic Review.

Authors:  Nithish Jayakumar; Oliver Brunckhorst; Prokar Dasgupta; Muhammad Shamim Khan; Kamran Ahmed
Journal:  J Surg Educ       Date:  2015-06-22       Impact factor: 2.891

9.  The benefit of repetitive skills training and frequency of expert feedback in the early acquisition of procedural skills.

Authors:  Hans Martin Bosse; Jonathan Mohr; Beate Buss; Markus Krautter; Peter Weyrich; Wolfgang Herzog; Jana Jünger; Christoph Nikendei
Journal:  BMC Med Educ       Date:  2015-02-19       Impact factor: 2.463

10.  Smartphone and medical related App use among medical students and junior doctors in the United Kingdom (UK): a regional survey.

Authors:  Karl Frederick Braekkan Payne; Heather Wharrad; Kim Watts
Journal:  BMC Med Inform Decis Mak       Date:  2012-10-30       Impact factor: 2.796

