Literature DB >> 35520372

Hands-On Time in Simulation-Based Ultrasound Training - A Dose-Related Response Study.

Oria Mahmood, Rikke Jørgensen, Kristina Nielsen, Lars Konge, Lene Russell.

Abstract

Purpose: Point-of-care ultrasound (POCUS) is widely used, but the sensitivity and specificity of the findings are highly user-dependent. There are many different approaches to ultrasound training. The aim of this study was to explore the effects of hands-on practice when learning POCUS.

Methods: Junior doctors with no or limited ultrasound experience were included in the study and divided into three groups. All completed a Focused Assessment with Sonography for Trauma (FAST) course with different amounts of hands-on practice: 40 minutes (n=67), 60 minutes (n=12), and 90 minutes (n=27). At the end of the course, all completed a previously validated test.

Results: More hands-on time improved the mean test scores and decreased the test time. The scores of the 40-, 60-, and 90-minute groups were 11.6 (SD 2.1), 12.8 (SD 2.5), and 13.7 (SD 2.5), respectively (p<0.001). The 90-minute group completed the test significantly faster than the other two groups (20 versus 26 minutes, p=0.003). A large inter-individual variation was seen.

Conclusion: The necessary amount of hands-on training is unknown. This study demonstrates that performance increases with prolonged hands-on time, but the inter-individual variation among trainees is very large, making it impossible to define the "optimal" time. This supports the use of mastery learning, where each individual trainee can continue training until proficiency is reached.

© The Author(s). This is an open-access article published by Thieme under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives License (https://creativecommons.org/licenses/by-nc-nd/4.0/), permitting copying and reproduction as long as the original work is given appropriate credit. Contents may not be used for commercial purposes or adapted, remixed, transformed, or built upon.


Keywords:  education; medical; ultrasound

Year:  2022        PMID: 35520372      PMCID: PMC9064453          DOI: 10.1055/a-1795-5138

Source DB:  PubMed          Journal:  Ultrasound Int Open        ISSN: 2199-7152


Introduction

The use of point-of-care ultrasound (POCUS), or bedside ultrasound, has expanded across many medical and surgical specialties [1] and is viewed as an essential skill for the new generation of physicians [2]. However, the sensitivity and specificity of ultrasound findings are highly operator-dependent [1, 3], and exposure to ultrasound during clinical training may give a false sense of competence [4]. This could potentially put patients at risk. In fact, the Emergency Care Research Institute (ECRI) identifies incorrect usage of ultrasound as one of the top ten health technology hazards and therefore recommends that protocols for the training and examination of ultrasound users follow established guidelines and recommendations [5].

One of the most common POCUS protocols is the Focused Assessment with Sonography for Trauma (FAST), which has become an established tool in trauma management [6]. The FAST examination consists of four simple standardized sonographic views: pericardial, perihepatic, perisplenic, and pelvic. The purpose is to identify intraperitoneal or pericardial free fluid (which in the trauma setting would most often be due to bleeding). Previous studies have reported steep learning curves when learning POCUS, including FAST [7, 8]. Ultrasound training curricula are typically based on either performing a certain number of scans or training for a pre-specified amount of time [9, 10, 11]. However, the amount of hands-on training necessary to ensure competency in a given POCUS protocol is not known. We therefore designed this study to further explore the amount of hands-on practice necessary to learn a specific POCUS protocol: the FAST exam.

Materials and Methods

Junior doctors (<12 months of experience as medical doctors) completing their first postgraduate year in different departments were included in this study. All had no or very limited ultrasound experience (no experience with unsupervised ultrasound scans; none had performed more than five supervised scans). They attended an ultrasound course in a controlled environment at a University Hospital Simulation Center. The course consisted of a theoretical lecture followed by hands-on practice on two different virtual-reality simulators and on up to three healthy volunteers. All trainees performed the same examinations on the same two simulators. All training sessions were conducted by the same team, which included two specialist doctors (one anesthesiologist and one radiologist) with many years of clinical ultrasound experience, including the FAST protocol. The hands-on training was performed under close supervision.

Before starting the course, the junior doctors were separated into three groups. All completed the same course except for the amount of hands-on practice: the first group had 40 minutes, the second group 60 minutes, and the last group 90 minutes (Table 1). During the practice, the trainees performed several supervised FAST examinations on healthy volunteers, followed by training on two different virtual-reality simulators: the Schallware (Ultrasound Simulator Station 128) and the Simbionix U/S Mentor (3D Systems). Both simulators can be used for diagnostic abdominal ultrasound training; the main difference is that the Schallware simulator uses recorded films of real patient scans, whereas the Simbionix simulator uses computer-generated illustrations [12]. The simulators allow trainees to perform real-time dynamic FAST examinations with positive findings (i.e., free fluid) on lifelike mannequins with a mock ultrasound probe.

Table 1 Baseline and course characteristics.

                                      40-minute group   60-minute group   90-minute group
Number                                67                12                27
Age (years)                           28 (26–33)        28 (26–33)        28.3 (25–33)
Female, n (%)                         43 (64%)          8 (67%)           17 (74%)
Course details:
Theoretical introduction (minutes)    10                20                20
Hands-on time, volunteers (minutes)   20                30                45
Hands-on time, simulators (minutes)   20                30                45

Data are expressed as number (percentage) or, for age, median (range). All participants had <12 months of experience as medical doctors and had no experience with unsupervised ultrasound scans; none had performed more than five supervised scans.


The test

A validated FAST test [13] was used to compare performance between the three groups. The test was performed on the Schallware ultrasound simulator and consisted of five consecutive complete FAST examinations, i.e., 20 different ultrasound views giving a maximum of 20 points, with a score of 14 points required to pass (a previously established pass/fail score). The novices had a maximum of six minutes to complete each FAST examination in the test.
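The scoring scheme described above can be sketched as a small function. This is a minimal illustration of the arithmetic only (5 examinations × 4 views = 20 points, pass at 14); the function and variable names are illustrative and not taken from the validated test itself.

```python
# Sketch of the test's scoring scheme: five complete FAST examinations,
# four views each, one point per correctly acquired/interpreted view,
# and a previously established pass/fail standard of 14 points.
VIEWS_PER_EXAM = 4
N_EXAMS = 5
PASS_SCORE = 14


def score_test(correct_views_per_exam):
    """Return (total score, passed) for a list of five per-exam view counts."""
    assert len(correct_views_per_exam) == N_EXAMS
    assert all(0 <= v <= VIEWS_PER_EXAM for v in correct_views_per_exam)
    total = sum(correct_views_per_exam)
    return total, total >= PASS_SCORE


# A trainee who gets 3, 3, 2, 3, 3 views right scores exactly 14 and passes.
score, passed = score_test([3, 3, 2, 3, 3])
```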

Data analysis

The test scores and times of the three groups were compared using analysis of variance (ANOVA) followed by pairwise comparisons with independent-samples t-tests. Levene's test was used to compare the variances of the three groups. Differences were considered statistically significant when p<0.05. The statistical analyses were performed with IBM SPSS Statistics (IBM Corp., released 2011, version 20.0). We used GraphPad Prism 6.00 (USA) to create the graphs.
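The same analysis pipeline (one-way ANOVA, Levene's test for equality of variances, and pairwise independent-samples t-tests) can be reproduced in open-source tooling. The sketch below uses SciPy on synthetic scores, not the study's data; the authors used SPSS, and group values here are purely illustrative.

```python
# Illustrative re-implementation of the described analysis with scipy.stats.
# The score lists are synthetic placeholders, not the study's measurements.
from scipy import stats

g40 = [11, 12, 10, 13, 11, 12, 14, 9]   # hypothetical scores, 40-minute group
g60 = [12, 13, 11, 15, 12, 14]          # hypothetical scores, 60-minute group
g90 = [14, 13, 15, 12, 16, 14, 13]      # hypothetical scores, 90-minute group

# One-way ANOVA across the three groups
f_stat, p_anova = stats.f_oneway(g40, g60, g90)

# Levene's test comparing the group variances
w_stat, p_levene = stats.levene(g40, g60, g90)

# Pairwise comparison (shown here for the 40- vs. 90-minute groups)
t_4090, p_4090 = stats.ttest_ind(g40, g90)

ALPHA = 0.05  # significance threshold used in the study
print(f"ANOVA p={p_anova:.3f}, Levene p={p_levene:.3f}, "
      f"40 vs 90 min t-test p={p_4090:.3f} (significant if < {ALPHA})")
```

In practice the three pairwise t-tests would each be run this way; with multiple comparisons, a correction such as Bonferroni could also be considered, though the paper does not report one.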

Ethical considerations

All participants were given verbal and written information about the study, and all signed an informed consent form. None of the junior doctors included in the study worked at the same institution as the authors. The collected data were anonymized. The study was exempt from ethical approval according to Danish legislation.

Results

In total, 106 junior doctors (Table 1) attended the course and performed the test. The mean test score increased with longer hands-on training time: 11.6 (SD 2.1) in the 40-minute group, 12.8 (SD 2.5) in the 60-minute group, and 13.7 (SD 2.5) in the 90-minute group (p<0.001) (Table 2).

Table 2 Test performance after participation in the course.*

                             40-minute group   60-minute group   90-minute group
Mean score                   11.6 (2.1)        12.8 (2.5)        13.7 (2.5)
Median score                 12 (10–13)        12.5 (10–16)      13 (12–15)
Range                        7–17              9–16              10–19
Mean time (min)              26.3 (5.5)        25.9 (6.6)        20.3 (4.3)
Mean score/minute (1/min)    0.46 (0.15)       0.5 (0.19)        0.7 (0.21)
Passed test*                 12 (18%)          4 (33%)           11 (41%)

Data are expressed as mean (standard deviation), median (interquartile range), or number (percentage). *All participants performed a validated test with a maximum score of 20 and a pass/fail score of 14 [13].

There were no significant differences between the 40-minute and 60-minute groups (p=0.09) or between the 60-minute and 90-minute groups (p=0.26). However, the difference in mean test score between the 40-minute and 90-minute groups was significant (p<0.001). In the 40-minute group, 18% of the junior doctors passed the test; in the 60-minute group, 33% passed; and in the 90-minute group, 41% passed (Table 2), p=0.05. Despite the higher mean score and higher pass percentage in the groups with more hands-on time, the interindividual variation was very large in all three groups (Fig. 1).
Fig. 1

Scatter plot for the three groups. The dots represent every novice score within the groups. The thickened line marks the mean test score for each group.

As seen in Table 2, the total time the participants spent performing the test was lower in the 90-minute group than in the other two groups (20 versus 26 minutes, p=0.003). Accordingly, the 90-minute group had a significantly higher test score per minute (0.7 versus 0.5, p<0.001), but as Fig. 2 shows, the variation within the group was large.
Fig. 2

Test score per minute in the three groups with different hands-on time. The central bar in the box represents the median score per minute, the box represents the interquartile range, and the whiskers represent the range.


Discussion

This study showed that more hands-on time led to a higher mean test score, i.e., more hands-on training led to better overall performance in a shorter amount of time. This is no surprise. The important point demonstrated by this study, however, is that even though the mean test score increases with hands-on training time, the interindividual variation is very large (Fig. 1 and Fig. 2). It would be expected that intraindividual variation is substantial for trainees in the first stages of their training, as proposed by Fitts and Posner's model of skill acquisition, in which performance passes through three sequential stages: 1) the cognitive stage, during which the trainee develops a mental picture and a fuller understanding of the required action; 2) the associative stage, during which the trainee physically practices the action; and 3) the autonomous stage, during which the trainee learns to carry out the skill with little conscious effort [14]. Initially there is rapid improvement in performance, followed by a more gradual phase, and the speed with which an individual learner passes through these phases varies greatly. As shown in this study, time in itself does not ensure proficiency, but it does improve performance in general.

Training in the medical field is expensive, and learning programs should be both competence-generating and cost-effective [15]. Furthermore, insufficient training, especially in user-dependent modalities such as ultrasound, can be a hazard. The best way to ensure that trainees achieve acceptable levels of performance and diagnostic accuracy remains controversial [7, 8, 9, 16]. Several factors have been shown to facilitate the learning of motor skills, such as observational learning, an external focus of attention, feedback, and self-controlled practice. These variables are effective for a variety of reasons, but no single factor has been shown to be superior [10, 11].
It is relatively clear that both observing and physically practicing a task are necessary to learn it [17]. The efficacy of simulation-based skills training is well documented [18], and studies also show a sustained effect of simulation-based training on clinical performance [19]. The main take-home message of this study is that practice time in itself cannot be used as a measure of competence when learning point-of-care ultrasound. This finding aligns with a learning curve study by Gustafsson et al., which found that the training time needed to reach a plateau varied widely among 38 orthopedic surgery trainees practicing hip fracture surgery on a simulator, with an average of 169 minutes (95% confidence interval 152–187 minutes) [20]. Similarly, a fixed number of performed procedures does not ensure proficiency, as illustrated by Barsuk et al. when comparing resident physicians' baseline simulated clinical skills (central venous catheter insertion, lumbar puncture, paracentesis, and thoracentesis) to their self-reported procedure experience [21]. Nevertheless, the European Federation of Societies for Ultrasound in Medicine and Biology (EFSUMB) still proposes a minimum number of scans as a training requirement [22].

Mastery learning is a break from earlier physician training, in which acquisition and maintenance of clinical competence were based on clinical experience alone [23]. It is an approach to competency-based education in which trainees acquire knowledge and skills to fixed achievement standards, without limiting the time needed to reach proficiency. Importantly, mastery learning results show little or no variation, whereas educational time can vary among learners. The concept of mastery learning requires a validated test with a credible pass/fail standard to assess competence [13]. It increases professional self-efficacy and translates into improved patient care practices and patient safety outcomes [24].
This study supports the concept that mastery learning is the optimal method to ensure competence, also when learning ultrasound. Newer guidelines on endoscopic ultrasound support this approach by recommending that validated assessment tools be used to ensure that training continues until a predefined level of competence is achieved; no arbitrary number of training procedures is mentioned [25]. For the FAST protocol, as for other POCUS areas, there is currently no standardization of training, and different models are used, including simulation training and live patient-based training. Each model has its own advantages and disadvantages [26], but many ultrasound courses are still arranged within a fixed timeframe, e.g., one day, or around a fixed number of scans. Different trainees learn at different paces, and a certain amount of time or a prespecified number of scans does not guarantee that sufficient competency has been achieved. It is therefore crucial to insist on mastery learning. This is very clear from the results of this study: despite increasing test results in the group with the longest hands-on time, the fraction of trainees actually passing the test was quite low. As with many other point-of-care ultrasound examinations, the result of a FAST examination could potentially have a great impact on the treatment of the patient. The test itself was therefore designed to discriminate between FAST novices and experienced users [12]. Since the trainees in this study had no ultrasound experience before the course, this clearly demonstrates that one short course, even with an increasing amount of hands-on time, is not enough to learn how to use ultrasound in patient management, including the interpretation of clinical findings; this requires more extensive training.

The most important limitation of this study is the different group sizes.
However, sample sizes in medical education research are often small, so the large total number of participants arguably compensates for the unbalanced group sizes [27]. Another limitation is the lack of randomization, although the novices in the three groups were very similar: they were recruited from the same sites, they all lacked ultrasound experience, and they were all within their first year of postgraduate training. All training was conducted by the same team, and all participants performed the same standardized test.

Conclusion

This study demonstrates that scanning performance when learning ultrasound increases with prolonged hands-on time but the interindividual variation among trainees is very large, thereby making it impossible to define the “optimal” time for hands-on training. This supports the use of the concept of mastery learning where each individual trainee must continue training until proficiency is reached.

Notice

This article was changed according to the following Erratum on July 6th 2022.

Erratum

In the above-mentioned article, the affiliations were indicated incorrectly, the corresponding author was changed, and on p. E5, one sentence was incorrect.
References (25 in total; first 10 shown)

1. Shackford SR, Rogers FB, Osler TM, Trabulsy ME, Clauss DW, Vane DW. Focused abdominal sonogram for trauma: the learning curve of nonradiologist clinicians in detecting hemoperitoneum. J Trauma. 1999.
2. Cook DA, Brydges R, Zendejas B, Hamstra SJ, Hatala R. Mastery learning for health professionals using technology-enhanced simulation: a systematic review and meta-analysis. Acad Med. 2013.
3. Tolsgaard MG, Rasmussen MB, Tappert C, Sundler M, Sorensen JL, Ottesen B, Ringsted C, Tabor A. Which factors are associated with trainees' confidence in performing obstetric and gynecological ultrasound examinations? Ultrasound Obstet Gynecol. 2014.
4. Wulf G, Shea C, Lewthwaite R. Motor skill learning and performance: a review of influential factors. Med Educ. 2010.
5. Atkinson P, Bowra J, Lambert M, Lamprecht H, Noble V, Jarman B. International Federation for Emergency Medicine point of care ultrasound curriculum. CJEM. 2015.
6. Tripu R, Lauerman MH, Haase D, Fatima S, Glaser J, Cardarelli C, Scalea TM, Murthi S. Graduating surgical residents lack competence in critical care ultrasound. J Surg Educ. 2017.
7. Mackay FD, Zhou F, Lewis D, Fraser J, Atkinson PR. Can you teach yourself point-of-care ultrasound to a level of clinical competency? Evaluation of a self-directed simulation-based training program. Cureus. 2018.
8. Taylor JA, Ivry RB. The role of strategies in motor learning. Ann N Y Acad Sci. 2012.
9. Tolsgaard MG, Ringsted C, Dreisler E, Nørgaard LN, Petersen JH, Madsen ME, Freiesleben NLC, Sørensen JL, Tabor A. Sustained effect of simulation-based ultrasound training on clinical performance: a randomized trial. Ultrasound Obstet Gynecol. 2015.
10. Tolsgaard MG, Tabor A, Madsen ME, Wulff CB, Dyre L, Ringsted C, Nørgaard LN. Linking quality of care and training costs: cost-effectiveness in health professions education. Med Educ. 2015.

