Mia Louise Østergaard, Lars Konge, Niklas Kahr, Elisabeth Albrecht-Beste, Michael Bachmann Nielsen, Kristina Rue Nielsen.
Abstract
Ultrasound examinations require skilled examiners, and simulation-based training can provide standardized and safe skills training. This study aims to give an overview of different types of virtual-reality simulators for abdominal diagnostic ultrasound training in radiology. Fifteen specialized radiologists and radiological residents were presented with two similar cases on each of four simulators for abdominal ultrasound training. All participants filled out a feedback sheet for each individual simulator and an overall comparison sheet. Mean scores were compared, and the simulators were ranked from least to most favorable. One simulator was ranked most favorable in seven of nine questions, but none of the differences in mean scores were statistically significant. All simulators were recommended for training radiologists, and all were perceived to benefit trainees more than experienced ultrasonographers.
Keywords: abdominal; education; radiology; simulation; ultrasound; virtual-reality
Year: 2019 PMID: 31064080 PMCID: PMC6627565 DOI: 10.3390/diagnostics9020050
Source DB: PubMed Journal: Diagnostics (Basel) ISSN: 2075-4418
Demographics.
| Demographics | Radiological Residents | Radiologists | All |
|---|---|---|---|
| Participants (n) | 7 | 8 | 15 |
| Female | 3 | 2 | 5 |
| Male | 4 | 6 | 10 |
| Mean (SD) | 13.9 (17.5) | 1093 (491.7) | - |
Figure 1. Simulators in the Study’s Setting.
The Simulators.
| Simulator | MedaPhor | Schallware | Simbionix | CAE Healthcare |
|---|---|---|---|---|
| Designation | A | B | C | D |
| Model | ScanTrainer | Ultrasound Simulator | U/S Mentor | CAE Vimedix |
| Location | Cardiff, UK | Berlin, Germany | Tel Aviv, Israel | Sarasota, Florida, US |
| Hardware | A hard drive, two screens, a floor-mounted haptic device, keyboard, and mouse with rollerball. One screen displays the image, buttons, and help features; the other displays the virtual patient and probe | A hard drive, two touch screens, keyboard with a rollerball, probes, and a sensor table with a mannequin torso. One screen mimics ultrasound buttons and one shows the scan image | A hard drive, one touch screen with screen divisions, keyboard, probes, and a sensor table with a mannequin torso. The split screen displays the scan image on one side, and buttons plus an anatomical helping model on the other | A hard drive, one split screen, keyboard, mouse, probes, and a sensor table with a mannequin torso. The screen displays the scan image, buttons, and help features |
| Price range | €20,000 to €90,000 | €20,000 to €65,000 | 50,000 to 100,000 USD | Not disclosed |
| Cases | About 250 | About 250 | 8 diagnostic and 11 FAST | 10 cases |
| Customizable cases | Yes; customers can upload their own scans from patients | Yes, with additional equipment | All cases can be customized for severity and/or presence of pathology | All cases can be customized for severity and/or presence of pathology |
| Image data source | Real patient CT scans blended with computer data | Real patient ultrasound B-mode scans | Computer-generated data | Computer-generated data |
| Assessment features | ScanTutor tests the user against set metrics; metrics can be customized. Full diagnostic list for each case | Regions of interest (RoIs), with the option of turning RoIs into questions/answers. Full diagnostic list for each case | Case severity feature/multiple scenarios per case. Skill tasks. Full diagnostic list for each case | Performance assessments with kinematic metrics. Full diagnostic list for each case |
| Doppler | Computer-animated Doppler, including CFM, PW, and CW. Breathing patient | Pre-recorded Doppler | Computer-animated Doppler | Computer-animated Doppler |
| Other modules | OBGYN | Obstetrics/Gynecology | Basic sonography skills | Obstetrics/Gynecology |
Simulator information, obtained from intelligentultrasoundsimulation.com, schallware.de, simbionix.com, and cae.com.
Comparative Questionnaires.
[Overall ranking table (MedaPhor, Schallware, Simbionix, CAE Vimedix): the smiley images could not be extracted from this copy. For each question, the table showed a collective placing per simulator, with separate placings for residents and experienced radiologists underneath.]
Overall ranking with first, second, third, and fourth place represented by green, yellow, orange, and red smileys, respectively. The collective ranking is shown (large smiley), with the ranking by group underneath (small smileys). If ranking scores are identical, two simulators share the respective placing; e.g., if two simulators share first place, no second place is listed.
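The tie-handling described in the caption (shared placings, with the following place skipped) corresponds to standard competition ranking. A minimal sketch with hypothetical mean scores, assuming higher scores rank better:

```python
def competition_rank(scores):
    """Standard competition ranking ('1224'): tied entries share a place,
    and the place(s) consumed by a tie are skipped."""
    # Sort simulators from highest to lowest score (higher = better placing)
    ordered = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    ranks, place = {}, 0
    for i, (name, score) in enumerate(ordered):
        if i == 0 or score != ordered[i - 1][1]:
            place = i + 1  # new place; earlier ties consume the skipped slots
        ranks[name] = place
    return ranks

# Hypothetical scores: two simulators tie for first, so no second place is awarded
print(competition_rank({"A": 3.6, "B": 3.6, "C": 3.1, "D": 2.9}))
# {'A': 1, 'B': 1, 'C': 3, 'D': 4}
```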
Comparing Mean Scores by Groups.
| Comparing Mean Scores by Groups | Radiological Residents | Radiologists | p-Value |
|---|---|---|---|
| Simulator A | | | |
| Educational Value (SD) | 3.1 (1.2) | 2.9 (1.2) | >0.05 |
| Fidelity (SD) | 2.9 (0.9) | 3.3 (1.3) | >0.05 |
| Usability (SD) | 3.1 (0.9) | 3.5 (1.3) | >0.05 |
| Overall Satisfaction (SD) | 2.9 (1.4) | 2.9 (1.6) | >0.05 |
| Simulator B | | | |
| Educational Value (SD) | 4.0 (0.6) | 3.8 (0.8) | >0.05 |
| Fidelity (SD) | 3.8 (0.8) | 4.1 (0.5) | >0.05 |
| Usability (SD) | 4.1 (0.8) | 3.9 (0.9) | >0.05 |
| Overall Satisfaction (SD) | 3.8 (1.0) | 3.9 (0.7) | >0.05 |
| Simulator C | | | |
| Educational Value (SD) | 3.4 (1.3) | 1.3 (1.0) | >0.05 |
| Fidelity (SD) | 3.2 (0.8) | 3.2 (0.9) | >0.05 |
| Usability (SD) | 3.8 (0.7) | 4.0 (0.7) | >0.05 |
| Overall Satisfaction (SD) | 3.8 (1.0) | 3.3 (0.9) | >0.05 |
| Simulator D | | | |
| Educational Value (SD) | 3.2 (1.1) | 2.9 (1.0) | >0.05 |
| Fidelity (SD) | 2.6 (0.7) | 3.0 (1.1) | >0.05 |
| Usability (SD) | 3.5 (0.5) | 3.2 (1.5) | >0.05 |
| Overall Satisfaction (SD) | 3.2 (1.3) | 3.0 (1.2) | >0.05 |
Mean scores for each group on the individual simulator questionnaires. Values refer to a Likert scale from 1 (least favorable answer) to 5 (most favorable answer).
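The record does not state which statistical test produced the p-values above. As a minimal sketch under that caveat, "mean (SD)" entries like those in the table, plus Welch's t statistic for a two-group comparison of Likert scores, could be computed from raw responses; all scores below are hypothetical:

```python
from math import sqrt
from statistics import mean, stdev

def mean_sd(scores):
    """Format Likert scores as 'mean (SD)', matching the tables above."""
    return f"{mean(scores):.1f} ({stdev(scores):.1f})"

def welch_t(a, b):
    """Welch's t statistic for two independent samples (unequal variances).
    Note: the authors' actual test is not named in this record."""
    va, vb = stdev(a) ** 2, stdev(b) ** 2
    return (mean(a) - mean(b)) / sqrt(va / len(a) + vb / len(b))

# Hypothetical 1-5 Likert responses for the two groups (n = 7 and n = 8)
residents = [3, 4, 2, 3, 4, 3, 3]
radiologists = [3, 2, 4, 3, 3, 2, 3, 3]
print(mean_sd(residents))  # "3.1 (0.7)"
print(round(welch_t(residents, radiologists), 2))
```

A small |t| like this one corresponds to p > 0.05, consistent with the non-significant differences reported throughout the table.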
Comparing Means for All Scores Combined.
| Comparing Means | Simulator A | Simulator B | Simulator C | Simulator D |
|---|---|---|---|---|
| Educational Value | | | | |
| Benefits for novices | 3.8 (1.2) | 4.7 (0.5) | 4.4 (0.7) | 4.2 (0.9) |
| Benefits for intermediates | 3.1 (1.5) | 4.3 (1.0) | 3.3 (1.6) | 2.9 (1.5) |
| Benefits for advanced | 2.1 (1.1) | 2.7 (1.2) | 2.2 (1.3) | 2.0 (1.1) |
| Fidelity | | | | |
| Scans resemble real-life images | 3.5 (1.2) | 4.8 (0.4) | 2.5 (1.2) | 2.5 (1.2) |
| Probe resembles real-life use | 2.5 (1.5) | 3.7 (1.1) | 4.0 (1.2) | 3.9 (1.0) |
| Knobs resemble real-life use | 3.1 (1.6) | 3.5 (1.3) | 3.2 (1.4) | 2.1 (1.2) |
| Usability | | | | |
| Intuitive use | 3.7 (1.3) | 4.4 (0.9) | 4.5 (0.6) | 4.2 (1.3) |
| Useful instructions | 3.1 (1.2) | 3.9 (1.1) | 3.9 (0.9) | 3.1 (1.4) |
| Useful feedback | 3.1 (1.1) | 3.7 (1.0) | 3.3 (1.1) | 2.7 (1.2) |
| Overall Satisfaction | | | | |
| Easy to use | 3.3 (1.5) | 4.3 (0.9) | 4.5 (0.6) | 4.1 (1.5) |
| Overall feel | 2.9 (1.6) | 3.7 (1.0) | 3.7 (1.2) | 3.0 (1.5) |
| Recommend it to department | 2.4 (1.4) | 3.6 (1.1) | 2.6 (1.5) | 2.3 (1.3) |
Means compared using collective scores for all participants from the individual simulator questionnaires. Values (SD) refer to a Likert scale from 1 (least favorable answer) to 5 (most favorable answer).