Amal Alsalamah, Rudi Campo, Vasilios Tanos, Gregoris Grimbizis, Yves Van Belle, Kerenza Hood, Neil Pugh, Nazar Amso.
Abstract
BACKGROUND: Ultrasonography is a first-line imaging modality in the investigation of women's irregular bleeding and other gynaecological pathologies, e.g. ovarian cysts and early pregnancy problems. However, teaching ultrasound, especially transvaginal scanning, remains a challenge for health professionals. New technology such as simulation may facilitate and expedite the process of learning ultrasound. Simulation may prove realistic, closely approximating the experience of scanning a real patient, and may objectively assist the development of basic skills such as image manipulation, hand-eye coordination and examination technique.
Keywords: Medical education; ScanTrainer; Transvaginal ultrasonography; Ultrasound; Validation; Virtual reality simulation
Year: 2017 PMID: 28959176 PMCID: PMC5596038 DOI: 10.1186/s10397-017-1020-6
Source DB: PubMed Journal: Gynecol Surg ISSN: 1613-2076
Fig. 1 The ScanTrainer ultrasound simulator consists of (1) a monitor that displays learning content programmed by dedicated learning software and connects to (2) a haptic device, (3) a mouse and (4) a keyboard
Participants’ demographics and ultrasonography experience
| Non-expert | Expert | |
|---|---|---|
| No. of participants (%) | 25 (69%) | 11 (31%) |
| Gender | ||
| Female | 17 (68%) | 7 (64%) |
| Male | 8 (32%) | 4 (36%) |
| Country of practice | ||
| Within UK | 6 (24%) | 3 (27%) |
| Outside UK | 19 (75%) | 8 (73%) |
| Speciality | ||
| Consultant | – | 3 (27%) |
| Obs/Gyn specialist | 2 (8%) | 4 (36%) |
| Specialist trainee | 20 (80%) | 3 (27%) |
| Medical student | 1 (4%) | – |
| Radiographer | – | 1 (10%) |
| Other (midwives) | 2 (8%) | – |
| Median age (range) | 31 (25–39) | 51 (32–67) |
| Years of ultrasound experience | ||
| Never | 3 (12%) | – |
| < 6 months | 5 (20%) | – |
| 6–11 months | 9 (36%) | – |
| 1–2 years | 6 (24%) | 1 (10%) |
| > 2 years | 2 (8%) | 10 (90%) |
| Transvaginal ultrasound experience | ||
| Independent practitioner | 2 (8%) | 11 (100%) |
| Trainee under supervision | 23 (92%) | – |
| Ultrasound sessions | ||
| Never | 4 (16%) | – |
| Daily | 1 (4%) | 5 (46%) |
| Once/week | 9 (36%) | – |
| Once/month | 3 (12%) | 2 (18%) |
| Occasionally | 5 (20%) | 2 (18%) |
| Other | 3 (12%) | 2 (18%) |
| Previous experience with the ScanTrainer® | ||
| Yes | 3 (12%) | 3 (27%) |
| No | 22 (88%) | 8 (73%) |
| Previous experience with an ultrasound phantom, e.g. Blue Phantom™ | ||
| Yes | 4 (16%) | 4 (36%) |
| No | 21 (84%) | 7 (64%) |
Face validity ‘median scores’ ratings by experts and non-experts (n = 36)
| Face validity statements | Expert (n = 11) median score (range) | Non-expert (n = 25) median score (range) | Overall median score (range) | p value |
|---|---|---|---|---|
| Statement 1: Relevance of the simulator for actual transvaginal ultrasound scanning | 7.5 (5.0–10) | 9.0 (7.0–10) | 8.7 (5.0–10) | 0.1 |
| Statement 2: Realism of the simulator to simulate the transvaginal scan of female pelvis | 8.3 (5.0–10) | 8.0 (5.9–10) | 8.1 (5.0–10) | 0.9 |
| Statement 3: Realism of the simulator to simulate the movements possibly required to perform in the female pelvic anatomy (uterus, ovaries/adnexa, POD) | 7.7 (1.0–10) | 9.0 (5.0–10) | 9.0 (1.0–10) | 0.1 |
| Statement 4: Realism of the ultrasound image generated during the performance | 9.0 (1.3–9.8) | 9.0 (6.0–10) | 9.0 (1.3–10) | 0.2 |
| Statement 5: Force feedback provided on the operator’s hand to simulate real scan | 7.5 (3.0–9.5) | 7.5 (2.7–10) | 7.5 (2.7–10) | 0.4 |
| Statement 6: Realism of simulator to provide actual action of all buttons provided in the control panel | 9.0 (1.0–10) | 8.7 (3.0–10) | 9.0 (1.0–10) | 0.5 |
| General statements | ||||
| Statement 13: Overall value of the simulator as a training tool | 9.0 (5.0–10) | 9.3 (6.0–10) | 9.0 (5.0–10) | 0.2 |
| Statement 14: Overall value of the simulator as a testing tool | 9.0 (5.0–10) | 9.5 (5.6–10) | 9.3 (5.0–10) | 0.2 |
Fig. 2 Box plots show the median, first and third quartiles, minimum, maximum and outliers of the scores given by experts and non-experts for the six face validity statements. Dots mark outliers (experts who scored lower than others); the number beside each dot is the participant's code in the data analysis, not a score value
Fig. 3 Box plots show the median, first and third quartiles, minimum, maximum and outliers of the scores given by experts and non-experts for the two general validity statements on the simulator as a training and a testing tool. Dots mark outliers (experts who scored lower than others); the number beside each dot is the participant's code in the data analysis, not a score value
Content validity ‘median scores’ ratings by experts (n = 11)
| Content validity statements | Expert median (range) |
|---|---|
| Statement 7: Realism of the simulator to provide the endometrial thickness measurement in gynaecology task | 8.6 (3.5–10) |
| Statement 8: Realism of the simulator to provide measurements of the ovary in gynaecology task | 8.7 (4.5–10) |
| Statement 9: Ability to test normal gynaecological anatomy: uterus, adnexa and Pouch of Douglas | 8.4 (4.7–10) |
| Statement 10: Ability to test early pregnancy structures: fetus, viability and placenta | 9.0 (5.0–10) |
| Statement 11: Realism of the simulator to provide the CRL measurement in early pregnancy task | 9.0 (4.7–10) |
| Statement 12: Relevance of the simulator’s learning resource, videos and ScanTutor function | 8.7 (5.0–10) |
Fig. 4 Box plots show the median, first and third quartiles, minimum, maximum and outliers of the scores given by experts for the six content validity statements. Dots mark outliers (experts who scored lower than others); the number beside each dot is the participant's code in the data analysis, not a score value