Pau Xiberta, Imma Boada, Santiago Thió-Henestrosa, Salvador Pedraza, Víctor Pineda.
Abstract
The risk of contagion and the lockdown caused by the COVID-19 pandemic forced a change in teaching methodologies in radiology. New knowledge about the disease that was being acquired on a daily basis needed to be rapidly spread worldwide, but the restrictions imposed made it difficult to share this information. This paper describes the methodology applied to design and launch a practice-based course on chest X-ray suggestive of COVID-19 right after the pandemic started, and aims to determine whether asynchronous online learning tools for radiology education are useful and acceptable to general practitioners and other medical personnel during a pandemic. The study was carried out from April to October 2020 and involved 2632 participants. Pre- and post-testing was used to assess the participants' gain of knowledge in the course content (paired t-tests and chi-squared tests of independence). A five-point Likert scale questionnaire inspired by the technological acceptance model (TAM) was provided to evaluate the e-learning methodology (ANOVA tests). The results from the pre- and post-tests showed that there were significant differences in the scores before and after completing the course (sample size = 2632, response rate = 56%, p<0.001). As for the questionnaire, all questions surpassed 4.5 out of 5, including those referring to perceived ease of use and perceived usefulness, and no significant differences were found between experienced and inexperienced participants (sample size = 2535, response rate = 53%, p=0.85). The analysis suggests that the applied methodology is flexible enough to adapt to complex situations, and is useful to improve knowledge on the subject of the course. Furthermore, a wide acceptance of the teaching methodology is confirmed for all technological profiles, endorsing a more widespread use of online platforms in the domain of radiology continuing education.
Keywords: Asynchronous online e-learning; COVID-19 pandemic; chest X-ray (CXR); continuing medical education (CME); radiology training
Year: 2022 PMID: 36066086 PMCID: PMC9467610 DOI: 10.1080/10872981.2022.2118116
Source DB: PubMed Journal: Med Educ Online ISSN: 1087-2981
Topics of the course on chest radiology suggestive of COVID-19.
| Topic | Theory pages | Exercises |
|---|---|---|
| Presentation | 1 | 3 |
| Introduction to chest X-ray | 5 | 2 |
| Basics on chest X-ray of COVID-19 | 2 | 2 |
| Chest X-ray findings of COVID-19 | 15 | 12 |
| Indications and radiological monitoring | 2 | 3 |
| Standardised radiology report and score of severity stratification | 2 | 3 |
| Final Quiz | 1 | 3 |
Figure 1. Number of participants enrolled in the course per week.
Figure 2. The (a) first, (b) second and (c) third exercise used to perform the pre- and post-test for the evaluation of the acquired knowledge.
Participants’ questionnaire to evaluate the e-learning methodology.
| Question |
|---|
| Age |
| How many online courses have you participated in, aside from this one? [0, 1 to 3, > 3] |
| Which device have you used preferentially to take the course? [Computer, Tablet, Others] |
| Do you think that the smartphone version of the platform is useful? [Yes, No] |
| (Q01) Globally, I favourably evaluate the course |
| (Q02) I would recommend this teaching methodology to my teammates |
| (Q03) It was easy for me to interact with images |
| (Q04) It was easy for me to access and navigate through the content pages |
| (Q05) It was easy for me to access and navigate through the exercises |
| (Q06) It was easy for me to identify each icon with its function |
| (Q07) The topics in which the course was structured are appropriate |
| (Q08) The course content met my expectations |
| (Q09) The balance between exercises and theory was appropriate |
| (Q10) Participating in this activity will allow me to improve elements of my daily work |
Figure 3. Distribution of participants by country of origin (the area of the bubble is proportional to the number of participants).
Number (n) and percentage (%) of participants within each of the analysed categories (only participants who provided the information are considered).
| Category | Group | n | % |
|---|---|---|---|
| Professional category | Physician – Radiologist | 775 | 29.45% |
| | Physician – Not Radiologist | 1215 | 46.16% |
| | Other Healthcare Providers | 642 | 24.39% |
| Age | < 24 | 81 | 4.64% |
| | 24 to 29 | 381 | 21.83% |
| | 30 to 50 | 950 | 54.44% |
| | > 50 | 333 | 19.08% |
| Gender | Female | 1324 | 67.93% |
| | Male | 625 | 32.07% |
| Region | Catalonia | 1923 | 73.62% |
| | Spain | 339 | 12.98% |
| | Other | 350 | 13.40% |
| Device used to follow the course | Computer | 2169 | 85.93% |
| | Smartphone | 194 | 7.69% |
| | Tablet | 161 | 6.38% |
| Online courses previously attended | None | 680 | 28.95% |
| | Few (1 to 3) | 728 | 30.99% |
| | Many (> 3) | 941 | 40.06% |
Figure 4. Distribution of participants by (a) professional category, (b) device used to follow the course, and (c) number of online courses previously attended. For all charts, n represents the number of participants who provided the corresponding information in the course registration form or in the evaluation questionnaire.
Comparison of independently evaluated pre- and post-test exercises.
| Exercise | Test | mean | sda | nb |
|---|---|---|---|---|
| Exercise 1 | Pre-test | 0.53 | 0.50 | 2632 |
| | Post-test | 0.70 | 0.46 | 2632 |
| Exercise 2 | Pre-test | 0.81 | 0.39 | 2632 |
| | Post-test | 0.91 | 0.29 | 2632 |
| Exercise 3 | Pre-test | 0.09 | 0.29 | 2632 |
| | Post-test | 0.07 | 0.26 | 2632 |
a Standard deviation
b Number of participants
Comparison of pre- and post-test exercises considered as a whole.
| | Test | mean | sda | nb |
|---|---|---|---|---|
| All exercises | Pre-test | 1.44 | 0.71 | 2632 |
| | Post-test | 1.68 | 0.61 | 2632 |
a Standard deviation
b Number of participants
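The abstract states that paired t-tests were used to compare pre- and post-test scores. As a minimal illustrative sketch (not the authors' code), the paired t statistic can be computed directly from before/after scores; the participant-level data are not part of this record, so the two arrays below are invented for illustration only:

```python
import math

def paired_t(pre, post):
    """Paired (dependent-samples) t statistic for before/after scores."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    # Sample variance of the paired differences (n - 1 denominator)
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)
    return mean_d / math.sqrt(var_d / n)

# Hypothetical 0/1 exercise scores for ten participants (invented for
# illustration; the study itself involved 2632 participants)
pre = [0, 1, 0, 0, 1, 0, 1, 0, 0, 1]
post = [1, 1, 1, 0, 1, 1, 1, 0, 1, 1]
t_stat = paired_t(pre, post)
```

With the study's sample size (n = 2632), even the modest mean gains reported above (e.g. 1.44 to 1.68 for all exercises) yield the p < 0.001 result quoted in the abstract.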
Scores of the post-test exercises by age.
| Age | Exercise 1 correct | Exercise 1 incorrect | Exercise 2 correct | Exercise 2 incorrect | Exercise 3 correct | Exercise 3 incorrect |
|---|---|---|---|---|---|---|
| < 24 | 47 (58%) | 34 (42%) | 71 (87.7%) | 10 (12.3%) | 3 (3.7%) | 78 (96.3%) |
| 24 to 29 | 269 (70.6%) | 112 (29.4%) | 349 (91.6%) | 32 (8.4%) | 22 (5.8%) | 359 (94.2%) |
| 30 to 50 | 708 (74.5%) | 242 (25.5%) | 876 (92.2%) | 74 (7.8%) | 74 (7.8%) | 876 (92.2%) |
| > 50 | 225 (67.6%) | 108 (32.4%) | 294 (88.3%) | 39 (11.7%) | 25 (7.5%) | 308 (92.5%) |
Scores of the post-test exercises by professional category.
| Professional category | Exercise 1 correct | Exercise 1 incorrect | Exercise 2 correct | Exercise 2 incorrect | Exercise 3 correct | Exercise 3 incorrect |
|---|---|---|---|---|---|---|
| Physician – Radiologist | 598 (77.2%) | 177 (22.8%) | 693 (89.4%) | 82 (10.6%) | 103 (13.3%) | 672 (86.7%) |
| Physician – Not Radiologist | 884 (72.8%) | 331 (27.2%) | 1119 (92.1%) | 96 (7.9%) | 51 (4.2%) | 1164 (95.8%) |
| Other Healthcare Providers | 361 (56.2%) | 281 (43.8%) | 580 (90.3%) | 62 (9.7%) | 32 (5%) | 610 (95%) |
Scores of the post-test exercises by number of online courses previously attended.
| Number of online courses previously attended | Exercise 1 correct | Exercise 1 incorrect | Exercise 2 correct | Exercise 2 incorrect | Exercise 3 correct | Exercise 3 incorrect |
|---|---|---|---|---|---|---|
| None | 465 (68.4%) | 215 (31.6%) | 617 (90.7%) | 63 (9.3%) | 51 (7.5%) | 629 (92.5%) |
| Few (1 to 3) | 517 (71%) | 211 (29%) | 658 (90.4%) | 70 (9.6%) | 61 (8.4%) | 667 (91.6%) |
| Many (> 3) | 667 (70.9%) | 274 (29.1%) | 861 (91.5%) | 80 (8.5%) | 54 (5.7%) | 887 (94.3%) |
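The abstract also mentions chi-squared tests of independence. As an illustrative sketch (not the authors' code), the statistic can be computed by hand from the correct/incorrect counts for the first post-test exercise by prior-courses group, taken from the table above:

```python
def chi_squared_stat(table):
    """Chi-squared statistic of independence for an r x c contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / total
            stat += (observed - expected) ** 2 / expected
    return stat

# First post-test exercise, (correct, incorrect) counts by number of
# online courses previously attended (values from the table above)
table = [[465, 215],   # None
         [517, 211],   # Few (1 to 3)
         [667, 274]]   # Many (> 3)
stat = chi_squared_stat(table)
# df = (3 - 1) * (2 - 1) = 2; the 5% critical value for 2 df is 5.99,
# so a statistic well below that is consistent with independence
```

The nearly identical percentages across the three groups (68.4%, 71%, 70.9% correct) produce a small statistic, i.e. no evidence that prior e-learning experience affected post-test performance on this exercise.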
Summary of the responses of the e-learning methodology evaluation questionnaire.
| Category | mean | sda | nb |
|---|---|---|---|
| Global evaluation | 4.62 | 0.60 | 2484 |
| Usability | 4.57 | 0.57 | 2470 |
| Content | 4.53 | 0.63 | 2460 |
a Standard deviation
b Number of participants
E-learning methodology evaluation by age.
| Age | Global evaluation: mean | sda | nb | Usability: mean | sda | nb | Content: mean | sda | nb |
|---|---|---|---|---|---|---|---|---|---|
| < 24 | 4.58 | 0.61 | 78 | 4.52 | 0.52 | 80 | 4.48 | 0.61 | 81 |
| 24 to 29 | 4.57 | 0.64 | 379 | 4.59 | 0.56 | 376 | 4.53 | 0.62 | 373 |
| 30 to 50 | 4.66 | 0.53 | 938 | 4.62 | 0.50 | 936 | 4.57 | 0.59 | 928 |
| > 50 | 4.70 | 0.53 | 329 | 4.57 | 0.56 | 330 | 4.59 | 0.59 | 328 |
a Standard deviation
b Number of participants
E-learning methodology evaluation by professional category.
| Professional category | Global evaluation: mean | sda | nb | Usability: mean | sda | nb | Content: mean | sda | nb |
|---|---|---|---|---|---|---|---|---|---|
| Physician – Radiologist | 4.57 | 0.64 | 610 | 4.51 | 0.62 | 674 | 4.46 | 0.69 | 600 |
| Physician – Not Radiologist | 4.63 | 0.58 | 1152 | 4.58 | 0.54 | 716 | 4.56 | 0.61 | 1149 |
| Other Healthcare Providers | 4.65 | 0.59 | 722 | 4.57 | 0.56 | 933 | 4.55 | 0.63 | 711 |
a Standard deviation
b Number of participants
E-learning methodology evaluation by number of online courses previously attended.
| Number of online courses previously attended | Global evaluation: mean | sda | nb | Usability: mean | sda | nb | Content: mean | sda | nb |
|---|---|---|---|---|---|---|---|---|---|
| None | 4.61 | 0.63 | 673 | 4.55 | 0.61 | 606 | 4.52 | 0.67 | 661 |
| Few (1 to 3) | 4.63 | 0.58 | 723 | 4.59 | 0.55 | 1142 | 4.54 | 0.60 | 718 |
| Many (> 3) | 4.62 | 0.59 | 932 | 4.60 | 0.55 | 722 | 4.54 | 0.63 | 927 |
a Standard deviation
b Number of participants
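The questionnaire comparisons in the abstract used ANOVA. Since this record reports only group means, SDs, and sizes, a sketch of the one-way F statistic computed from those summaries is shown below (values for the global-evaluation scores by prior-courses group, taken from the table above; this reconstruction is an illustration, not the authors' analysis):

```python
def one_way_f(means, sds, ns):
    """One-way ANOVA F statistic computed from per-group summary statistics."""
    k = len(means)
    n_total = sum(ns)
    grand_mean = sum(m * n for m, n in zip(means, ns)) / n_total
    # Between-groups and within-groups sums of squares
    ss_between = sum(n * (m - grand_mean) ** 2 for m, n in zip(means, ns))
    ss_within = sum((n - 1) * s ** 2 for s, n in zip(sds, ns))
    return (ss_between / (k - 1)) / (ss_within / (n_total - k))

# Global evaluation by number of online courses previously attended
# (means, SDs, and group sizes from the table above)
f_stat = one_way_f(means=[4.61, 4.63, 4.62],
                   sds=[0.63, 0.58, 0.59],
                   ns=[673, 723, 932])
# For df = (2, 2325) the 5% critical value is about 3.0; an F far below it
# is consistent with the "no significant differences" finding in the abstract
```

The near-identical group means (4.61, 4.63, 4.62) make the between-groups variance negligible relative to the within-groups variance, which is how the reported acceptance holds across all experience profiles.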