Melissa F Tannenbaum, Anuradha Shenoy-Bhangle, Alexander Brook, Seth Berkowitz, Yu-Ming Chang.
Keywords: COVID-19; Diagnostic radiology; Resident education; Virtual learning; Virtual readouts
Year: 2022 PMID: 35643045 PMCID: PMC9123824 DOI: 10.1016/j.clinimag.2022.05.006
Source DB: PubMed Journal: Clin Imaging ISSN: 0899-7071 Impact factor: 2.420
Fig. 1. Screen sharing (Starleaf). A. Multiple Starleaf sessions accessible through the institutional intranet. The virtual reading rooms were divided by service, with up to 4 different rooms per service. B. After a participant selects a virtual reading room, a screen loads for the participant to enter his/her name and select the option to connect a webcam and microphone. C. Main screen of the virtual reading room. The menu bar at the top offers several features: a list of the participants in the session, an option to launch a text chat box, screen sharing, webcam and microphone settings, and session recording.
Fig. 2. Control sharing (Collaborate). A. The menu bar on every PACS workstation has a link to the Collaborate platform. B. Once the participant clicks the link, a menu launches listing every radiology trainee and attending in the system. The participant selects a single collaborator to invite. C. The invitation appears on the collaborator's PACS station to be accepted. D. Once the invitation is accepted, a shared screen launches with the case on the original participant's screen. The shared screen retains all PACS features and allows both participants to manipulate the images.
Virtual readout survey results.
| Question | n | % |
|---|---|---|
| Level of training | ||
| R1 | 8 | 13% |
| R2 | 7 | 11% |
| R3 | 6 | 9% |
| R4 | 5 | 8% |
| Fellow | 6 | 9% |
| Faculty | 32 | 50% |
| Total | 64 | |
| Gender | ||
| Female | 33 | 50% |
| Male | 28 | 42% |
| Prefer not to disclose | 5 | 8% |
| Total | 66 | |
| Level of satisfaction with virtual readout strategies | ||
| Screen sharing (Starleaf) | ||
| Did not use | 5 | 8% |
| Extremely satisfied | 15 | 23% |
| Very satisfied | 12 | 18% |
| Moderately satisfied | 19 | 29% |
| Slightly satisfied | 13 | 20% |
| Not at all satisfied | 2 | 3% |
| Total | 66 | |
| Control sharing (Collaborate) | ||
| Did not use | 7 | 11% |
| Extremely satisfied | 7 | 11% |
| Very satisfied | 9 | 14% |
| Moderately satisfied | 14 | 22% |
| Slightly satisfied | 22 | 34% |
| Not at all satisfied | 6 | 9% |
| Total | 66 | |
| Telephone | ||
| Did not use | 6 | 9% |
| Extremely satisfied | 5 | 8% |
| Very satisfied | 14 | 21% |
| Moderately satisfied | 18 | 27% |
| Slightly satisfied | 19 | 29% |
| Not at all satisfied | 4 | 6% |
| Total | 66 | |
| Learning as effective with virtual readouts as with in-person readouts ||
| Screen sharing (Starleaf) | ||
| Did not use | 4 | 6% |
| Strongly agree | 11 | 17% |
| Agree | 20 | 30% |
| Neutral | 6 | 9% |
| Disagree | 17 | 26% |
| Strongly disagree | 8 | 12% |
| Total | 66 | |
| Control sharing (Collaborate) | ||
| Did not use | 6 | 9% |
| Strongly agree | 9 | 14% |
| Agree | 16 | 25% |
| Neutral | 8 | 12% |
| Disagree | 15 | 23% |
| Strongly disagree | 11 | 17% |
| Total | 66 | |
| Telephone | ||
| Did not use | 4 | 6% |
| Strongly agree | 4 | 6% |
| Agree | 11 | 17% |
| Neutral | 12 | 19% |
| Disagree | 24 | 37% |
| Strongly disagree | 10 | 15% |
| Total | 66 | |
| Technical difficulties (e.g., system crashes, trouble loading the platform, slowing/freezing of software) ||
| Screen sharing (Starleaf) | ||
| Did not use | 5 | 8% |
| Strongly agree | 4 | 6% |
| Agree | 7 | 11% |
| Neutral | 13 | 20% |
| Disagree | 22 | 33% |
| Strongly disagree | 15 | 23% |
| Total | 66 | |
| Control sharing (Collaborate) | ||
| Did not use | 9 | 14% |
| Strongly agree | 7 | 11% |
| Agree | 18 | 27% |
| Neutral | 9 | 14% |
| Disagree | 16 | 24% |
| Strongly disagree | 7 | 11% |
| Total | 66 | |
| Telephone | ||
| Did not use | 5 | 8% |
| Strongly agree | 1 | 2% |
| Agree | 5 | 8% |
| Neutral | 9 | 14% |
| Disagree | 27 | 42% |
| Strongly disagree | 18 | 28% |
| Total | 66 | |
| Do you agree that the ability for both users to manipulate the PACS images on Collaborate was important for learning/teaching? | ||
| Strongly agree | 24 | 45% |
| Agree | 14 | 26% |
| Neutral | 11 | 21% |
| Disagree | 2 | 4% |
| Strongly disagree | 2 | 4% |
| Total | 53 | |
| In the future, would you support the use of some form of virtual readout combined with in-person readouts? ||
| Yes | 41 | 77% |
| No | 4 | 8% |
| Neutral | 8 | 15% |
| Total | 53 | |
| Support of these virtual programs in the future combined with in-person readouts ||
| Screen sharing (Starleaf) | ||
| Did not use | 3 | 6% |
| Strongly support | 25 | 48% |
| Support | 17 | 33% |
| Neutral | 3 | 6% |
| Do not support | 3 | 6% |
| Strongly do not support | 1 | 2% |
| Total | 52 | |
| Control sharing (Collaborate) | ||
| Did not use | 4 | 8% |
| Strongly support | 13 | 25% |
| Support | 16 | 31% |
| Neutral | 9 | 17% |
| Do not support | 8 | 15% |
| Strongly do not support | 2 | 4% |
| Total | 52 | |
| Telephone | ||
| Did not use | 1 | 2% |
| Strongly support | 11 | 21% |
| Support | 20 | 38% |
| Neutral | 8 | 15% |
| Do not support | 9 | 17% |
| Strongly do not support | 3 | 6% |
| Total | 52 | |
Mean responses by all participants.
| Variable | Mean | Variable comparisons | p-Value |
|---|---|---|---|
| Satisfaction | | | |
| Screen sharing (Starleaf) | 3.41 | Starleaf vs. Collaborate | |
| Control sharing (Collaborate) | 2.81 | Starleaf vs. Telephone | |
| Telephone | 2.95 | Collaborate vs. Telephone | 0.9 |
| Perceived effectiveness | | | |
| Screen sharing (Starleaf) | 3.14 | Starleaf vs. Collaborate | 0.5 |
| Control sharing (Collaborate) | 2.95 | Starleaf vs. Telephone | |
| Telephone | 2.59 | Collaborate vs. Telephone | |
| Experienced technical difficulties | | | |
| Screen sharing (Starleaf) | 2.39 | Starleaf vs. Collaborate | |
| Control sharing (Collaborate) | 3.04 | Starleaf vs. Telephone | 0.2 |
| Telephone | 2.07 | Collaborate vs. Telephone | |
| Support of virtual platform in future | | | |
| Screen sharing (Starleaf) | 4.26 | Starleaf vs. Collaborate | 0.07 |
| Control sharing (Collaborate) | 3.60 | Starleaf vs. Telephone | |
| Telephone | 2.54 | Collaborate vs. Telephone | 0.3 |
Bolded text indicates statistical significance.
Satisfaction means calculated from the following 5-point scale: 5 = Extremely satisfied, 4 = Very satisfied, 3 = Moderately satisfied, 2 = Slightly satisfied, 1 = Not at all satisfied.
Effectiveness and technical-difficulty means calculated from the following 5-point scale: 5 = Strongly agree, 4 = Agree, 3 = Neutral, 2 = Disagree, 1 = Strongly disagree.
Support means calculated from the following 5-point scale: 5 = Strongly support, 4 = Support, 3 = Neutral, 2 = Do not support, 1 = Strongly do not support.
Comparison of responses by trainees and attendings.
| Variable | Trainee (mean) | Attending (mean) | p-Value |
|---|---|---|---|
| Satisfaction | |||
| Screen sharing (Starleaf) | 3.34 | 3.43 | 0.83 |
| Control sharing (Collaborate) | 2.91 | 2.60 | 0.32 |
| Telephone | 2.68 | 3.18 | 0.07 |
| Perceived effectiveness | |||
| Screen sharing (Starleaf) | 3.09 | 3.14 | 0.83 |
| Control sharing (Collaborate) | 2.81 | 3.04 | 0.54 |
| Telephone | 2.26 | 2.86 | |
| Experienced technical difficulties | |||
| Screen sharing (Starleaf) | 2.38 | 2.46 | 0.72 |
| Control sharing (Collaborate) | 3.06 | 3.08 | 0.99 |
| Telephone | 2.13 | 2.04 | 0.71 |
| Ability for both users to manipulate the mouse on Collaborate aids in learning/teaching | 4.07 | 3.95 | 0.95 |
| Support the use of some form of virtual read-out combined with in-person readouts | 2.55 | 2.86 | 0.13 |
| Support of virtual platform in future combined with in-person readout | |||
| Screen sharing (Starleaf) | 4.17 | 4.41 | 0.46 |
| Control sharing (Collaborate) | 3.55 | 3.69 | 0.54 |
| Telephone | 3.03 | 4.32 | |
Bolded text indicates statistical significance.
Satisfaction means calculated from the following 5-point scale: 5 = Extremely satisfied, 4 = Very satisfied, 3 = Moderately satisfied, 2 = Slightly satisfied, 1 = Not at all satisfied.
Agreement means calculated from the following 5-point scale: 5 = Strongly agree, 4 = Agree, 3 = Neutral, 2 = Disagree, 1 = Strongly disagree.
Support for combined virtual/in-person readouts calculated from the following 3-point scale: 3 = Agree, 2 = Neutral, 1 = Disagree.
Representative comments by trainees and attendings.
| | Positive | Negative |
|---|---|---|
| Screen sharing (Starleaf) | Able to review multiple studies at once without needing to start new session for each study. | Poorer resolution, harder to see smaller findings/suboptimal image quality. |
| Control sharing (Collaborate) | Easier to see images and navigate since it is on PACS. | Cannot concurrently use the shared mouse. |
| Telephone | Helpful for quick conversation. | Cannot share screen, which is okay for easier things, but makes showing findings more difficult. |