BACKGROUND: Implementation of a point of care ultrasound curriculum is valuable, but optimal integration into internal medicine residency is unclear. The purpose of this study was to evaluate whether a structured ultrasound curriculum vs. a structured ultrasound curriculum plus supervised thoracic ultrasounds would improve internal medicine residents' skill and retention 6 and 12 months from baseline. METHODS: We conducted a randomized controlled study evaluating internal medicine residents' skill retention of thoracic ultrasound using a structured curriculum (control, n = 14) vs. a structured curriculum plus 20 supervised bedside thoracic ultrasounds (intervention, n = 14). We used stratified randomization based on program year. All subjects attended a half-day course at baseline that included 5 lectures and hands-on sessions. Assessments included written and practical exams at baseline, immediately post-course, and at 6 and 12 months. Scores are reported as the percentage of correct responses out of the number of questions (range 0-100%). The Mann-Whitney U and Friedman tests were used for analyses. RESULTS: Twenty-eight residents were enrolled. Two subjects withdrew prior to the 6-month exams. Written exam scores for all subjects improved: baseline median (IQR) 60 (46.47 to 66.67), post-course 80 (65 to 86.67), 6-month 80 (66.67 to 86.67), and 12-month 86.67 (80 to 88.34), p < 0.001. All subjects' practical exam scores, median (IQR), significantly improved: baseline 18.18 (7.95 to 32.95), post-course 59.09 (45.45 to 70.45), 6-month 71.74 (60.87 to 82.61), and 12-month 76.09 (65.22 to 88.05), p < 0.001. Comparing the control group to the intervention group, practical exam scores at 6 months, median (IQR), were significantly higher in the intervention group: 63.05 (48.92 to 69.57) vs. 82.61 (72.83 to 89.13), p < 0.001.
CONCLUSION: In this cohort, internal medicine residents participating in a structured thoracic ultrasound course plus 20 supervised ultrasounds achieved higher practical exam scores long-term compared to controls.
Educators agree that implementation of point of care ultrasound (POCUS) into internal medicine curricula is valuable, but the optimal integration of a POCUS curriculum into an internal medicine residency that sustains skills is unclear [1, 2]. The utility of an ultrasound workshop combined with a designated curriculum has proven beneficial in minimizing attrition, but did not address the practical acquisition and interpretation of images [3, 4]. A longitudinal ultrasound curriculum has been shown to increase knowledge retention and improve skill acquisition in internal medicine (IM) residents, but skill retention has not been fully quantified [5]. In one study, IM physicians who completed a rigorous training program were able to accurately assess left ventricular systolic function using focused cardiac ultrasound [6]. The purpose of this study was to evaluate thoracic ultrasound skill retention, comparing IM residents completing a standard ultrasound curriculum versus a standard curriculum plus obtaining 20 supervised bedside thoracic ultrasounds.
Materials and methods
We conducted a randomized controlled study from June 2017 to June 2018. All post-graduate year (PGY)-1 and PGY-2 residents attending the ultrasound curriculum were invited to participate. Categorical PGY-1 and PGY-2 IM residents who attended the one-day ultrasound training course and signed the informed consent form were included. Participants were randomized to either the standard ultrasound curriculum (control group) or the standard ultrasound curriculum plus obtaining 20 supervised thoracic ultrasounds at the bedside (intervention group). We used stratified randomization based on program year. To determine group allocation, subjects were assigned using Excel random numbers.

All residents participated in the standard internal medicine residency ultrasound curriculum, which included: a half-day comprehensive course at the beginning of the study comprised of vascular, cardiac and thoracic ultrasound lectures (all lectures were one hour; pathology image review and hands-on scanning of models at a 3:1 student to faculty ratio included one hour of scanning for thoracic ultrasound); five one-hour lectures, on physics, abdominal, vascular, cardiac, and thoracic ultrasound, spanning the academic year; and five concordant (within the same week as the lecture) one-hour hands-on sessions, based on the lecture topics, using standardized patients at our Simulation Center. All course instructors, for the comprehensive course and teaching sessions completed throughout the academic year, were internal medicine faculty, critical care fellows, critical care attending physicians, or ultrasound fellowship-trained emergency medicine physicians.

The intervention group was required to complete 20 additional supervised ultrasound scans of a hemi-thorax on hospitalized patients admitted to the general medical floors or the medical-surgical intensive care unit (ICU).
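The stratified randomization described above can be sketched as follows. This is an illustrative reconstruction, not the study's actual procedure (the study used Excel-generated random numbers), and the resident identifiers are hypothetical:

```python
import random

def stratified_randomize(residents, seed=None):
    """Assign residents to control/intervention arms, stratified by PGY year.

    `residents` is a list of (identifier, pgy_year) tuples. Within each
    program-year stratum, half the residents go to each arm, so the arms
    stay balanced on training level.
    """
    rng = random.Random(seed)
    allocation = {}
    for year in sorted({pgy for _, pgy in residents}):
        stratum = [rid for rid, pgy in residents if pgy == year]
        rng.shuffle(stratum)
        half = len(stratum) // 2
        for rid in stratum[:half]:
            allocation[rid] = "control"
        for rid in stratum[half:]:
            allocation[rid] = "intervention"
    return allocation

# Hypothetical roster: 4 PGY-1 and 4 PGY-2 residents
roster = [(f"R{i}", 1 if i < 4 else 2) for i in range(8)]
groups = stratified_randomize(roster, seed=1)
```

Shuffling within each stratum before splitting it in half guarantees equal arm sizes per program year, which simple unstratified randomization does not.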
Participants completed 10 scans between baseline and the 6-month assessment and 10 additional scans between 6 months and the 12-month assessment. Three study investigators (NT, LM, CN) proficient in thoracic ultrasound supervised the scans. All the essential parts of a comprehensive thoracic ultrasound were reviewed with each subject in the intervention group by one of the three study investigators (NT, LM, CN), and feedback was provided to the participating resident in real time. Proficiency of the instructors was determined by following the Canadian recommendations for critical care ultrasound training and competency in thoracic ultrasound [7]. The control group participants could perform scans at their own discretion; data were not captured for non-study related scans. One of three ultrasound machines (Sonosite M-Turbo, Bothell, WA; Sonosite X-porte, Bothell, WA; Mindray North America, Mahwah, NJ) was used, depending on machine availability.

Assessment for both the intervention and control groups occurred at four intervals: immediately before the half-day ultrasound course at the beginning of the study, immediately after the half-day course, at 6 months, and at 12 months. Assessment consisted of both a written test and a practical test designed by the study investigators (NP, LM, KC, CN). In this study, competency with POCUS is consistent with a conceptual framework that includes assessment of the acquisition and application of POCUS-related knowledge, demonstration of technical skills, and effective integration into clinical practice [8]. The written and practical evaluations were developed based on the International Thoracic Ultrasound Guidelines and the American College of Chest Physicians Guidelines [9-11].
Prior to the study start, the assessment was reviewed and edited by 10 volunteers, including a novice (< 1 year of experience), clinicians proficient in thoracic ultrasound (> 3 years of experience), and clinicians advanced in thoracic ultrasound (regional and national faculty for POCUS). The written assessment included 15 multiple-choice questions. Eight questions sought a diagnosis based on a projected still image or short video clip, five questions sought management understanding based on a projected still image or clip, and two questions assessed general ultrasound knowledge without corresponding images (S1 Fig). The practical assessment was five minutes in length and required the participant to demonstrate 23 different competencies in thoracic ultrasound imaging (Table 1). We chose these ultrasound skills because they are clinically applicable for internists [9].
Table 1
Thoracic ultrasound competencies.
Image acquisition and identification of anatomy
1. Identify muscle wall
2. Identify pleural line
3. Identify pleural sliding
4. Identify rib shadows
5. Identify muscle wall in M mode
6. Identify pleural line in M mode
7. Identify pleural sliding in M mode
8. Identify a-lines
9. Identify b-lines if present
10. Identify z-lines
11. Identify lung pulse if present
12. Identify presence or absence of lung point
13. Identify diaphragm
14. Identify liver
15. Identify spleen
16. Identify caudad vs cephalad
Examine enough sites to rule out pneumothorax and pleural effusion on each side of the thorax
17. Evaluate 4 sites on each side for lung sliding
18. Evaluate for pleural effusion posteriorly bilaterally
Image Optimization: Adjust image to assess pleura/lungs
19. Adjust gain
20. Adjust depth
21. Orient probe marker
22. Operator faces machine
23. Correct transducer setting is chosen on ultrasound machine
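The practical score is the percentage of these 23 items demonstrated. As a minimal sketch of the scoring arithmetic (the item counts below are illustrative, chosen only to show how percentages like those in the results arise):

```python
COMPETENCIES = 23  # items listed in Table 1

def practical_score(items_demonstrated: int) -> float:
    """Percentage of the 23 thoracic ultrasound competencies demonstrated."""
    return round(items_demonstrated / COMPETENCIES * 100, 2)

practical_score(19)  # 19 of 23 items -> 82.61
practical_score(14)  # 14 of 23 items -> 60.87
```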
We completed an analysis over time, examining scores at baseline, immediately post-course, at 6 months, and at 12 months. To determine which statistical test to run, we performed a Shapiro-Wilk test and a test of skewness, which indicated that some of these data were not normally distributed. To keep the statistical testing consistent, we ran all repeated-measures testing with the non-parametric Friedman test. Analyses of separate test scores were performed via the Mann-Whitney U test. Results are presented as median (interquartile range, IQR). Scores are reported as the percentage of correct responses out of the number of questions (range 0-100%). We applied a Bonferroni correction to account for the 14 tests that were run, giving a significance level of p <= 0.004. The Cooper Institutional Review Board reviewed and approved the study.
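The analysis pipeline above can be sketched with SciPy. The scores below are made-up illustrative values, not the study data; the calls mirror the tests named in the text (Friedman for repeated measures, Mann-Whitney U between groups, and the Bonferroni-adjusted significance threshold):

```python
from scipy import stats

# Illustrative within-subject scores at the four assessments (not study data)
baseline  = [60, 53, 47, 67, 60, 40, 66, 53]
post      = [80, 65, 87, 73, 80, 60, 87, 67]
six_mo    = [80, 67, 87, 73, 80, 67, 80, 73]
twelve_mo = [87, 80, 88, 80, 87, 80, 93, 80]

# Friedman test: non-parametric repeated-measures comparison of the same
# subjects across the four time points
friedman_stat, friedman_p = stats.friedmanchisquare(baseline, post, six_mo, twelve_mo)

# Mann-Whitney U: non-parametric between-group comparison at one time point
# (again, made-up control vs. intervention scores)
control      = [63, 49, 70, 63, 48, 69]
intervention = [83, 73, 89, 76, 93, 86]
u_stat, u_p = stats.mannwhitneyu(control, intervention, alternative="two-sided")

# Bonferroni correction for the study's 14 tests
alpha = 0.05 / 14  # ~0.0036, reported in the text as p <= 0.004
```

A comparison is declared significant only when its p-value falls below the Bonferroni-adjusted alpha, which is how the study's p <= 0.004 threshold arises from 0.05 divided across 14 tests.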
Results
Thirty-six residents were invited to participate. Fourteen PGY-1 and fourteen PGY-2 residents were enrolled in May 2017. The intervention group included 11 male and 3 female residents: six PGY-1 and eight PGY-2. The control group included 7 male and 7 female residents: eight PGY-1 and six PGY-2. Twenty-six of the 28 participants completed the study. In the control group, one of the PGY-2 residents voluntarily withdrew consent from the study and another was unable to complete the 6- and 12-month follow-up. All of the intervention group participants completed the required supervised scans at 6 and 12 months. There was a significant difference in written and practical exam scores over time from baseline to 12 months in both the intervention (n = 14, p < 0.001) and control (n = 12, p < 0.001) groups (Fig 1). Written exam scores from baseline to 12 months increased by approximately 26.67 points, and practical exam scores increased by approximately 53.53 points.
Fig 1
Exam scores over time for all participants.
When comparing the intervention group to the control group, no significant differences were observed for any of the four written assessments (Table 2). Counts of correct responses for the individual written exam questions are available in S1 Table. The baseline and immediately post-course practical assessment scores were similar between the two groups. A significant difference in the median (IQR) practical assessment score was observed between the two groups at 6 months: 82.61 (72.83-89.13) intervention vs. 63.05 (48.92-69.57) control, p < 0.001 (Table 3).
Table 2
Written exam scores.
Pre-course score: Control (n = 14) 63.33 (53.33-66.67) vs. Intervention (n = 14) 53.33 (46.67-68.33); p = 0.38; 95% CI -13.33 to 6.67
Post-course score: Control (n = 14) 80 (60-86.67) vs. Intervention (n = 14) 76.67 (63.33-81.67); p = 0.91; 95% CI -13.33 to 13.33
6-month score: Control (n = 12) 73.33 (66.67-80) vs. Intervention (n = 14) 83.34 (70-86.67); p = 0.18; 95% CI -6.66 to 20
12-month score: Control (n = 12) 83.34 (80-86.67) vs. Intervention (n = 14) 86.67 (80-93.33); p = 0.30; 95% CI 0 to 13.33
IQR = Interquartile range
Table 3
Practical assessment scores.
Pre-course score: Control (n = 14) 15.91 (9.09-28.41) vs. Intervention (n = 14) 20.46 (4.55-36.36); p = 0.70; 95% CI -9.09 to 13.64
Post-course score: Control (n = 14) 59.09 (48.86-78.41) vs. Intervention (n = 14) 54.55 (39.77-68.18); p = 0.33; 95% CI -22.72 to 9.09
6-month score: Control (n = 12) 63.05 (48.92-69.57) vs. Intervention (n = 14) 86.36 (76.14-93.18); p < 0.001; 95% CI 13.04 to 34.78
12-month score: Control (n = 12) 65.22 (47.83-81.52) vs. Intervention (n = 14) 76.09 (65.22-88.05); p = 0.12; 95% CI -4.35 to 26.08
IQR = Interquartile range
Discussion
Our competency-based medical education study used a learner-driven process of direct observation by supervisors to achieve practical goals. This model allowed both the teachers and the students to take responsibility for achieving success. An alternative yearlong lecture-based model, even with a handheld ultrasound device (HUD), showed no improvement in knowledge or image interpretation scores [12].

In our study, both the control and intervention groups were able to perform non-study related scans during the study period; these scans were not counted or recorded. The intervention group achieved competency in thoracic ultrasound through investigator-supervised scans, as opposed to the control group, which relied on the traditional didactic educational model.

This single-center study demonstrated that a longitudinal practical curriculum improves internal medicine residents' long-term image acquisition proficiency. Our results are similar to Town et al., who showed that residents who participated in hands-on assessments demonstrated improvement in their ultrasound skills (performing 2-point compression to assess for deep vein thrombosis, identification of the internal jugular vein and inferior vena cava) over the course of a year [13].

The practical assessment scores for the intervention group decreased from the 6-month assessment to the 12-month assessment, indicating that there may have been a decline in knowledge retention. There were no significant differences between the 6-month and 12-month assessment scores (written, p = 0.08; practical, p = 0.123). The 13% drop in the intervention group's practical score is less than prior reports of up to 29% of physicians not being able to replicate their POCUS skillset one year after graduating from an internal medicine residency [14]. This finding is likely due to the intervention group continuing study participation through one year in our study.

We identified several study limitations.
We did not record the number of ultrasounds completed by the control group during the study period. It is unknown whether the group performed any scans during the study period, which may have affected the study results. The randomization process did not equally distribute the PGY-1 and PGY-2 residents between the control and intervention groups, though it is unknown whether this uneven distribution affected study results. Study investigators were the supervisors for the required 20 scans performed by the intervention group. Two investigators (LM and NP) supervised the majority (83.9%) of the proctored scans and performed the majority of standardized assessments. The investigators served as the models and scorers for the practical assessment scans completed by the participants in the intervention group. Different ultrasound machines were used for the practical assessment depending on which of the three was available, potentially resulting in participants having varying degrees of familiarity with the machines. Two participants in the control group did not complete the entire standard ultrasound curriculum, due to withdrawal in one case and personal commitments in the other.

We provided examination dates in advance, possibly allowing the subjects to study prior to the examination; we did not deter subjects from studying at any time during the yearlong study. Use of the same written assessment exam at each time point may have contributed to the improvement in test scores. However, the correct answers were not provided to the subjects, and all tests were numbered and accounted for to ensure that no exams were missing. External experts in ultrasound education reviewed the written and practical assessment exams, but the multiple-choice and practical questions were not validated prior to study start [15, 16]. Additional reliability and validation studies are needed to establish a standardized method to evaluate POCUS images [17].
Conclusions
In this study, an ultrasound curriculum improved internal medicine residents' written knowledge of thoracic point of care ultrasound. The addition of 20 required supervised thoracic ultrasound scans to a standard lecture-based and hands-on ultrasound curriculum improved internal medicine residents' practical thoracic ultrasound assessment scores at 6 months compared to the control group. Proctored scanning is associated with an observed difference in practical skills over time. Future studies should include validation of assessment tools, structured time points for the additional supervised scans, and standard procedures for designing and scoring practical assessments.
Written exam questions.
(DOCX)
Counts of correct responses for the individual written exam questions.
(DOCX)

Peer review history

6 Aug 2020. Decision letter (PONE-D-20-21036: Skill Retention with Ultrasound Curricula, PLOS ONE).

Dear Dr. Puri,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE's publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process. Please submit your revised manuscript by Sep 20 2020.
We look forward to receiving your revised manuscript.

Kind regards,
Ezio Lanza, M.D.
Academic Editor, PLOS ONE

Review Comments to the Author:

Reviewer #1: The authors have tried to investigate skill retention in ultrasound training; this is a laudable exercise, since not much evidence exists in this field. Unfortunately, there is no information regarding the number of ultrasound scans that were performed by the control group. It is even possible that they performed no scans after the training; detailed information about the number of scans in both groups is necessary.

Reviewer #2: This paper presents randomized data on a curriculum of 20 proctored lung ultrasounds compared to the typical standard POCUS curriculum. I think the design is simple and sufficient for its intended goals. Where the paper suffers is mostly in placing this study in the broader context of the IM POCUS curriculum studies that have been done. This is a minor but important point: properly compared to the existing literature, the work will be more meaningful. The statistical methods need to be better described (see below).

Line 71: Probably should include these citations:
https://pubmed.ncbi.nlm.nih.gov/31125075/
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6638610/

Line 116: How does your assessment relate to this framework?
It does seem like it encompasses the domains mentioned (acquisition of knowledge, application of obtained knowledge, demonstration of technical competence, integration into clinical practice, re-certification; the 5th may not apply), but you might explicitly state that it does so. https://pubmed.ncbi.nlm.nih.gov/30924088/

Box 1: Typo in anatomy #5: "in"

Line 131: Methods: Would like the statistical considerations made more clear. For the repeated measures ANOVA this was presumably to compare the same study participants' results over time, and for the independent t tests this was mean test scores between groups; would make this more explicit. Was there any correction for multiple comparisons?

Line 171: "Our competency based medical education study, used a learner": no comma needed.

Lines 186-187: Needs a citation, and the relative % drop should be compared to other studies.

Line 198: The assessment used was not validated, as mentioned. This is unfortunate, especially as validated tools do already exist. Besides the citation listed, also see https://pubmed.ncbi.nlm.nih.gov/22124000/

Line 199: "Use of the same written assessment exam at each time point may have contributed to the improvement in test scores." This needs to be further highlighted as a limitation and may actually invalidate the results over time. I don't otherwise see how the scores increased from immediately after the intervention to 6 and 12 months. There is no statistical significance between groups, so it does not change the interpretation of results, but it was a flaw in the design. (It is a well-established phenomenon on repeat testing, i.e. MOCA.)

Line 202: The conclusions are wanting. I think the key finding, that proctored scanning is key to the observed difference in practical skill over time, is important. I would put it in context with this recent RCT (https://pubmed.ncbi.nlm.nih.gov/32118565/), which surprisingly found that machine access did not change retention.
The major dogma has been that access to trained faculty and machines are the "barriers". In this study, direct faculty involvement seemed to have a large impact that cannot be replaced by independent practice.

Reviewer #3: Thank you for the opportunity to review this manuscript. Ultrasound curriculum for internal medicine is an important topic, and the authors describe the value of additional scanning for learning thoracic ultrasound. The manuscript is well written and clearly presented. However, the results are only described briefly and original data are lacking. There is also a high risk of bias. I do have some major concerns.

Introduction
Lines 70-72 lack references.

Methods
Line 79: How many residents were invited to participate?
Line 81: What are the inclusion criteria?
Line 82: How were they randomized?
Line 88: Please describe the duration of the thoracic lecture and hands-on scanning in the comprehensive course.
Line 99: Please describe the protocol for the supervised ultrasound scans in relation to the exam scans.
Line 101: When did the residents perform the scans? At which intervals? Or only immediately prior to the assessment?
Line 103: Why were scans reviewed if they were directly supervised?
Line 105: Proficiency of whom?
Line 109: Did the control group perform any scans? How many? Who supervised them?
Line 113: Were assessment dates known in advance? It is well known that students may practice more and study prior to an exam.
Lines 115-126: Please add the exam questions and scoring system to the manuscript. The reader is unable to interpret the results without knowing the scoring system and range.
Line 115: Was assessment of exams equally distributed between the 4 study investigators? For both groups?
How did you manage to make assessment uniform between assessors?

Results
Line 137: A baseline table is lacking, with demographics and especially prior ultrasound experience.
Tables: It would be interesting to provide an additional table with the mean scores for every item; this would tell why the scores are different. Please provide confidence intervals in the tables. Have you tested for normal distribution of scores? Is the mean applicable? There is a rather large SD.

Discussion
Lines 173-175: I don't understand the phrase "at their own pace". If I understand correctly, both groups were able to perform ultrasound during the study period, but the intervention group made an additional 20 supervised scans.
Line 184: Did you compare scores within groups over time? Is the difference for the intervention group between 6 and 12 months significant?
Line 187: A reference is lacking for this statement.
Line 190: How did the difference in randomization influence your results, as you mention this as a limitation?
Line 193: Were exam scans equally distributed between study investigators? And for both groups?
Line 196: How many is some? In which groups? What part did they not complete?
Line 197: What do you mean by external experts? And did they review the exams taken by the intervention group or only during development of the exams?
Line 199: Was the practical exam also performed the same way at each time point? And how did it compare to the additional 20 exams? How many ultrasounds did participants perform during the study period (beside the study protocol)? And for which applications?
22 Sep 2020. Response to reviewers.

1. Reviewer #1: The authors have tried to investigate skill retention in ultrasound training; this is a laudable exercise, since not much evidence exists in this field. Unfortunately, there is no information regarding the number of ultrasound scans that were performed by the control group. It is even possible that they performed no scans after the training; detailed information about the number of scans in both groups is necessary.

Author's response: We are grateful for your review and thoughtful critiques. The protocol did not include the number of scans completed by the control group, as it was not part of our standardized curricula. Both groups were able to perform scans outside of the study protocol. The protocol attempted to reflect usual training versus the intervention of 20 supervised scans. The house staff perform most of their diagnostic ultrasound training during their ICU rotations. Their exposure is dependent on patient population, faculty interest and their own interest. We included the information in the discussion and will consider these important points for future studies.

Reviewer #2: This paper presents randomized data on a curriculum of 20 proctored lung ultrasounds compared to the typical standard POCUS curriculum.
I think the design is simple and sufficient for its intended goals. Where the paper suffers is mostly in placing this study in the broader context of the IM POCUS curriculum studies that have been done. This is a minor but important point: properly compared to the existing literature, the work will be more meaningful.

Author's response: Placing our study in the broader context of previous literature is important. We revised the introduction to reflect this point.

2. The statistical methods need to be better described (see below).

Author's response: We have revised the methods to provide additional detail.

3. Line 71: Probably should include these citations:
https://pubmed.ncbi.nlm.nih.gov/31125075/ Mellor TE, Junga Z, Ordway S, et al. Not Just Hocus POCUS: Implementation of a Point of Care Ultrasound Curriculum for Internal Medicine Trainees at a Large Residency Program. Mil Med. 2019;184(11-12):901-906. doi:10.1093/milmed/usz124
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6638610/ LoPresti CM, Schnobrich DJ, Dversdal RK, Schembri F. A road map for point-of-care ultrasound training in internal medicine residency. Ultrasound J. 2019;11(1):10. Published 2019 May 9. doi:10.1186/s13089-019-0124-9

Author's response: We have added the recommended references.

4. Line 116: How does your assessment relate to this framework? It does seem like it encompasses the domains mentioned (acquisition of knowledge, application of obtained knowledge, demonstration of technical competence, integration into clinical practice, re-certification; the 5th may not apply), but you might explicitly state that it does so. https://pubmed.ncbi.nlm.nih.gov/30924088/ Kumar A, Kugler J, Jensen T. Evaluation of Trainee Competency with Point-of-Care Ultrasonography (POCUS): a Conceptual Framework and Review of Existing Assessments. J Gen Intern Med. 2019;34(6):1025-1031. doi:10.1007/s11606-019-04945-4

Author's response: The conceptual framework and the Kumar et al. reference were added to the manuscript.

5.
Box 1: Typo in anatomy #5 – "in".

Author's response: We corrected the typo.

6. Line 131: Methods: Would like the statistical considerations to be clearer.

Author's response: We provided clarification of the statistical methods.

7. For repeated measures ANOVA this was presumably to compare the same study participants' results over time, and for the independent t tests this was mean test scores between groups. Would make this more explicit. Was there any correction for multiple comparisons?

Author's response: We have revised the statistical methods to provide clarity.

Statistical methods: We completed an analysis over time, examining baseline, immediate post-course, 6-month and 12-month scores. To determine which statistical test to run, we ran a Shapiro-Wilk test and a test of skewness, which indicated that some of the data were not normally distributed. To keep the statistical testing consistent, we ran all repeated-measures testing non-parametrically using the Friedman test. Analysis of separate test scores was performed via the Mann-Whitney U test. Results are presented as median (interquartile range). We applied a Bonferroni correction to account for the 14 tests that were run, giving a significance level of p <= 0.004.

8. Line 171: "Our competency based medical education study, used a learner" – no comma needed.

Author's response: We made the correction.

9. Line 186-187: Needs citation, and the relative % drop should be compared to other studies.

Author's response: We have added a citation (Kimura BJ, Sliman SM, Waalen J, Amundson SA, Shaw DJ. Retention of ultrasound skills and training in "point-of-care" cardiac ultrasound. J Am Soc Echocardiogr. 2016;29(10):992-7) for a study that evaluated knowledge retention of cardiac ultrasound at 1 year.
Since we did not collect data when our subjects were out of the study, it is not exactly comparing "like" groups to each other, but it does quantify the amount of knowledge decay in medicine residents who undergo POCUS education.

10. Line 198: The assessment used was not validated, as mentioned. This is unfortunate, especially as validated tools already exist. Besides the citation listed, also see https://pubmed.ncbi.nlm.nih.gov/22124000/.

Author's response: We appreciate the reviewer's comments and suggested citations. We were unaware of the Bahner et al. study prior to preparing the design for this study. The Bahner et al. 2011 reference has been added to the manuscript.

11. Line 199: "Use of the same written assessment exam at each time point may have contributed to the improvement in test scores." This needs to be further highlighted as a limitation and may actually invalidate the results over time. I don't see how else the scores increased from immediately after the intervention to 6 and 12 months. There is no statistical significance between groups, so it does not change the interpretation of results, but it was a flaw in the design. (It is a well-established phenomenon on repeat testing, i.e. MOCA.)

Author's response: We included the limitation of using a repeated exam for this study. We were aiming to assess knowledge retention. We did not provide residents with the answers after the exam, and all exams were numbered and returned to ensure no exams were missing.

12. Line 202: The conclusions are wanting. I think the key finding, that proctored scanning drove the observed difference in practical skill over time, is important.

Author's response: We agree that based on the results of this study, proctored scanning resulted in an observed difference in practical skills over time. We have articulated this in the conclusion.

13.
I think I would put it in context with this recent RCT (https://pubmed.ncbi.nlm.nih.gov/32118565/), which surprisingly found that machine access did not change retention. The major dogma has been that access to trained faculty and machines are the "barriers". In this study the direct faculty involvement seemed to have a large impact that cannot be replaced by independent practice.

Author's response: We appreciate this feedback and added the 2020 citation: Kumar A, Weng Y, Wang L, et al. Portable Ultrasound Device Usage and Learning Outcomes Among Internal Medicine Trainees: A Parallel-Group Randomized Trial [published online ahead of print, 2020 Feb 11]. J Hosp Med. 2020;15(2):e1-e6. doi:10.12788/jhm.3351. The Kumar study provided IM physicians with a handheld ultrasound device and lectures over one year; it did not include proctored scan acquisition with feedback. In our study of IM residents, we believe that the combination of lecture, hands-on experience and observed image acquisition with real-time feedback provided a good framework for knowledge and skill retention. We included these points in the discussion.

Reviewer #3: Thank you for the opportunity to review this manuscript. An ultrasound curriculum for internal medicine is an important topic, and the authors describe the value of additional scanning for learning thoracic ultrasound. The manuscript is well written and clearly presented. However, the results are only described briefly and original data are lacking. There is also a high risk of bias. I do have some major concerns.

Author's response: We appreciate the reviewer's comments. We have provided additional detail in the methods and results sections of the paper.

Introduction

14. Lines 70-72 lack references.

Author's response: Two references (Mellor et al. and LoPresti et al.) have been added to the introduction.

Methods

15. Line 79: How many residents were invited to participate?

Author's response: 36 residents were invited to participate in the study.
We included this information in the results section of the manuscript.

16. Line 81: What are the inclusion criteria?

Author's response: Subjects were included if they (1) were a PGY-1 or PGY-2 resident, (2) provided informed consent and (3) attended the one-day ultrasound training course. We included this information in the manuscript.

17. Line 82: How were they randomized?

Author's response: A stratified randomization was completed by PGY. Excel random numbers were used to determine which subject was assigned to which group.

18. Line 88: Please describe the duration of the thoracic lecture and hands-on scanning in the comprehensive course.

Author's response: The methods have been modified to reflect that one hour of scanning for thoracic ultrasound and a one-hour thoracic lecture were provided.

19. Line 99: Please describe the protocol for the supervised ultrasound scans in relation to the exam scans.

Author's response: The manuscript has been revised to reflect that the resident completed a comprehensive thoracic scan and the study investigator provided feedback in real time. Ten supervised scans were completed prior to the 6-month exam and another 10 prior to the 12-month exam. No predefined time interval was expected for scan completion, meaning that a subject could do more than one scan at a given time point to achieve the goal of 10 supervised scans in 6 months.

20. Line 101: When did the residents perform the scans? At which intervals? Or only immediately prior to the assessment?

Author's response: No predefined interval existed for when the scans were done, yet they were not done in bulk prior to the examination. Ten supervised scans were completed prior to the 6-month exam and another 10 prior to the 12-month exam.

21. Line 103: Why were scans reviewed, as they were directly supervised?

Author's response: We believed that by directly supervising the exam, the subjects learned more from the dedicated investigator, and real-time feedback improved skill retention.
The scans were not reviewed by all of the investigators. We clarified this sentence in the manuscript.

22. Line 105: Proficiency of whom?

Author's response: The proficiency of the instructors. The manuscript has been revised to reflect this.

Line 109: Did the control group perform any scans? How many? Who supervised them?

Author's response: We did not quantify the number of exams completed by the control group. A sentence has been included to indicate that both groups were able to perform scans unrelated to the study. Since we did not monitor scans performed by the control group, we are unable to comment on the number of scans or who, if anyone, supervised the subjects in the control group.

23. Line 113: Were assessment dates known in advance? It is well known that students may practice more and study prior to an exam.

Author's response: We appreciate these comments. We provided assessment dates in advance to assist with adherence to the study timelines. It is unknown whether subjects practiced or studied prior to the reassessment; no limitations were placed on studying prior to an exam. In our opinion, additional studying would not be viewed negatively, as it would be helpful to the residents.

24. Line 115-126: Please add the exam questions and scoring system to the manuscript. The reader is unable to interpret the results without knowing the scoring system and range.

Author's response: The exam questions and scoring system have been added to the manuscript as supplemental information. Of note, during the preparation of this table, we recognized that a written and a practical score at 6 months in the control group and two practical scores at 12 months should not have been included in the analyses. We made the correction, reanalyzed all data and modified the manuscript as appropriate.

25. Line 115: Was assessment of exams equally distributed between the 4 study investigators? For both groups?
How did you ensure uniform assessment between assessors?

Author's response: Assessment of the scans was not equally distributed, as the scans were completed on varying days and hours throughout the study period. Observation of scans was done based on the availability of the investigator. The investigators agreed on a uniform assessment prior to study start.

Results

26. Line 137: A baseline table is lacking, with demographics and especially prior ultrasound experience.

Author's response: We have added the sex of the participants alongside the PGY year, as no other demographic data were captured from the subjects. We are unable to report on prior ultrasound experience, as we did not capture this information.

27. Table: It would be interesting to provide an additional table with the mean scores for every item; this would tell why the scores are different.

Author's response: A table with the number of correct scores for each question has been added to the supplemental material.

Please provide confidence intervals in tables. Have you tested for normal distribution of scores? Is the mean applicable? There is a rather large SD.

Author's response: The tables have been revised to include 95% confidence intervals. An updated analysis examining medians (IQRs) instead of means is included. This change is reflected in the results section and within the tables.

Discussion

28. Line 173-175: I don't understand the phrase "at their own pace". If I understand correctly, both groups were able to perform ultrasound during the study period, but the intervention group made an additional 20 supervised scans.

Author's response: The manuscript has been modified to state that both groups were able to perform scans outside of the study requirements. We removed "at their own pace" and addressed how scans were performed prior to 6 months and 12 months in the methods section of the paper.

29.
Line 184: Did you compare scores within groups over time?

Author's response: We compared the groups over time, and the data are available in Figure 1.

30. Is the difference for the intervention group between 6 and 12 months significant?

Author's response: No. For the written exam, p = 0.08, and for the practical exam, p = 0.055. We added the following text to the manuscript: "There were no significant differences between the 6-month and 12-month scores in the assessment (written, p = 0.08; practical, p = 0.123)."

31. Line 187: Reference is lacking for this statement.

Author's response: A reference has been added along with context.

32. Line 190: How did the difference in randomization influence your results, as you mention this as a limitation?

Author's response: The randomization scheme was developed for 36 potential participants, stratified by program year. Since only 28 of 36 residents provided consent, the control and intervention groups were not equal with regard to PGY-1 and PGY-2 residents. We do not believe this influenced our results, yet we felt it was important to share the differences between the groups.

33. Line 193: Were exam scans equally distributed between study investigators? And for both groups?

Author's response: The scans were not equally distributed among study investigators. The manuscript has been updated to reflect this information.

34. Line 196: How many is "some"? In which groups? What part did they not complete?

Author's response: One subject in the control group withdrew consent from the study, and another was unable to complete the 6- and 12-month follow-up due to personal commitments. All subjects in the intervention group completed all exams required over the course of the year. The manuscript has been revised to reflect this information.

35. Line 197: What do you mean by external experts?
And did they review the exams taken by the intervention group, or only during development of the exams?

Author's response: Since we did not use validated questions, we used a variety of expert physicians to beta test the written and practical examinations. The external experts were regional and/or national leaders in the practice of POCUS. They reviewed only the development of the exams.

36. Line 199: Was the practical exam also performed the same at each time point? And how did it compare to the additional 20 exams?

Author's response: The practical exam was performed at the same time point as the written exam, and it served as the basis for the 20 additional exams in the intervention group.

37. How many ultrasounds did participants perform during the study period (beside the study protocol)? And for which applications?

Author's response: We did not track ultrasounds performed outside of the study protocol. Therefore, we are unable to provide information for this query.

Submitted filename: response to reviewers 1.docx

19 Oct 2020

PONE-D-20-21036R1
Skill Retention with Ultrasound Curricula
PLOS ONE

Dear Dr. Puri,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE's publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by Dec 03 2020 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org.
When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.

A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.

An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols

We look forward to receiving your revised manuscript.

Kind regards,
Ezio Lanza, M.D.
Academic Editor
PLOS ONE

Reviewer #2: On re-review, with the changes incorporated, the authors have made clear the design, findings and limitations and have better incorporated existing literature for comparison.
My only comments now are stylistic.

Minor suggestions:
Line 29, 74: "valuable, but optimal…"
Line 32: "from baseline" instead of "after baseline"
Line 34: This sentence reads awkwardly.
Line 80: How does this tie to skill retention?
Line 194: I might say "An alternative yearlong lecture-based model, even with a personal HUD, showed no improvement in knowledge or image interpretation score."

Reviewer #3: The authors have addressed my concerns. However, two issues remain.

A major limitation is that it is unknown how many scans both groups performed during the study besides the study protocol. This study supports that proctored scanning helps, but the size of the effect is unknown. If the control group made no scans, then 20 additional scans are useful in increasing skills. If both groups made an additional 30 ultrasounds, the most important explanation is that supervised scanning with bedside feedback matters most, not the amount.

It is still unclear to me how the practical assessment scores are computed. The supplement shows 23 points to be earned, but scores in the tables are up to 86? This needs to be explained in order to interpret the conclusions.

11 Nov 2020

Response to Reviewers

Reviewer #2: On re-review, with the changes incorporated, the authors have made clear the design, findings and limitations and have better incorporated existing literature for comparison. My only comments now are stylistic.

Minor suggestions:
Line 29, 74: "valuable, but optimal…"
Line 32: "from baseline" instead of "after baseline"
Line 34: This sentence reads awkwardly.
Line 80: How does this tie to skill retention?
Line 194: I might say "An alternative yearlong lecture-based model, even with a personal HUD, showed no improvement in knowledge or image interpretation score."

Authors' response: Thank you for your kind remarks, detailed review of the revision and stylistic suggestions. We have incorporated all of the suggestions in the revised manuscript.

Reviewer #3: The authors have addressed my concerns.
However, two issues remain.

A major limitation is that it is unknown how many scans both groups performed during the study besides the study protocol. This study supports that proctored scanning helps, but the size of the effect is unknown. If the control group made no scans, then 20 additional scans are useful in increasing skills. If both groups made an additional 30 ultrasounds, the most important explanation is that supervised scanning with bedside feedback matters most, not the amount.

Authors' response: We appreciate the reviewer's concerns and included this limitation in our previous manuscript. We did not consider tracking non-study-related POCUS exams in either group as part of the protocol. Based on previous literature in this area, we also noted that Mellor et al. did not capture scans performed in non-study-related activities and Kelm et al. did not account for ultrasound exposure during the follow-up period, listing this as a limitation. However, we do appreciate the reviewer's concerns and will consider this limitation in our future work in this area.

It is still unclear to me how the practical assessment scores are computed. The supplement shows 23 points to be earned, but scores in the tables are up to 86? This needs to be explained in order to interpret the conclusions.

Authors' response: The scores are reported as percentages (range 0-100%), based on the number of correct responses divided by the number of questions; for example, 20 correct responses out of 23 yields a score of 86.96%. We included a sentence in the abstract and the methods section of the paper to add this clarification.

Submitted filename: Response to Reviewers R2 10.21.2020.docx

16 Nov 2020

Skill Retention with Ultrasound Curricula
PONE-D-20-21036R2

Dear Dr.
Puri,

We're pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you'll receive an e-mail detailing the required amendments. When these have been addressed, you'll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they'll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,
Ezio Lanza, M.D.
Academic Editor
PLOS ONE

20 Nov 2020

PONE-D-20-21036R2
Skill retention with ultrasound curricula

Dear Dr. Puri:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication.
For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,
PLOS ONE Editorial Office Staff
on behalf of
Dr. Ezio Lanza
Academic Editor
PLOS ONE
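The statistical plan described in the authors' response above (percentage scoring, Friedman test across the four time points, Mann-Whitney U between groups, Bonferroni correction for 14 tests) can be sketched in Python with scipy. This is a minimal illustration only: all score values below are hypothetical, not the study's data.

```python
# Sketch of the analysis described in the authors' response.
# All scores below are hypothetical examples, not the study's data.
from scipy.stats import friedmanchisquare, mannwhitneyu

def pct(correct, total):
    """Scores are reported as % of correct responses (range 0-100%)."""
    return correct / total * 100

# e.g. 20 of 23 practical-exam points earned:
print(f"score = {pct(20, 23):.2f}%")

# Friedman test: repeated measures for the same subjects at 4 time points
# (hypothetical practical-exam scores for 6 residents)
baseline = [18, 25, 10, 30, 22, 15]
post     = [59, 65, 50, 70, 61, 55]
month_6  = [72, 80, 65, 85, 70, 68]
month_12 = [76, 85, 70, 88, 74, 72]
stat, p = friedmanchisquare(baseline, post, month_6, month_12)
print(f"Friedman chi-square = {stat:.2f}, p = {p:.4f}")

# Mann-Whitney U: control vs. intervention at a single time point
# (hypothetical 6-month scores)
control      = [63, 49, 70, 58, 65, 61]
intervention = [83, 73, 89, 78, 85, 80]
u, p_u = mannwhitneyu(control, intervention, alternative="two-sided")

# Bonferroni-adjusted threshold for 14 tests (~0.0036, given as 0.004
# in the manuscript)
alpha = 0.05 / 14
print(f"Mann-Whitney U = {u}, p = {p_u:.4f}, threshold = {alpha:.4f}")
```

Medians with interquartile ranges, as reported in the paper, would then be computed per group and time point (e.g. with `numpy.percentile`).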
Giovanni Volpicelli, Mahmoud Elbarbary, Michael Blaivas, Daniel A Lichtenstein, Gebhard Mathis, Andrew W Kirkpatrick, Lawrence Melniker, Luna Gargani, Vicki E Noble, Gabriele Via, Anthony Dean, James W Tsung, Gino Soldati, Roberto Copetti, Belaid Bouhemad, Angelika Reissig, Eustachio Agricola, Jean-Jacques Rouby, Charlotte Arbelot, Andrew Liteplo, Ashot Sargsyan, Fernando Silva, Richard Hoppmann, Raoul Breitkreutz, Armin Seibel, Luca Neri, Enrico Storti, Tomislav Petrovic. Intensive Care Med. 2012.
Benjamin K Johnson, David M Tierney, Terry K Rosborough, Kevin M Harris, Marc C Newell. J Clin Ultrasound. 2015.
Bruce J Kimura, Sean M Sliman, Jill Waalen, Stan A Amundson, David J Shaw. J Am Soc Echocardiogr. 2016.
Thomas E Mellor, Zachary Junga, Sarah Ordway, Timothy Hunter, William T Shimeall, Sarah Krajnik, Lisa Tibbs, Jeffrey Mikita, Joseph Zeman, Paul Clark. Mil Med. 2019.
Irene W Y Ma, Shane Arishenkoff, Jeffrey Wiseman, Janeve Desy, Jonathan Ailon, Leslie Martin, Mirek Otremba, Samantha Halman, Patrick Willemot, Marcus Blouw. J Gen Intern Med. 2017.
Scott J Millington, Robert T Arntfield, Robert Jie Guo, Seth Koenig, Pierre Kory, Vicki Noble, Haney Mallemat, Jordan R Schoenherr. Crit Ultrasound J. 2017.
Michael P Boniface, Scott A Helgeson, Jed C Cowdell, Leslie V Simon, Brett T Hiroto, Monia E Werlang, Sarah W Robison, Grace G Edwards, Michele D Lewis, Michael J Maniaci. Adv Med Educ Pract. 2019.
Catherine A Moore, Daniel W Ross, Kurtis A Pivert, Valerie J Lang, Stephen M Sozio, W Charles O'Neill. Clin J Am Soc Nephrol. 2022.