Ryan Joseph1, Jesse Fenton1, David Winchester2,3. 1. University of Florida College of Medicine, Gainesville, USA. 2. Division of Cardiovascular Medicine, Department of Medicine, College of Medicine, University of Florida, 1600 SW Archer Road, Box 100277, Gainesville, FL 32610-0277 USA. 3. Malcolm Randall Veterans Affairs Medical Center, Gainesville, USA.
Electrocardiography (ECG) interpretation is an important skill for medical practice. US medical schools typically incorporate didactics on ECG interpretation during the first 2 years of medical school. The ECGs used in these didactics tend to show simple, unambiguous diagnoses and differ from the more challenging tracings routinely encountered in clinical care. Lectures tend to focus on the concepts behind ECGs, with less time devoted to the practice of ECG interpretation [1]. As a result, students often lack confidence in their ability to interpret ECGs systematically [2]. Students gain more exposure to ECG interpretation during clinical rotations, but this exposure varies from student to student owing to the random sample of patients each student sees, the pathologies they encounter, and the medical teams they work with (who may or may not provide bedside teaching on ECG findings). It is no surprise that many medical students graduate without being able to reliably recognize urgent ECG findings [3].

Prior studies have shown that a combination of lectures and online ECG practice with feedback leads to better ECG competence than lectures alone [4]. Curricula using online ECG modules or other web-based ECG delivery systems have improved the ECG comprehension and confidence of their users [5-7]. Indeed, depending on the content, online teaching for medical students may be superior to in-person teaching in both efficacy and the related costs/resources for teaching [8-10]. However, online curricula typically rely on pre-recorded videos to explain the rationale behind the solutions to assigned ECGs and do not give users a regular, real-time outlet to ask subject matter experts questions or request clarification of difficult concepts. The COVID-19 pandemic presented an additional challenge for students, with most schools removing “non-essential” staff from clinical settings.
To provide students with a clinical learning opportunity during COVID-19, we created a 4-week, module-based ECG online course for 4th-year medical students. We designed it using a combination of pre-recorded videos and a live, online, weekly meeting with a faculty cardiologist. After the course was complete, we conducted a quasi-experimental analysis of the test scores and the course evaluations to determine the effectiveness of the course for enhancing student knowledge and confidence for interpreting ECGs.
Materials and Methods
This online, module-based course provided a self-paced ECG curriculum for a medical student elective. We implemented the 4-week course during November 2020 for twenty fourth-year medical students enrolled at the University of Florida College of Medicine. The course supplemented the basic ECG training the students received during their first-year didactics (covering ECG interpretation and typical findings for commonly tested cardiac conditions) and any sporadic ECG training they might have received during 3rd-year clerkships. We distributed the course through the Canvas platform (Instructure, Salt Lake City, UT). Data on student performance were used retrospectively. We requested permission to use the results of the curriculum for a scholarly publication; our institutional review board reviewed the request and waived the requirement for informed consent.

The course work required approximately 60 hours per student. We dedicated 8 hours in total to the discussion component; the remaining hours were independent study through weekly ECG assignments. The students started the course by taking a pre-test consisting of 10 ECGs tied to clinical scenarios. ECG tracings were selected to represent a wide variety of pathologies, including ischemia/infarction, tachyarrhythmias, heart block/bradycardia, and electrolyte disturbances. The ECGs were a combination of publicly available content from the internet and deidentified tracings from our home institutions. We graded each of the 10 ECGs on a scale from 1 to 10, for a total of 100 points, based on a predetermined scoring system awarding points for each element of the ECG. The pre-test also asked the students to rate their confidence in their ECG interpretation abilities on a 10-point scale. After the pre-test, the students watched three lecture videos covering the basics of ECG interpretation (total ~1 hour).
These videos presented an organized approach to reading ECGs and showed how to identify common ECG abnormalities (such as blocks, ischemia, hypertrophy, and heart rhythm irregularities). The students then completed 11 online modules independently. Each online module contained 10–12 ECGs to interpret (with no clinical context given). Certain modules emphasized abnormalities in specific parts of the ECG (such as ST changes or QRS abnormalities) while including unrelated ECGs to test overall knowledge. The students reviewed and interpreted each ECG using a consistent method (rate, rhythm, axis, intervals, blocks, ischemia, other findings, and final interpretation). After the students submitted their interpretations, they immediately received the answers in text, along with a link to a video, created by our faculty cardiologist, explaining the correct findings for each ECG. At the end of each week, we met with the students via an online meeting platform to answer questions and clarify difficulties on the assignments based on their interpretations. At the end of the 4 weeks, the students took a final exam identical to the pre-test, graded on the same 1-to-10 scale and 100-point total. The students were not informed that the pre- and post-test would contain the same tracings, and they were not given the correct interpretations after the pre-test, to minimize the risk of carrying that information over to the post-test. The final exam also asked the students to rate their confidence in their ECG interpretation abilities on a 10-point scale (with “1” signifying not confident and “10” signifying highly confident).

Our curriculum requires a physician educator proficient in ECG interpretation, an online content-management platform, and an online meeting platform.
The students did not need to provide or obtain any other resources but were welcome to use outside study materials as they desired.

Two teaching assistants independently scored each student's pre-test and final exam ECG interpretations using the same 1-to-10-point scale; the two scores for each answer were then averaged. Disagreements were adjudicated by the course director. We compared the median scores (the data were not normally distributed) between the pre-test and final exam using a Wilcoxon signed-rank test. We also compared the pre-test and final exam self-reported ECG interpretation confidence levels using a Wilcoxon signed-rank test. Both analyses used alpha < 0.05 as the threshold for a significant finding. Analyses were conducted using SPSS version 21 (IBM, Armonk, NY). Formal sample selection was not performed, as these were the measured results of an elective course taken by fourth-year students; no a priori power calculations were performed.
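For readers replicating the analysis without SPSS, the paired comparison above can be sketched in pure Python. This is a minimal sketch, not the study's actual computation: the scores below are hypothetical, and the p-value uses the large-sample normal approximation to the Wilcoxon null distribution rather than an exact method.

```python
import math

def wilcoxon_signed_rank(pre, post):
    """Paired Wilcoxon signed-rank test (normal approximation).

    Returns (W_plus, two_sided_p), where W_plus is the sum of ranks of
    positive differences (post - pre); zero differences are dropped.
    """
    diffs = [b - a for a, b in zip(pre, post) if b != a]
    n = len(diffs)
    # Rank the absolute differences, averaging ranks for ties.
    abs_sorted = sorted(abs(d) for d in diffs)
    ranks = {}
    i = 0
    while i < n:
        j = i
        while j < n and abs_sorted[j] == abs_sorted[i]:
            j += 1
        # Tied values at positions i+1..j share the average rank.
        ranks[abs_sorted[i]] = (i + 1 + j) / 2
        i = j
    w_plus = sum(ranks[abs(d)] for d in diffs if d > 0)
    # Normal approximation to the null distribution of W_plus.
    mean = n * (n + 1) / 4
    var = n * (n + 1) * (2 * n + 1) / 24
    z = (w_plus - mean) / math.sqrt(var)
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return w_plus, p

# Hypothetical paired exam scores (not the study's data):
pre = [76, 80, 70, 82, 65, 90, 74, 78]
post = [87, 88, 78, 85, 80, 92, 81, 86]
W, p = wilcoxon_signed_rank(pre, post)
```

Because the test ranks only the signs and magnitudes of paired differences, it does not assume normally distributed scores, which is why it suits the skewed exam data described above.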
Results
Live sessions for the course were held via remote videoconferencing due to restrictions on in-person gathering during the COVID-19 pandemic. Attendance was not required or formally tracked but was approximately 90% of the enrolled students for each session. The students voluntarily displayed video and gave voice responses to questions as ECGs were reviewed by Dr. Winchester.

Among the twenty student participants, the median score on the pre-test was 76.5; on the final exam, the median score increased to 87.5 (p < 0.0001). Eighteen students scored higher on the final exam than on the pre-test, and 11 of them improved by 10 points or more. Two students scored lower on the final exam; the largest decrease was 6 points. The median of the students’ self-confidence scores was 3 before the course and 7.5 afterward (p < 0.0001).

Based on the end-of-course evaluations, we found that the course was both well liked and effective. One student stated, “There is really nothing I would change about this course,” and another noted, “This has probably been one of the most useful courses I’ve taken in medical school.” No negative comments were provided; because course assessments are anonymous, we do not know whether the students whose scores decreased on the final exam had a positive or negative experience with the course.
Discussion
The purpose of our course was to give fourth-year medical students a self-directed, asynchronous way to strengthen their ECG knowledge and confidence prior to residency. Based on knowledge scores, confidence scores, and the narrative comments, we found that the course enhanced both.

Our results are similar to those of other courses using online ECG modules, which have been effective at improving learners’ ECG competence. Other studies implemented online ECG curricula for pediatric emergency medicine fellows, pediatric residents, and internal medicine residents and showed statistically significant improvements in the learners’ ECG reading ability [5-7]. A study at Duke University School of Medicine also showed significant improvement in ECG interpretation and confidence for its internal medicine clerkship students using an ECG module system, compared with the prior year when it was not implemented [1]. Other studies of online ECG education have shown it to be as effective as, if not better than, traditional classroom-based instruction, even without the face-to-face interaction through an online meeting platform that our course utilized [11, 12].

In general, online education has been demonstrated to be effective, with decreased costs and increased satisfaction compared with traditional classroom lecture methods [8]. Multiple studies have shown online medical education to be more effective than the traditional classroom learning model [9, 10, 13]. Online teaching methods have also been applied to other areas of medical education that require skill-based learning; for example, a virtual introductory radiology elective created for medical students in response to the COVID-19 pandemic proved effective in improving their radiology reading ability and confidence [14].
With all of this in mind, it would not be surprising if online teaching becomes more commonplace in medical education in the coming decades.

One limitation of our course is our inability to evaluate students above level 2 of Kirkpatrick’s Four Levels of Evaluation [15]. Measuring students’ ECG interpretation performance on clinical rotations after taking this course would be one way to evaluate level 3; however, many students had finished their clinical rotations by the time they completed the course, and such an evaluation would have been infeasible during the COVID-19 pandemic. A likely future direction would be to extend this course to a larger population, such as internal medicine residents or learners in other countries through an open online course provider. Another limitation is that the pre-test and final exam used the same questions; to mitigate the chance of question recognition, they were spaced 4 weeks apart, and we returned only the scores for the pre-test rather than revealing the correct answers. The course was an elective, so we cannot control for students who self-selected as high performers in ECG interpretation.

In conclusion, our online curriculum effectively enhanced student skill and confidence in interpreting ECGs. The course could be easily replicated at other sites. If formal grades are required, scaling the grading of pre- and post-tests could be difficult; if used as a self-directed learning tool, the course would be simple to extend to other groups of learners.
Conclusion
As a result of this course, the students showed a significant improvement in ECG interpretation scores and self-confidence.