Literature DB >> 35509352

Evaluation of the Utility of Online Objective Structured Clinical Examination Conducted During the COVID-19 Pandemic.

Mona Arekat1, Mohamed Hany Shehata2,3, Abdelhalim Deifalla4,5, Ahmed Al-Ansari6, Archana Kumar6, Mohamed Alsenbesy1,7, Hamdi Alshenawi8, Amgad El-Agroudy1, Mariwan Husni9,10, Diaa Rizk11, Abdelaziz Elamin12, Afif Ben Salah2, Hani Atwa6,13.   

Abstract

Background: The COVID-19 pandemic imposed profound restrictions on face-to-face learning and assessment in all educational institutions, particularly medical schools. The College of Medicine and Medical Sciences of the Arabian Gulf University (CMMS-AGU) conducted the final exams for its MD students, both theoretical and clinical components, online. This study was conducted to evaluate the utility of the online clinical exams held at CMMS-AGU.
Methods: This is a cross-sectional, mixed-methods study that included samples of final-year medical students, examiners, and heads of clinical departments. Data were collected through surveys, structured interviews, document review, and calculation of the online examination's psychometrics. Descriptive statistics were used, and quantitative data were presented as means and standard deviations. Responses of heads of clinical departments in the structured interview were transcribed and analyzed thematically based on three pre-established themes.
Results: Quantitative and qualitative data on the utility (validity, reliability, acceptability, educational impact, and cost and feasibility) of online objective structured clinical examination (OSCE) were collected. Content validity of the online clinical examination was established through high mean scores of content representativeness, which was confirmed by the heads of clinical departments regarding the proper coverage of clinical skills. Criterion validity was established through a high correlation between clinical and theoretical exam results (r = 0.75). Reliability of the exam was established through an acceptable Cronbach's alpha value (0.70 to 0.78) over the four days of the examinations. The examinations were perceived as highly acceptable by both students and examiners. High educational impact was inferred from students' responses and review of documents. The examination was found to be feasible and of reasonable cost.
Conclusion: Online OSCE might be a good alternative to conventional clinical assessments in times of crisis, when in-person contact between students, examiners, and patients is impossible. A major drawback remains in such initiatives: the inability to assess students' physical examination skills.
© 2022 Arekat et al.

Keywords:  COVID-19; OSCE; exam utility; online clinical assessment

Year:  2022        PMID: 35509352      PMCID: PMC9060808          DOI: 10.2147/AMEP.S357229

Source DB:  PubMed          Journal:  Adv Med Educ Pract        ISSN: 1179-7258


Introduction

The coronavirus disease 2019 (COVID-19) outbreak swiftly transformed into an ongoing global pandemic, exerting profound restrictions on face-to-face learning in all educational institutions, particularly medical schools.1 Medical training was most disturbed because of the intrinsic nature of the program, which requires working in teams, close contact with patients, and inevitable communication with patients and their families.2,3 Not only was medical training disturbed, but clinical examinations were also disrupted. Medical schools were unsure whether to conduct the final-year clinical examinations for clerkship students or defer them until the COVID-19 pandemic resolved. It soon became apparent that the pandemic would continue for an indefinite duration, and many medical schools around the world made the bold decision to graduate their senior students with unconventional exams.4,5 This was also necessitated by the crucial need to speed up the graduation of more doctors and other health workers to bridge imminent deficiencies in the health-care sector during the ongoing pandemic.4–7 This motivated the College of Medicine and Medical Sciences, Arabian Gulf University (CMMS-AGU) to decide to conduct online exams for their final-year students.8 The theoretical component of the clinical courses was organized and remotely proctored through an online student assessment platform. For the clinical component, it was decided to implement ten online objective structured clinical examination (OSCE) stations. The Zoom Meetings® platform was used for its favorable features, including ease of use and the Breakout Rooms and Waiting Rooms features.
This experience was documented as a toolbox for conducting online OSCE in a publication by Shehata et al.9 In online clinical examinations, the real challenge is to redesign the exam to match the pedagogy of student assessment while keeping in mind important elements such as secure identity, academic integrity, and capacity building, as well as the fundamental utility features of student assessment, namely validity, reliability, acceptability, educational impact, and feasibility.10–12 Although a few studies capture the nuances of planning and implementing online OSCE examinations, information on the utility of such exams is scarce. This study aims to evaluate the utility of the online OSCE exams held at CMMS-AGU in terms of their validity, reliability, educational impact, acceptability, and cost and feasibility.

Subjects and Methods

Study Design

This was a cross-sectional, mixed-methods study conducted from June to August 2020. Triangulation was used to obtain different but complementary data on the same topic from different stakeholders.

Study Setting

The 6-year program at the CMMS-AGU is divided into 3 phases: Phase 1 (Year 1; Basic Sciences), Phase 2 (Years 2 to 4; Pre-clerkship Phase of Basic Medical Sciences), and Phase 3 (Years 5 and 6; Clerkship Phase of Clinical Sciences). At the end of Year 6, students sit for a comprehensive exit exam (MD exam) that is composed of a written component and a clinical component. Before the COVID-19 pandemic, the exams used to be conducted totally on campus at the prepared venues of the Arabian Gulf University and in the AGU affiliated hospitals.

Population and Sample Size

The target population was the students in Year 6 who sat for the MD exam (158 students). However, only 124 students (78.5%) responded to the survey. In addition, 60 full- and part-time faculty who were involved in the MD exam (40 examiners and 20 exam writers/coordinators) were included in the study.

Data Collection

Data were collected through three tools as follows.

Surveys

After the exam, evaluation data were collected through two short, semi-structured surveys designed by the authors: one for the students and another for the examiners. The surveys were developed and revised by all authors after reviewing the relevant literature and similar studies, and were then piloted on a few participants. The surveys employed 5-point Likert scales (Strongly Agree = 5, Agree = 4, Neutral = 3, Disagree = 2, and Strongly Disagree = 1).

Structured Interview for Heads of Clinical Departments

A structured interview was conducted with the heads of clinical departments (n = 6). The interview explored their viewpoints on three pre-established themes: (1) the extent of the online exam's ability to assess students' clinical skills, (2) the exam's ability to replace the in-person clinical exams, and (3) the extent of coverage of the clinical content of the rotations by the exam.

Document Review and Calculation of the Examination’s Psychometrics

The documents related to the planning and conduct of the clinical examinations were reviewed to obtain data on the preparation of the students and examiners and on the mock exams conducted. Validity and reliability studies were also conducted by calculating correlation coefficients and the internal consistency of exam results.
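The two psychometric computations referenced here, a criterion-validity correlation and internal consistency (Cronbach's alpha), can be sketched in plain Python. This is an illustrative implementation only; the mark lists and station-score matrix used in the test values are hypothetical, not the study's data.

```python
import math

def pearson_r(x, y):
    # Pearson correlation coefficient, e.g. between students' clinical
    # (OSCE) marks and their theoretical (written) exam marks.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def cronbach_alpha(scores):
    # Internal consistency of an OSCE: `scores` holds one row per student
    # and one column per station.
    k = len(scores[0])          # number of stations
    def var(v):                 # sample variance
        m = sum(v) / len(v)
        return sum((x - m) ** 2 for x in v) / (len(v) - 1)
    station_vars = [var([row[i] for row in scores]) for i in range(k)]
    total_var = var([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(station_vars) / total_var)
```

In practice, validated routines such as `scipy.stats.pearsonr` or the `pingouin` package's `cronbach_alpha` would typically be used rather than hand-rolled versions.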

Statistical Analysis

Quantitative data were analyzed using SPSS v.26. Simple descriptive statistics were used, and data were presented as means and standard deviations. Responses of heads of clinical departments in the structured interview were transcribed and analyzed thematically, based on three pre-established themes, by four of the authors (MA, MHK, HA, and AK).

Ethical Approval

Informed consent was obtained from all the participants after explaining the process in detail. All the data regarding OSCE stations of the online exam, students, and assessors were maintained confidential. The study was approved by the Research and Ethics Committee of CMMS-AGU (E002-PI-6/20).

The Exam

Preparatory Phase

The process of digital adaptation of the clinical exam was a complex task consisting of a series of interconnected steps, facilitated by teamwork and collaboration, as described below:
Institutional readiness: Availability of digital infrastructure, a functional e-learning unit, and dedicated faculty empowered all stakeholders to plan forward with confidence.
Getting everyone on board: The transition process needed the consensus of all stakeholders. It is recommended to remain transparent and spell out the expectations and limitations of online exams. Individual meetings were held with college leaders, the curriculum committee, department heads, the technical team, support staff, and student representatives.
Effective utilization of the infrastructure: A venue with all assessment rooms on the same corridor and floor was selected to facilitate logistics and communication. High-speed, stable internet connectivity (preferably LAN, Local Area Network), a large LCD screen for better monitoring of all stations by the exam team, an electric bell, a stopwatch, additional backup rooms, computers, web cameras, speakers, extension boxes, additional furniture, etc. were procured accordingly.
Selection of task force: Members were selected across all departments and units and grouped into three teams: faculty, secretarial staff, and technical support. Formal assignment was guaranteed by the college leadership to ensure the availability of staff on exam dates.
Capacity building: Taking into consideration the diversity of the team members and the required competencies of faculty and students, several training sessions with focused hands-on training and instructional videos were conducted. Complex protocols were translated into simple checklists, specific to each task (host, co-host, faculty invigilators, pretest invigilators, posttest invigilators, secretaries, assessment unit staff, technical support staff, and so on).
Customization of tool: An Educational Institutional License for Zoom® was purchased, and settings were customized to match the requirements of the online exams.
Mock exams: Multiple mock sessions were conducted to refine the checklists, streamline student allotment, recognize potential areas of troubleshooting, and identify students with poor connectivity or faulty gadgets.
Development of OSCE stations: A blueprint was used for the selection and development of stations. Simulated patients were used wherever needed. The OSCE stations focused on a range of clinical skills, such as history taking, decision-making, communication, interpretation, diagnostic skills, and patient management. Assessment of physical examination skills continued to be a challenge. Validated checklists were used by assessors.
Communication: Constant and clear communication was provided to all stakeholders through formal emails, assessors’ briefings, printed checklists, pre-/post-exam briefings, and informal WhatsApp groups.

Implementation Phase

Two hours before each exam, the technical team prepared the virtual panels (renaming breakout rooms, disabling live chat among students, enabling screen sharing, enabling video recording for examiners, checking audio/video, etc.). Each small cohort of students received the exam link sequentially, just 15 minutes before the scheduled time. Once all students were admitted to the meeting, they were transferred to a virtual pre-exam room for an identity check before being assigned to virtual exam panels. After the exam, the examined cohort was moved to a virtual post-test room to share their feedback while the new cohort (coming from the pre-exam virtual rooms) was assigned to the exam panels. The same cycle was repeated until all students were examined. For every two cohorts, the OSCE questions were changed to ensure the confidentiality of the exams. Comprehensive data on the schedule of the online OSCE exams (final MD - 2019) are shown in Box 1.
Box 1

Online OSCE Schedule

Date of Exam | Department | Exam Type | Total OSCE Stations | Viva Stations | Stations with SPs | Examiners | Students
7 and 9 July 2020 | All Clinical Departments | Mock Exam | 1 | 1 | 1 | 20 | 158
14 to 16 and 19 July 2020 | All Clinical Departments | Final MD Exam | 60 | 20 | 14 | 54 | 158

Abbreviations: OSCE, Objective Structured Clinical Examination; SPs, Simulated Patients; MD, Medical Doctor.


Potential Challenges and Feasible Solutions

Despite careful planning, a few challenges were encountered during the implementation of the online OSCE exams. The challenges, along with the strategies used to overcome them, are shown in Box 2.
Box 2

List of Challenges Encountered During Online OSCE Exams and Strategies Followed to Overcome Them

No. | Challenge | Strategy
1 | Possibility of wrong questions being shared with the students (since the same computers were used for multiple exams) | Delete the questions at the end of each exam
2 | Quality assurance | Record and save all the exams; allow the invigilator to visit all breakout rooms
3 | Unexpected absenteeism of task force | Well-planned backup support
4 | Unexpected technical failure | Backup speakers, cameras, computers, and rooms kept ready
5 | Error in naming the Zoom® display | Stick the correct panel number and the name of the examiner on the door for every exam; instruct the technical team to rename the Zoom® accordingly
6 | Students might leave Zoom® before the next cohort is admitted | Communicate the process beforehand to all students and share it in the form of an official circular
7 | Unavoidable delay due to technical reasons | Communicate to students that they should be available if called early and prepared to wait in case of unavoidable delay
8 | Error in assigning the students to the correct panels of breakout rooms | Print the sequence of allotment with names and pictures
9 | Unexpected disappearance/absenteeism/“failure to login” by students | Policy decision in agreement with HODs/Vice-Deans/Deans
10 | Suspected behaviors/untoward incidents | Document in an “Incident Report Form” and submit to the assessment unit

Abbreviations: OSCE, Objective Structured Clinical Examination; HODs, Heads of Departments.


Results

Demographic Characteristics

Most of the students who participated in the study (60%) were Bahraini nationals, followed by Kuwaitis (21%) and Saudis (13%), while other nationalities constituted the remaining small percentage (Figure 1).
Figure 1

Demographic distribution of the students (n = 124).

Regarding the examiners, most were male (70%). The examiners came from six clinical departments, with equal percentages (20%) from each of the Internal Medicine, Surgery, Pediatrics, and Obstetrics and Gynecology departments, and smaller percentages (10%) from each of the Family and Community Medicine and Psychiatry departments. The examiners’ teaching experience ranged from one year to more than thirty years (Table 1).
Table 1

Demographic Profile of Examiners (n = 40)

Parameter | Number (%)
Gender:
 Male | 28 (70%)
 Female | 12 (30%)
Department:
 Internal Medicine | 8 (20%)
 Surgery | 8 (20%)
 Pediatrics | 8 (20%)
 Obstetrics and Gynecology | 8 (20%)
 Family and Community Medicine | 4 (10%)
 Psychiatry | 4 (10%)
Teaching Experience (Years):
 1 to 10 | 14 (35%)
 11 to 20 | 13 (32.5%)
 21 to 30 | 11 (27.5%)
 31 to 40 | 2 (5%)

Utility of Online OSCE

To assess the utility of online OSCE, quantitative and qualitative data were collected on five criteria of utility, namely, validity, reliability, acceptability, educational impact, and cost and feasibility.

Validity

Content Validity

Content validity was established through a short survey that targeted the examiners. The results of that survey are presented in Table 2.
Table 2

Means and Standard Deviations of Examiners’ Responses to Items Evaluating the Content of the Online Clinical Exam*

No. | Statement | Mean (±SD) | Range
1 | The exam reflects real practice | 4.50 (±0.85) | 1–5
2 | The exam samples the curriculum well | 4.55 (±0.64) | 3–5
3 | Evaluation form is well-structured and easy to use | 4.60 (±0.59) | 3–5
4 | This type of exam can assess clinical skills of students other than physical examination | 4.20 (±0.88) | 2–5

Note: * Table 2 is a complementary part of Table 5 (below) that was separated for organizational purposes.

Abbreviation: SD, Standard Deviation.

Table 5

Means and Standard Deviations of Examiners’ Responses to Items Evaluating the Online Clinical Exam Experience

No. | Item | Mean (±SD) | Range
Pre-Assessment Preparation:
1 | I was clearly informed about the rules, regulations, and instructions of online clinical exams | 4.80 (±0.52) | 3–5
2 | Mock online clinical exams prepared me sufficiently for the real exam | 4.68 (±0.66) | 2–5
3 | I am satisfied with the communication process with CMMS-AGU before and during the exam | 4.68 (±0.69) | 2–5
Exam Process:
4 | Briefing done before the exam was helpful | 4.75 (±0.59) | 2–5
5 | The technical support personnel at the exam rooms were adequate | 4.83 (±0.45) | 3–5
6 | Physical distance instructions were respected at the exam rooms | 4.25 (±1.13) | 1–5
7 | The exam material shown during the online clinical exam (images/case scenarios/role players) was appropriate | 4.63 (±0.71) | 2–5
8 | Duration of the whole exam was suitable | 4.65 (±0.86) | 1–5
Overall Satisfaction:
9 | I am satisfied with the overall organization and implementation of the CMMS-AGU online clinical exam | 4.68 (±0.57) | 3–5

Abbreviations: CMMS-AGU, College of Medicine and Medical Sciences, Arabian Gulf University; SD, Standard Deviation.

Examiners participating in the online clinical exam highly valued the representativeness of the exam and the degree to which it reflected real practice. They thought that the exam properly sampled the content of the curriculum. However, the lowest mean score was for the ability of the online OSCE to assess clinical skills other than physical examination (4.20±0.88). The heads of clinical departments expressed their enthusiasm for and understanding of the new online exam. They agreed on its effectiveness in evaluating communication skills, content-specific knowledge, and history-taking skills as a partial reflection of clinical skills, while acknowledging that the process is justifiably not a perfect replacement for in-person assessment of psychomotor skills. Moreover, the heads of clinical departments, except Psychiatry, estimated the percentage of clinical skills covered by the online exam at 70–80%. The head of the Psychiatry department argued that the exam covered more than 90% of the content, owing to the uniqueness of the mental state examination, which relies on communication skills and does not depend on the physical examination skills requiring in-person contact between student and patient (which is sorely missed in other specialties).
Overall, the heads of clinical departments concluded that the online OSCE exam covered a broad range of course content and included a representative array of common problems (Table 3).
Table 3

Qualitative Analysis of the Structured Interviews with the Heads of Clinical Departments

Theme 1: The extent of the online exam’s ability to assess students’ clinical skills.
 Subthemes: Online OSCE as an emergency replacement; scope of skills that can be assessed by online OSCE.
 Representative quotes: “This experience was positive in assessing multiple clinical skills distantly. However, assessing physical examination skills was not possible.” “I believe that clinical reasoning, history taking, and communication skills were all assessed at a convenient level, especially in stations that included simulated patients.”
Theme 2: The exam’s ability to replace the in-person clinical exams.
 Subthemes: Temporary rather than permanent replacement; trust of students and schools in online OSCE as a replacement.
 Representative quotes: “Online OSCE is not a perfect replacement of traditional OSCE except in such emergency situations.” “Based on their informal feedback, students were in favor of this form of clinical examination as it considered their safety.”
Theme 3: The extent of coverage of the clinical content of the rotations by the exam.
 Subthemes: Range of coverage of common health problems; matching between clinical content taught and assessed.
 Representative quotes: “Psychiatry is a unique clinical specialty that relies mainly on communication skills, content-specific knowledge, and mental state examinations. Almost all these skills (more than 90%) are possible to perform online. However, assessment of disorders that require in-person clinical interactions, like psychomotor retardation/agitation, tension, consistent low mood, fine tremors … etc., are not possibly conducted remotely.” “In our online clinical examination, we covered around 70–80% of the curriculum-aligned topics.”

Abbreviation: OSCE, Objective Structured Clinical Examination.

Criterion Validity

The online OSCE correlated strongly with the online written assessment of the same batch of students (r = 0.75). However, it correlated poorly with the previous conventional clinical assessment marks of the same batch (r = 0.27).

Reliability (Internal Consistency)

Across the four days of clinical exams, the Cronbach’s alpha of students’ marks in the OSCE stations ranged between 0.70 and 0.78, with a standard error of measurement ranging between 2.69 and 2.99 at a 95% confidence level.
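The reported standard error of measurement (SEM) follows from reliability via the standard relation SEM = SD × sqrt(1 − α), and an approximate 95% confidence band around an observed score is ±1.96 × SEM. A minimal sketch of that relationship, assuming a hypothetical score SD (the paper does not report station-score SDs):

```python
import math

def sem(sd, alpha):
    # Standard error of measurement from the score SD and Cronbach's alpha.
    return sd * math.sqrt(1 - alpha)

def ci95(observed, sd, alpha):
    # Approximate 95% confidence interval for a student's true score
    # around an observed score.
    margin = 1.96 * sem(sd, alpha)
    return observed - margin, observed + margin

# Hypothetical example: a score SD of 5.2 marks at the study's lowest
# reported reliability (alpha = 0.70) gives SEM ~ 2.85, consistent with
# the reported 2.69-2.99 range.
```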

Acceptability

The acceptability of the exam among medical students and examiners was explored through two survey questionnaires distributed just after the exam: one for the students and the other for the examiners. The results of the surveys are presented in Tables 4 and 5.
Table 4

Means and Standard Deviations of Students’ Responses to Items Evaluating the Online Clinical Exam Experience

No. | Item | Mean (±SD) | Range
1 | I was clearly informed about the rules, regulations, and instructions of online clinical exams | 4.93 (±0.32) | 3–5
2 | The mock online clinical exams prepared me on what to expect in the real exams | 4.39 (±0.72) | 2–5
3 | The communication process with CMMS-AGU before and during the exam was efficient | 4.85 (±0.40) | 3–5
4 | The transition between pre-exam, exam rooms, and post-exam room was smooth and quick | 4.63 (±0.82) | 1–5
5 | Duration of the sessions and format of the online exams matched my expectations | 4.67 (±0.67) | 1–5
6 | Examiners and invigilators handled the exam professionally | 4.69 (±0.69) | 1–5
7 | The exam material shown during the online clinical exam (images/case scenarios/role players) was appropriate | 4.42 (±0.79) | 2–5
8 | Except for physical examination skills, the online exam was a suitable method to assess my clinical skills | 4.60 (±0.84) | 1–5
9 | I am satisfied with the overall organization and implementation of the online clinical exams | 4.70 (±0.58) | 2–5

Abbreviations: CMMS-AGU, College of Medicine and Medical Sciences, Arabian Gulf University; SD, Standard Deviation.

Table 4 shows high mean scores for student responses on all evaluation items. The highest mean score was for the item on informing students about the rules, regulations, and instructions of online clinical exams (4.93±0.32). The lowest mean score was for the item on the benefit of the mock exam in preparing students for the real exam (4.39±0.72). Importantly, the students tended to believe that the online exam was a suitable method to assess their clinical skills other than physical examination skills (4.60±0.84). Overall, the students were highly satisfied with the overall organization and implementation of the online clinical exams (4.70±0.58).
Table 5 shows high mean scores for all evaluation items. The highest mean score was for the item on the adequacy of technical support (4.83±0.45), while the item on respecting physical distancing instructions had the lowest mean score (4.25±1.13). Overall, the examiners were highly satisfied with the organization and implementation of the online clinical exams (4.68±0.57).

Educational Impact

Because the students were oriented to the purpose of this clinical assessment, it positively affected their approach to learning by focusing them on achieving the required clinical competencies, either in clinical training or while preparing for the online OSCE. Students revealed that having multiple mock exams using the same method, together with the orientation sessions conducted by the clinical departments, enlightened them about the expected level of performance. Students focused on the approach to common clinical presentations, interpretation of lab and radiology findings, and using communication skills for gathering data and providing explanations to role players.

Cost and Feasibility

The cost and feasibility of the online OSCE were acceptable, as the planning team managed to use the available resources to conduct the exams. Preparation for the online exam consumed two months of almost daily work by the planning team, heads of clinical departments, and case writers. Additional time was mainly spent on arranging the venue for the online exam and, most importantly, training the examiners on using the technology to conduct the online OSCE. The cost could be estimated by converting the total faculty time expenditure into US Dollars (USD), which was around 50,000 USD. Other costs were the purchase of four laptop computers for meeting hosts (6000 USD) and an annual educational subscription to an online meeting platform (3600 USD).

Discussion

The CMMS-AGU clinical departments held online OSCE MD exams during the COVID-19 lockdown. The OSCE stations covered various competencies, including history taking, interpretation of investigation results, clinical communication, patient management, and clinical reasoning. Each student had to pass through two stations in each of Internal Medicine, Surgery, Pediatrics, and Obstetrics and Gynecology, as well as one station in each of Psychiatry and Family Medicine. This study aimed to evaluate the utility of the CMMS-AGU online MD OSCE exam. A diverse sample of stakeholders in this exam was surveyed, including students, examiners, medical education experts, and heads of clinical departments. Psychometrics of the exam, along with data from the final exam report, were also used to complement the evaluation on the five criteria of utility, namely: validity, reliability, acceptability, educational impact, and cost and feasibility. The results are discussed here under five sections representing those five criteria.
Medical education experts and representatives from all clinical departments reported that almost all the targeted important clinical competencies were tested in this online OSCE, except for physical examination skills. This was validated by the input of the heads of clinical departments, who confirmed the proper coverage of most clinical skills by the online OSCE. Similar results were reported by Shaban et al,13 who found high content and face validity of their online exams, which covered the most important clinical competencies except physical examination. Furthermore, the criterion validity study of the CMMS exam supports online OSCE as a valid tool for clinical assessment, showing good correlation with the online written assessment of the same batch of students.
Similar results were reported by Hamdy et al,14 who found high convergent validity between their online clinical examination tool and another previously validated tool measuring the same clinical competencies. However, a weak correlation was detected between students’ results in the online OSCE and their previous conventional clinical assessment. This is congruent with the results reported by Hasani et al15 in a similar study. One explanation for this weak correlation might be the difference in the clinical assessment methods used. Cronbach’s alpha levels indicated an acceptable level of reliability for this online OSCE. Similar results were reported by Shaban et al,13 who found comparable Cronbach’s alpha values between online and conventional OSCE at their institution. Furthermore, Felthun et al16 reported in their systematic scoping review that teleOSCE (online OSCE) could improve the reproducibility of clinical assessments. Students, examiners, and heads of clinical departments indicated that online OSCE is an acceptable tool for assessing clinical skills other than physical examination skills. In a similar study in Saudi Arabia by Shaiba et al,17 an electronic OSCE (e-OSCE) was conducted in the Pediatrics rotation, and most respondents were very comfortable with the new virtual experience. More than half of the participants in that study even preferred the e-OSCE to the classic face-to-face clinical OSCE during the pandemic, which is also supported by a similar study by Elnaem et al.18 In another study, Shaban et al13 likewise found that implementing online OSCE was acceptable to students and faculty members. This was also highlighted by Palmer et al19 in a pre-COVID study on the feasibility and acceptability of an online OSCE, as well as by Kakadia et al20 in a study on implementing online OSCE during the COVID-19 pandemic.
Furthermore, Hamdy et al14 reported high satisfaction among both medical students and faculty members with the model of virtual clinical encounter examination they designed and implemented as an alternative to traditional OSCE. This online OSCE experience had a high educational impact. By orienting the students to the purpose of this online assessment and the specific clinical competencies they needed to achieve, the experience positively affected their approach to learning and guided them toward paying more attention to the important components of common clinical presentations (such as interpretation of lab and radiological findings, and using communication skills in eliciting and providing information to simulated patients). This effect was mainly obtained through the multiple mock exams and orientation sessions conducted prior to the final MD exams. This is congruent with the results of a study by Ganesananthan et al,21 who reported that the overall level of performance and learning of their students was positively affected by the mock exam conducted prior to the final assessment. We think an added positive educational impact of this experience is training students to use technology in learning and, later after graduation, in health care. This is supported by the findings of Felthun et al16 in their systematic scoping review, which reported that teleOSCE could equip students with the requisite skills for practicing telemedicine in the future. Given the tangible and intangible resources available on the campus of the institution, the online OSCE conducted for the whole batch of students was affordable. The available human resources (faculty and assistant staff), campus offices, internet line, and computers were almost sufficient for successfully conducting the exam. Only a few computers and an annual educational subscription to an online meeting platform were needed to complete the online exam process.
This is congruent with the results of a study by Shaban et al,13 who reported feasibility, cost-effectiveness, practicality, and local availability of the needed resources within the institution. They reported that the only drawback was the inability to assess students’ physical examination skills, an understandable and unavoidable limitation of online clinical assessment in general, and the same drawback we encountered in our initiative. To partially compensate for it, our examiners asked the students to describe what they would examine, and how, if they had in-person contact with the patients, as was done by Luke et al22 in their trial of virtual clinical competence evaluation in nursing.

Conclusion

The stakeholders’ evaluation of the online OSCE at CMMS-AGU was reassuring across all five components of utility (validity, reliability, acceptability, educational impact, and cost and feasibility). This indicates that the online OSCE might be a good alternative to conventional clinical assessments in times of crisis, when in-person contact between students, examiners, and patients is impossible. However, a major drawback of the online OSCE is its inability to assess students’ physical examination skills, which require physical contact between students and patients. We believe that future advances in haptic technology, as well as virtual and augmented reality, may provide a reasonable alternative for assessing physical examination skills, allowing students to remotely touch and feel patients in order to examine them physically.
References (17 in total)

1.  The feasibility and acceptability of administering a telemedicine objective structured clinical exam as a solution for providing equivalent education to remote and rural learners.

Authors:  Ryan T Palmer; Frances E Biagioli; Jasminka Mujcic; Benjamin N Schneider; LeNeva Spires; Lisa G Dodson
Journal:  Rural Remote Health       Date:  2015-12-03       Impact factor: 1.759

2.  COVID-19 and medical education.

Authors:  Hanad Ahmed; Mohammed Allaf; Hussein Elghazaly
Journal:  Lancet Infect Dis       Date:  2020-03-23       Impact factor: 25.071

3.  (Review) Assessment methods and the validity and reliability of measurement tools in online objective structured clinical examinations: a systematic scoping review.

Authors:  Jonathan Zachary Felthun; Silas Taylor; Boaz Shulruf; Digby Wigram Allen
Journal:  J Educ Eval Health Prof       Date:  2021-06-01

4.  Malaysian pharmacy students' perspectives on the virtual objective structured clinical examination during the coronavirus disease 2019 pandemic.

Authors:  Mohamed Hassan Elnaem; Muhammad Eid Akkawi; Nor Ilyani Mohamed Nazar; Norny Syafinaz Ab Rahman; Mohamad Haniki Nik Mohamed
Journal:  J Educ Eval Health Prof       Date:  2021-04-12

5.  Virtual Clinical Encounter Examination (VICEE): A novel approach for assessing medical students' non-psychomotor clinical competency.

Authors:  Hossam Hamdy; Jayadevan Sreedharan; Jerome I Rotgans; Nabil Zary; Sola Aoun Bahous; Manda Venkatramana; Elsayed AbdelFattah Elzayat; Pankaj Lamba; Suraj K Sebastian; Noha Kamal Abdel Momen
Journal:  Med Teach       Date:  2021-06-15       Impact factor: 3.650

6.  Virtual Evaluation of Clinical Competence in Nurse Practitioner Students.

Authors:  Sheba Luke; Elizabeth Petitt; Josie Tombrella; Erin McGoff
Journal:  Med Sci Educ       Date:  2021-05-24

7.  Pandemics and Their Impact on Medical Training: Lessons From Singapore.

Authors:  Zhen Chang Liang; Shirley Beng Suat Ooi; Wilson Wang
Journal:  Acad Med       Date:  2020-09       Impact factor: 7.840

8.  Evaluation of Curricular Adaptations Using Digital Transformation in a Medical School in Arabian Gulf during the COVID-19 Pandemic.

Authors:  Archana Prabu Kumar; Ahmed Mohammed Al Ansari; Mohamed Hany Kamel Shehata; Yasin Ibrahimm Yousif Tayem; Mona Rushdi Khalil Arekat; Adel Abdulrahim Mohammed Kamal; Abdelhalim Deifalla; Khaled Saeed Tabbara
Journal:  J Microsc Ultrastruct       Date:  2020-12-10

9.  Comparison of electronic versus conventional assessment methods in ophthalmology residents; a learner assessment scholarship study.

Authors:  Hamidreza Hasani; Mehrnoosh Khoshnoodifar; Armin Khavandegar; Soleyman Ahmadi; Saba Alijani; Aidin Mobedi; Shaghayegh Tarani; Benyamin Vafadar; Ramin Tajbakhsh; Mehdi Rezaei; Soraya Parvari; Sara Shamsoddini; David I Silbert
Journal:  BMC Med Educ       Date:  2021-06-13       Impact factor: 2.463

