Innovative Method to Digitize a Web-Based OSCE Evaluation System for Medical Students: A Cross-Sectional Study in University Hospital in Saudi Arabia.

Abdullah A Yousef1,2, Bassam H Awary1, Faisal O AlQurashi1, Waleed H Albuali1, Mohammad H Al-Qahtani1, Syed I Husain2, Omair Sharif2.   

Abstract

PURPOSE: The Objective Structured Clinical Examination (OSCE) is a standard academic assessment tool in the field of medical education. This study presents an innovative method for digitizing the OSCE evaluation system for medical students and explores its efficacy compared to the traditional paper-based system through the analysis of a User Satisfaction Survey.
METHODS: This was a cross-sectional, questionnaire-based study using a User Satisfaction Survey to evaluate assessors' attitudes toward and acceptance of the Computerized Web-based OSCE Evaluation System (COES). Fifth-year medical students at a College of Medicine were assessed clinically through their 2019 end-of-year OSCE examination by 30 examiners in five different OSCE stations. The traditional paper-based stations were converted into an online electronic version using QuestionPro software. Answers were filled in using smart tablets (iPads). QR codes were used for student identification at each station to fully digitize the process and save time. After the completion of the exam, a User Satisfaction Survey was sent electronically to all examiners to evaluate their experiences with the new system.
RESULTS: The response rate for the survey was 100% with an internal consistency of 0.83. Almost all the examiners (29, 97%) were satisfied with the application of the COES. Further, 72% of the examiners indicated that the electronic system facilitated the evaluation of the students' skills, and 84% found using a smart device (iPad) was easier than using a paper form. All examiners expressed their preference for using the electronic system in the future.
CONCLUSION: Users were satisfied with the utilization of the customized COES. This concept of fully digitizing the OSCE assessment process shortened the time needed for both the analysis of results and providing students with feedback. Further observational studies are needed to assess examiners' behaviors when using this methodology.
© 2022 Yousef et al.

Keywords:  Saudi Arabia; academic performance; clinical competency; clinical skills; medical education; undergraduate

Year:  2022        PMID: 35140510      PMCID: PMC8820456          DOI: 10.2147/IJGM.S351052

Source DB:  PubMed          Journal:  Int J Gen Med        ISSN: 1178-7074


Introduction

The Objective Structured Clinical Examination (OSCE) is a standard academic assessment tool in the field of medical education. It is designed to evaluate practical and communication skills among medical students.1 Some of its advantages include building examinee confidence and evaluating clinical sense in different settings. Among its disadvantages are the significant time expenditure incurred and possible individual documentation errors. In 1975, Harden et al introduced the OSCE as a method to evaluate medical students' skills.2 Since then, it has been used to assess the skills and clinical competence of almost all healthcare practitioners.3 It is a timed examination in which students move systematically through a set of pre-determined stations, each evaluated by a qualified examiner using well-structured marking criteria.4 Historically, the concept of the OSCE examination has been subjected to numerous modifications over the years to better suit specific academic purposes.5,6 In most well-known colleges of medicine worldwide, the OSCE is the standard tool for evaluating competency, clinical skills, communication skills, psychomotor skills, cognitive knowledge, and attitude through oral examination, counseling, data interpretation, and history and physical examination stations.7–10 By contrast, the traditional clinical exam focused largely on the patient's history and a demonstration of physical examination skills, with minimal assessment of technical skills.
The traditional clinical exam is broadly unreliable and unfair in evaluating students' performance because of the wide variability among both examiners and selected patients.11 "The luck of the draw" in the selection of examiner and patient plays a significant negative role in the outcome when the traditional method is used.12 Since the introduction of the OSCE in 1975, researchers have reported it to be reliable, objective, and valid, with its cost and demand for human resources being its main disadvantages.7–12 In an OSCE, all students are examined against preselected criteria determined by a team of faculty teachers: similar clinical scenarios or tasks are assessed, with scores awarded against specific criteria. The diversity of stations, performance outcomes, degree of difficulty of questions, and overall student organization are some of the important parameters that can be used to analyze the teaching standards of an institution objectively. During the exam, students' performance is judged by a team of examiners in charge of the various stations. Furthermore, the OSCE is time-efficient, examining more students at any given time over a broader range of subjects than the traditional clinical exam.13 Classically, a paper-based methodology was the standard for executing OSCEs. However, several issues have been linked to this method, including illegible handwriting, missing student details, lost assessment sheets, manual calculation inaccuracies, data entry errors, and time consumption. Additionally, feedback and prospective input regarding students' performance are not usually shared with students due to time limitations.14 Few published papers highlight practical approaches to overcome these shortfalls.
Most of these efforts addressed the core deficits of the traditional paper-based OSCE assessment, including the use of different software programs,14,15 computers and electronic handheld devices,16–19 and a web-based evaluation system.20 To our knowledge, this is the first study addressing OSCE assessment of students through the design and implementation of online handheld digital OSCE assessment software. This study aims to present a novel academic assessment tool, a Computerized Web-based OSCE Evaluation System (COES), for medical students and to explore its efficacy in comparison to the traditional paper-based evaluation system, using a User Satisfaction Survey.

Materials and Methods

This is a web-based cross-sectional study that utilized a previously developed User Satisfaction Survey to explore the efficacy of the digitized OSCE system compared to the traditional paper-based system.16,22 As part of the Children's Health academic requirement, fifth-year medical students at the College of Medicine must be assessed clinically through an end-of-year OSCE examination following their Children's Health course. The e-learning unit at the College of Medicine decided to digitize the OSCE examination for all medical students to meet its strategic academic plans, and the Department of Pediatrics was chosen for the trial. Numerous meetings and brainstorming sessions were held to establish the requirements and to determine how effectively and efficiently they could be met with available resources, customized to the needs of the department. QuestionPro is web-based software for creating and distributing surveys. It consists of an interface for creating survey questions, tools for distributing surveys via email or website, and tools for analyzing and viewing the results.21 To fit our purpose as an OSCE management solution, we created a full exam platform by utilizing its survey features and went one step further by using QR codes. QuestionPro has been "globally recognized by multiple educational, business, research, and marketing institutes for over ten years."21 Assessment documentation, station selection, and scoring criteria were chosen, formulated, reviewed, and agreed on by the OSCE committee faculty members of the department. Subsequently, all information was handed to the e-learning support team to be uploaded to the newly generated assessment system using QuestionPro software. Prior to the OSCE date, OSCE assessors, circuit coordinators, and student invigilators were trained to use the electronic system, and technical support was available at the time of the OSCE assessments.
Further, an introductory session was held to introduce the new electronic system to the students. Regarding OSCE scoring, we provided assessors with 3–5 scoring options for each question of the assessment with which to rate each student's performance: Not Done, Inadequately Done, Partially Done, Adequately Done, and Well Done. We assigned different scoring weights to each question based on its difficulty, complexity, and number of possible answers (Figure 1).
Figure 1

Software Question Logic page which shows the Question (at the top), and Answer options with their score values.
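The option-based, weighted scoring described above can be sketched in a few lines. The five rating labels come from the text; the linear mapping from rating to points and the per-question weights in the example are illustrative assumptions, since the paper does not publish its actual weightings.

```python
# Minimal sketch of per-question weighted OSCE scoring (assumptions noted above).
RATING_SCALE = ["Not Done", "Inadequately Done", "Partially Done",
                "Adequately Done", "Well Done"]

def question_score(rating: str, max_points: float) -> float:
    """Map a rating label to a fraction of the question's weight (max_points)."""
    if rating not in RATING_SCALE:
        raise ValueError(f"Unknown rating: {rating}")
    fraction = RATING_SCALE.index(rating) / (len(RATING_SCALE) - 1)
    return round(fraction * max_points, 2)

def station_total(answers: list) -> float:
    """Sum weighted scores over all (rating, max_points) pairs in one station."""
    return round(sum(question_score(r, w) for r, w in answers), 2)

# Example: three questions with different weights.
total = station_total([("Well Done", 5), ("Partially Done", 2), ("Not Done", 3)])
# 5*1.0 + 2*0.5 + 3*0.0 = 6.0
```

In QuestionPro itself this mapping corresponds to assigning score values to each answer option on the Question Logic page shown in Figure 1.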

This COES, which was customized "in-house" at the e-learning unit of the College of Medicine, was used to store and analyze data electronically. Moreover, student feedback was sent to students electronically using the student email system. This study was approved by the Institutional Review Board (IRB) of Imam Abdulrahman Bin Faisal University (IRB-2020-01-048). The datasets used and/or analyzed in the current study are available from the corresponding author on request. All participants provided informed consent to take part in this study, in accordance with the Declaration of Helsinki.

OSCE Layout for the Students

A total of 139 fifth-year medical students utilized the COES in December 2019. They were assessed by 30 examiners from the faculty board of the Department of Pediatrics using portable tablets (iPads) provided by the Deanship of e-learning. The examiners represent part of the faculty staff of the Department of Pediatrics, with cumulative academic and student-assessment experience of more than 80 years. All of the involved examiners are consultants from various pediatric sub-specialties with local and/or international fellowship training certifications. The number of examiners (thirty) was determined by the OSCE layout: the number of stations, circuits, and rotations. The OSCE comprised five separate stations. Students were divided into three parallel circuits (A, B, and C) operating simultaneously to accommodate numerous examinees. Each circuit comprised the same five stations in the same systematic order. Before the exam, 12–14 students were assigned to each of four rotations per circuit. This distribution of students was generated using Microsoft Excel and reviewed by three different members of the OSCE exam committee to eliminate any individual or technical errors. Each student completed a history taking and discussion, pediatric surgery case scenario, data interpretation, physical examination, and counseling station, each of which was allocated eight minutes. In addition to the examiner, the history taking and counseling stations were each supported by a pediatric resident acting as a simulated parent. For the physical examination station, a selected group of pediatric patients was approached by the OSCE coordination committee for participation and enrollment in the examination.
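The pre-exam distribution of students into circuits and rotations can be sketched as a simple round-robin over the 3 × 4 slot grid. This is illustrative only: the authors built and cross-checked their distribution in Microsoft Excel, and the slot layout below (three circuits, four rotations) is taken from the text.

```python
import itertools

def assign_students(student_ids, circuits=("A", "B", "C"), rotations=4):
    """Round-robin students into (circuit, rotation) groups of near-equal size."""
    slots = list(itertools.product(circuits, range(1, rotations + 1)))
    assignment = {slot: [] for slot in slots}
    for student, slot in zip(student_ids, itertools.cycle(slots)):
        assignment[slot].append(student)
    return assignment

# 139 students spread over 12 slots -> groups differ in size by at most one.
plan = assign_students([f"S{i:03d}" for i in range(1, 140)])
group_sizes = [len(group) for group in plan.values()]
```

A deterministic assignment like this also makes the reviewers' error check easy: group sizes and membership can be verified programmatically rather than by eye.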

The Computerized Web-Based OSCE Evaluation System

The duration of each station was determined by the time required for the student's assessment, the time needed to scan the student's QR code, and a two-minute safety margin should any electronic issue arise. A color-coded QR ID card was given to each student before entering the exam, to be scanned by each assessor using their iPad at the beginning of the station (Figures 2 and 3).
Figure 2

Flowchart for OSCE Exam using QuestionPro which starts from Scanning the student’s QR code and ends at submitting the electronic assessment form.

Figure 3

Difference between QuestionPro and other online OSCE software. Human error may occur when logging into the examiner's page, when selecting the student from a dropdown list, or when the student's details are not cross-checked prior to submitting the electronic assessment form.

The coded card showed the student's data (name, university number) and was encrypted to match the assigned circuit, assessor, and rotation for each student. Once the assessor scanned the QR code on each student's ID card, an online designed page opened on the assessor's tablet showing the student's data, circuit, and rotation numbers for second-step verification (Figure 4). Subsequently, the assessor was asked to choose their assigned station number from the five stations shown on that page, which also indicated the relevant assessors' names under each station. After selecting a station, the assessor graded the performance of the student within the given period (Figure 5).
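The QR payload and the second-step verification it enables can be sketched as follows. This is a hypothetical reconstruction: the actual encoding and encryption used by the authors' system are not published, and all function names and the roster structure here are illustrative.

```python
import json

def make_qr_payload(name, university_no, circuit, rotation):
    """Serialize the student record that the QR code would carry (illustrative)."""
    return json.dumps({"name": name, "id": university_no,
                       "circuit": circuit, "rotation": rotation})

def verify_scan(payload, roster):
    """Second-step verification: scanned data must match the student's
    pre-assigned circuit and rotation stored in the exam roster."""
    data = json.loads(payload)
    expected = roster.get(data["id"])
    if expected is None:
        return False, "Unknown student ID"
    if (data["circuit"], data["rotation"]) != expected:
        return False, "Circuit/rotation mismatch"
    return True, f"Verified: {data['name']} ({data['id']})"

roster = {"2170001": ("A", 2)}  # hypothetical: university no. -> (circuit, rotation)
ok, msg = verify_scan(make_qr_payload("Sara", "2170001", "A", 2), roster)
```

The design point is that scanning replaces manual login and dropdown selection (the error-prone steps in Figure 3), while the roster check catches a student who arrives at the wrong circuit or rotation.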
Figure 4

User Interface Introduction page which shows the student’s details, circuit and rotation number, and the names of the Examiners.

Figure 5

An example of the OSCE assessment page (Questionnaire sheet).

Once the time was up, the assessor submitted the form, and the data were recorded in the system. After submission of the performance questionnaire, an automated text appeared containing the student's data and a message confirming the submission of the questionnaire. Notably, submission of performance forms was only allowed when all questionnaire items were completed, to prevent missing data. Finally, the software can be used to download the raw data in different formats (MS Excel, MS PowerPoint, Adobe PDF) during or after the completion of the exam. The serial number can be used to merge each student's data into a single MS Excel file; after sorting by serial number, sum and average formulas can be added manually.
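The merge-and-summarize step described above can be sketched in code: per-station rows are grouped by the student's serial number, then totals and averages are computed, mirroring what the authors did manually in MS Excel. The row layout and score values below are invented for illustration.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical raw export: one row per (student, station) assessment.
rows = [
    {"serial": "S001", "station": "History", "score": 18.5},
    {"serial": "S001", "station": "Counseling", "score": 16.0},
    {"serial": "S002", "station": "History", "score": 14.0},
]

# Group scores by student serial number (the Excel "sort by serial" step).
by_student = defaultdict(list)
for row in rows:
    by_student[row["serial"]].append(row["score"])

# Compute per-student total and average (the manual sum/average formulas).
summary = {serial: {"total": sum(scores), "average": round(mean(scores), 2)}
           for serial, scores in by_student.items()}
```

Automating this step is also where much of the reported time saving comes from, since no per-student manual formula entry is needed.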

Overall User Satisfaction Survey

After the completion of the exam, we asked the assessors to complete an overall satisfaction survey about their experience with the COES. The survey was originally developed in previous work following an extensive literature review in the field of electronic OSCE management, together with expert review and agreement by the educationalists involved in OSCE preparation.16,22 The 25-item questionnaire was divided into three sections: the OSCE software user evaluation (3 items); usage of the electronic OSCE system and its training (10 items); and the OSCE assessment process itself (12 items). For the questions related to the usage of the COES and the assessment process itself (22 items), assessors chose from: Strongly agree, Agree, Disagree/strongly disagree, or No judgment. For the OSCE software user evaluation (3 items), the choices were: Excellent, Good, Fair, or Poor. Additionally, three questions were added for a better analysis of the process: the assessor's age, gender, and whether they possessed a tablet device at home. The assessors' answers were recorded and analyzed.

Results

Two-thirds of the assessors were male (22, 73.3%). A majority were above the age of 40 years, with 13 examiners (43.3%) aged 40–49 and six (20%) above the age of 50; the rest were between 30 and 39 years of age. Most assessors had personal tablets at home (23, 76.6%), with no statistically significant association with age or gender. The overall average time for completing each student's assessment was five minutes (out of a possible eight minutes).

User Satisfaction Survey for Assessors

All the assessors who received the survey electronically completed it, a response rate of 100% (n = 30). The internal consistency (Cronbach's alpha) of the survey was 0.83; in classical psychometric terms, this level of internal consistency is considered good.23 All examiners had previous experience with assessing students using the paper-based OSCE methodology. Answers of "No judgment" were excluded from the descriptive analysis, since they expressed no preference for either agreement or disagreement and were not considered meaningful when forming a conclusion for a satisfaction survey. The results of the satisfaction survey for assessors are displayed in Table 1.
Table 1

User Satisfaction Survey Results (N= 30)

Section A: Training for and usage of the Computerized Web-based OSCE Evaluation System (COES)

| Item | Strongly agree | Agree | Disagree or strongly disagree | No judgment |
|---|---|---|---|---|
| 1. The training sessions prepared me sufficiently to use the system | 20 | 9 | 0 | 1 |
| 2. It was easy to access the computer system | 21 | 9 | 0 | 0 |
| 3. The screen layout and instructions were clear | 23 | 7 | 0 | 0 |
| 4. The computer system was easy to use | 23 | 7 | 0 | 0 |
| 5. I felt confident using the computer system | 22 | 8 | 0 | 0 |
| 6. Multiple-choice markings were easy to navigate | 18 | 9 | 0 | 3 |
| 7. There was good technical support provided to help me with any problems | 26 | 4 | 0 | 0 |
| 8. Using the system enabled me to complete my work as an examiner efficiently | 21 | 7 | 1 | 1 |
| 9. It was easy to exit from the program | 12 | 15 | 1 | 2 |
| 10. I would like to use the electronic system for future OSCEs | 21 | 7 | 0 | 2 |

Section B: The OSCE assessment process

| Item | Strongly agree | Agree | Disagree or strongly disagree | No judgment |
|---|---|---|---|---|
| 11. Using the computer system facilitated the assessment of the students' skill | 9 | 9 | 7 | 5 |
| 12. The outlined steps of the skill were appropriate to the skill being assessed | 12 | 18 | 0 | 0 |
| 13. Having to use a smart device during the assessment was easier than using a paper form | 12 | 10 | 4 | 4 |
| 14. Absent students were easy to manage | 2 | 8 | 2 | 18 |
| 15. Late arrival students were easy to manage | 2 | 6 | 3 | 19 |
| 16. The grading options were appropriate | 10 | 19 | 0 | 1 |
| 17. Being able to input comments on student's performance was useful | 10 | 12 | 5 | 3 |
| 18. I felt that I had to include a comment on students' performance for each student | 4 | 7 | 12 | 7 |
| 19. I included a comment when I judged that a student had not completed an element of the skill appropriately | 3 | 13 | 10 | 4 |
| 20. Not being able to see the students' overall score was good | 10 | 10 | 5 | 5 |
| 21. Being able to give a global rating for each student was good | 11 | 10 | 4 | 5 |
| 22. Not being able to see the score for each criterion was good | 8 | 9 | 7 | 6 |

Section C: Overall OSCE Software user acceptance

| Item | Excellent | Good | Fair | Poor |
|---|---|---|---|---|
| 23. Grade your overall acceptance of the application of the new electronic system | 20 | 9 | 1 | 0 |
| 24. Grade your overall students' assessment using the new electronic system | 20 | 9 | 1 | 0 |
| 25. Overall level of satisfaction | 21 | 8 | 1 | 0 |

Abbreviations: OSCE, Objective Structured Clinical Examination; COES, Computerized Web-based OSCE Evaluation System; QR, Quick Response code.
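The internal consistency reported above (Cronbach's alpha = 0.83) follows the standard formula α = k/(k−1) · (1 − Σ item variances / variance of totals). A minimal sketch, with a made-up response matrix for illustration (rows = respondents, columns = items):

```python
from statistics import pvariance

def cronbach_alpha(scores):
    """Cronbach's alpha for a respondents-by-items score matrix."""
    k = len(scores[0])                                # number of items
    item_vars = [pvariance(col) for col in zip(*scores)]
    total_var = pvariance([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Invented 4-respondent, 4-item example (not the study's data).
responses = [
    [4, 4, 3, 4],
    [3, 3, 3, 4],
    [4, 3, 2, 3],
    [2, 2, 2, 3],
]
alpha = cronbach_alpha(responses)
```

Note that population variances are used consistently in numerator and denominator, so the (n−1) correction cancels and sample variances would give the same ratio.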

Almost all the assessors (29, 97%) were satisfied with the application of the COES to assess the medical students' performance. Most examiners (18/25, 72%) indicated that the electronic system facilitated the evaluation of the students' skills. Most examiners (22/26, 84%) found using a smart device (iPad) was easier than using a paper form; the remainder found the paper form easier (4/26, 16%). More than half of the examiners (16/26, 61.5%) felt it necessary to include a comment on students' performance at the end of the assessment in the comments section. Over two-thirds of the examiners answered the questions related to the usage of the COES (Section A) positively, with few exceptions. All examiners (28/28, 100%) expressed their preference for using the electronic system in the future.

Discussion

Overall, most of the assessors recorded good to excellent feedback on the User Satisfaction Survey. Our Cronbach's alpha value is much higher than that reported by a similar Irish study (0.83 vs 0.14),16 which could be due to our larger number of examiners (30 vs 18) or to a social desirability bias in that study. More than 95% of the examiners were satisfied with implementing the QR-coded COES to assess the fifth-year medical students. This observation supports the findings of various studies showing that the application of an electronic and/or online OSCE system was satisfactory.16,24–26 Furthermore, all our examiners agreed to use the same methodology, with some slight modifications, in the future. Suggested improvements included a better internet connection and adding a question assessing the student's Global Rating Score. No examiner had to manage network connection problems themselves, as we ensured the continuous presence and support of our technical team; additionally, Wi-Fi connectivity was tested prior to the application of the online assessments. Multiple factors supported and augmented the new experience, including preparatory training sessions, the system's easy accessibility, the clarity of the system's layout and instructions, easy navigation of the multiple-choice scorings, and the availability of technical support. The newly tailored functionalities provided by the COES using QuestionPro enabled the examiners to easily evaluate, track, intervene, and comment on each student's performance. To our knowledge, this work represents the first trial of a fully integrated QR-coded COES worldwide. A relatively recent Irish study by Meskell et al tracked a cohort of first-year nursing students over two consecutive years (n=203) using a "built-in" online OSCE management information system.
It found that the electronic software facilitated the analysis of overall results, thereby offering considerable time savings.16 Similarly, the application of our novel electronic system significantly shortened the time needed for the analysis of the results, allowing more time for data interpretation, better curriculum development, and clinical teaching improvements. As the survey was adapted from previous work,16 we limited our modifications of its items to preserve comparability. For example, items such as "Absent students were easy to manage" or "Late arrival students were easy to manage" could not be interpreted adequately, as absentees and late arrivals were managed by the OSCE coordinators rather than the examiners themselves. Other survey items that assessed the COES, such as "I felt that I had to include a comment on student performance for each student" and "I included a comment when I judged that a student had not completed an element of the skill appropriately," showed significant variability between the assessors and were therefore difficult to interpret accurately. Nevertheless, to better suit the electronic system, we tried to anticipate and overcome all the obstacles that might affect students' assessment. As the traditional paper-based method for OSCE assessment was easy to follow and allowed the examiners to add their comments on each student's performance,26 the COES preserved this capability through an optional "Comment section" at the end of each student encounter. The use of QR coding in OSCE assessment has not previously been reported in the literature. A qualified e-learning team is required to deliver such high-end technology for numerous simultaneous encounters: approximately 695 QR-coded interactions between the students and their examiners were recorded during our COES, which requires good preparatory technical support as well as excellent cooperation between academic and technical teams.
Our concept of fully digitizing the OSCE assessment process shortened the time needed both for the analysis of results and for providing students with feedback. According to our preliminary exploration, we estimate that the new approach saved us more than 48 hours of data entry, data analysis, and final student grading. This study has its limitations. First, the relatively small number of stations (n=5) might limit the feasibility of applying such a fully electronic OSCE assessment system on a larger scale; however, we believe that having more than 650 student encounters with the new system without any major shortcomings demonstrated its safety and applicability. Second, the small number of examiners (n=30) could affect the User Satisfaction Survey results, although our number of examiners is larger than in other relevant studies in the same field, with better consistency values. In addition, not assessing the students' own opinions of and satisfaction with the new intervention is a further limitation. Also, the relatively high cost of applying such a computer-based assessment approach may make it infeasible or inconvenient in some settings. Finally, assessors could plausibly become distracted by using the COES tool; however, we did not perceive any negative effects in this regard beyond what was reported in the User Satisfaction Survey results. Additional observational studies of examiners' behaviors during the online OSCE methodology are required.

Conclusion

In conclusion, our COES for medical students showed promising results for the potential to transition from traditional paper-based OSCE assessment into a fully digitized and online system. Users of this novel digital online assessment tool demonstrated a high level of satisfaction. Preparatory training sessions, the system’s easy accessibility, clarity of the system’s layout and instructions, easy navigation of the multiple-choice scorings, as well as the availability and readiness of technical support are integral to the success of this approach. Further observational studies are needed to assess examiners’ behaviors when using this new methodology.
References (22 in total; first 10 shown below)

1.  Objective structured clinical examination: the assessment of choice.

Authors:  Marliyya Zayyan
Journal:  Oman Med J       Date:  2011-07

2.  An exploration of student midwives' experiences of the Objective Structured Clinical Examination assessment process.

Authors:  Maebh Barry; Maria Noonan; Carmel Bradshaw; Sylvia Murphy-Tighe
Journal:  Nurse Educ Today       Date:  2011-10-13

3.  The usability of personal digital assistants (PDAs) for assessment of practical performance.

Authors:  Ina Treadwell
Journal:  Med Educ       Date:  2006-09

4.  Teaching and assessing clinical skills: a competency-based programme in China.

Authors:  P L Stillman; Y Wang; Q Ouyang; S Zhang; Y Yang; W D Sawyer
Journal:  Med Educ       Date:  1997-01

5.  Back to the future: An online OSCE Management Information System for nursing OSCEs.

Authors:  Pauline Meskell; Eimear Burke; Thomas J B Kropmans; Evelyn Byrne; Winny Setyonugroho; Kieran M Kennedy
Journal:  Nurse Educ Today       Date:  2015-06-26

6.  Implementation of Electronic Objective Structured Clinical Examination Evaluation in a Nurse Practitioner Program.

Authors:  Janet D Luimes; Mary Ellen Labrecque
Journal:  J Nurs Educ       Date:  2018-08-01

7.  Is the OSCE a feasible tool to assess competencies in undergraduate medical education? (Review)

Authors:  Madalena Folque Patrício; Miguel Julião; Filipa Fareleira; António Vaz Carneiro
Journal:  Med Teach       Date:  2013-03-22

8.  Is Technology Enhanced Learning Cost-effective to Improve Skills?: The Monash Objective Structured Clinical Examination Virtual Experience.

Authors:  Angelina S Lim; Shaun Wen Huey Lee
Journal:  Simul Healthc       Date:  2022-04-01

9.  How well do second-year students learn physical diagnosis? Observational study of an Objective Structured Clinical Examination (OSCE).

Authors:  Claus Hamann; Kevin Volkan; Mary B Fishman; Ronald C Silvestri; Steven R Simon; Suzanne W Fletcher
Journal:  BMC Med Educ       Date:  2002-01-10

10.  If at first you don't succeed … adoption of iPad marking for high-stakes assessments.

Authors:  Terry Judd; Anna Ryan; Eleanor Flynn; Geoff McColl
Journal:  Perspect Med Educ       Date:  2017-10
