Abdullah A Yousef, Bassam H Awary, Faisal O AlQurashi, Waleed H Albuali, Mohammad H Al-Qahtani, Syed I Husain, Omair Sharif.
Abstract
PURPOSE: The Objective Structured Clinical Examination (OSCE) is a standard assessment tool in medical education. This study presents an innovative method for digitizing the OSCE evaluation system for medical students and explores its efficacy compared with the traditional paper-based system through the analysis of a User Satisfaction Survey.
Keywords: Saudi Arabia; academic performance; clinical competency; clinical skills; medical education; undergraduate
Year: 2022 PMID: 35140510 PMCID: PMC8820456 DOI: 10.2147/IJGM.S351052
Source DB: PubMed Journal: Int J Gen Med ISSN: 1178-7074
Figure 1: Software Question Logic page, showing the question (at the top) and the answer options with their score values.
Figure 2: Flowchart of the OSCE exam using QuestionPro, from scanning the student's QR code to submitting the electronic assessment form.
Figure 3: Difference between QuestionPro and other online OSCE software. Human error can occur when logging into the examiner's page, when selecting the student from a dropdown list, or when the student's details are not cross-checked before the electronic assessment form is submitted.
Figure 4: User Interface Introduction page, showing the student's details, circuit and rotation number, and the examiners' names.
Figure 5: An example of the OSCE assessment page (questionnaire sheet).
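The workflow in Figures 2-3 (scan the student's QR code, verify the student's details against the exam roster, then submit the electronic assessment form) can be sketched as follows. This is a minimal illustration only: the QR payload format, function names, and roster structure are assumptions for the example, not details from the paper or from QuestionPro.

```python
# Hypothetical sketch of the QR-scan and cross-check step from Figures 2-3.
# Assumed QR payload format: "id|name|circuit|rotation" (illustrative only).

def parse_qr_payload(payload: str) -> dict:
    """Split a scanned QR payload into the student's exam details."""
    student_id, name, circuit, rotation = payload.split("|")
    return {"id": student_id, "name": name,
            "circuit": int(circuit), "rotation": int(rotation)}

def cross_check(student: dict, roster: dict, circuit: int, rotation: int) -> bool:
    """Confirm the scanned details match the roster entry for this station,
    mirroring the cross-check that Figure 3 notes is missing in other systems."""
    entry = roster.get(student["id"])
    return (entry is not None
            and entry["name"] == student["name"]
            and student["circuit"] == circuit
            and student["rotation"] == rotation)

# Example: only a matching student passes the cross-check.
roster = {"S001": {"name": "A. Student"}}
scanned = parse_qr_payload("S001|A. Student|2|5")
print(cross_check(scanned, roster, circuit=2, rotation=5))
```

The point of the sketch is the ordering: verification happens after the scan and before submission, which is where Figure 3 locates the human-error risk in dropdown-based systems.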
User Satisfaction Survey Results (N = 30). Each row lists response counts across four ordinal rating categories, ordered from most to least favorable; row totals equal 30.
| 1. The training sessions prepared me sufficiently to use the system | 20 | 9 | 0 | 1 |
| 2. It was easy to access the computer system | 21 | 9 | 0 | 0 |
| 3. The screen layout and instructions were clear | 23 | 7 | 0 | 0 |
| 4. The computer system was easy to use | 23 | 7 | 0 | 0 |
| 5. I felt confident using the computer system | 22 | 8 | 0 | 0 |
| 6. Multiple-choice markings were easy to navigate | 18 | 9 | 0 | 3 |
| 7. There was good technical support provided to help me with any problems | 26 | 4 | 0 | 0 |
| 8. Using the system enabled me to complete my work as an examiner efficiently | 21 | 7 | 1 | 1 |
| 9. It was easy to exit from the program | 12 | 15 | 1 | 2 |
| 10. I would like to use the electronic system for future OSCEs | 21 | 7 | 0 | 2 |
| 11. Using the computer system facilitated the assessment of the students’ skill | 9 | 9 | 7 | 5 |
| 12. The outlined steps of the skill were appropriate to the skill being assessed | 12 | 18 | 0 | 0 |
| 13. Having to use a smart device during the assessment was easier than using a paper form | 12 | 10 | 4 | 4 |
| 14. Absent students were easy to manage | 2 | 8 | 2 | 18 |
| 15. Late arrival students were easy to manage | 2 | 6 | 3 | 19 |
| 16. The grading options were appropriate | 10 | 19 | 0 | 1 |
| 17. Being able to input comments on student’s performance was useful | 10 | 12 | 5 | 3 |
| 18. I felt that I had to include a comment on students’ performance for each student | 4 | 7 | 12 | 7 |
| 19. I included a comment when I judged that a student had not completed an element of the skill appropriately | 3 | 13 | 10 | 4 |
| 20. Not being able to see the students overall score was good | 10 | 10 | 5 | 5 |
| 21. Being able to give a global rating for each student was good | 11 | 10 | 4 | 5 |
| 22. Not being able to see the score for each criterion was good | 8 | 9 | 7 | 6 |
| 23. Grade your overall acceptance of the application of the new electronic system | 20 | 9 | 1 | 0 |
| 24. Grade your overall students’ assessment using the new electronic system | 20 | 9 | 1 | 0 |
| 25. Overall level of satisfaction | 21 | 8 | 1 | 0 |
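A favorable-response rate can be read off each row of the table by summing the top two categories. The snippet below illustrates this for a few items; it assumes, as stated above the table, that the four columns are response counts ordered from most to least favorable with N = 30 per item.

```python
# Favorable-response rates for selected survey items above.
# Assumption: each tuple holds counts in four categories, most to least
# favorable, summing to N = 30 respondents.
responses = {
    1: (20, 9, 0, 1),    # training sessions prepared me sufficiently
    14: (2, 8, 2, 18),   # absent students were easy to manage
    25: (21, 8, 1, 0),   # overall level of satisfaction
}

def favorable_rate(counts):
    """Share of respondents in the top two (favorable) categories."""
    return (counts[0] + counts[1]) / sum(counts)

for item, counts in responses.items():
    print(f"Item {item}: {favorable_rate(counts):.0%} favorable")
# Item 1: 97% favorable
# Item 14: 33% favorable
# Item 25: 97% favorable
```

The contrast these numbers make visible (97% overall satisfaction versus 33% for managing absent students) matches the pattern in the table: logistics items 14-15 are the only rows where the least favorable category dominates.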
Abbreviations: OSCE, Objective Structured Clinical Examination; COES, Computerized Web-based OSCE Evaluation System; QR, Quick Response code.