Nazdar Ezzaddin Alkhateeb1, Baderkhan Saeed Ahmed2, Namir Ghanim Al-Tawil3, Ali A Al-Dabbagh1.
Abstract
BACKGROUND: With the emergence of the COVID-19 pandemic and the lockdown approach adopted all over the world, conducting assessments while maintaining integrity became a major challenge. This article aims to share the experience of conducting an online assessment with the academic community and to assess its effectiveness from both examiners' and students' perspectives.
Year: 2022 PMID: 35984844 PMCID: PMC9390930 DOI: 10.1371/journal.pone.0272927
Source DB: PubMed Journal: PLoS One ISSN: 1932-6203 Impact factor: 3.752
Fig 1. Examination panels setup.
All students assigned #1 in the 18 groups went through the same set of questions; the same applied to students #2–8. Two extra sets of questions were prepared to overcome any problems.
Fig 2. Examination process.
(A) Sequence of discipline questions asked by the examination panel; (B) question sets according to the assigned student numbers in each examination panel, i.e., eight sets of questions used by each examination panel for the eight students.
Mean satisfaction scores of the online assessment evaluation scales by gender and specialty.
| | Students | | | Examiners | | |
|---|---|---|---|---|---|---|
| | N | Mean score (SD) | P-value | N | Mean score (SD) | P-value |
| Gender | | | | | | |
| Male | 27 | 23.9 (6.9) | 0.392* | 26 | 34.46 (6.5) | 0.095* |
| Female | 48 | 25.3 (6.7) | | 28 | 33.25 (5.7) | |
| Specialty | | | | | | |
| Medicine | | | | 3 | 35.7 (0.6) | |
| Obstetrics and gynecology | | | | 20 | 34.5 (5.2) | 0.392** |
| Pediatrics | | | | 15 | 33.5 (6.1) | |
| Surgery | | | | 16 | 33 (7.6) | |
*Mann-Whitney test
**Kruskal-Wallis test
Students’ perception about the assessment process.
| Disagree and Strongly Disagree | Neutral | Agree and Strongly Agree | |
|---|---|---|---|
| Indicators | No. (%) | No. (%) | No. (%) |
| Students received comprehensive instructions on the way the online assessment would be conducted. | 22 (29.3) | 20 (26.7) | 33 (44.0) |
| There was sufficient communication between the school administration and students. | 20 (26.7) | 26 (34.7) | 29 (38.7) |
| The format of the online assessment was acceptable. | 23 (30.7) | 25 (33.3) | 27 (36.0) |
| The examiners were professional. | 15 (20.0) | 22 (29.3) | 38 (50.7) |
| The assessment questions were clear. | 18 (24.0) | 19 (25.3) | 38 (50.7) |
| The conducted assessment evaluated students’ clinical skills. | 40 (53.3) | 27 (36.0) | 8 (10.7) |
| The time allocated to answer each discipline (medicine, surgery, OBGYN, pediatrics) was enough. | 17 (22.7) | 18 (24.0) | 40 (53.3) |
| Overall, you were satisfied with the assessment process. | 23 (30.7) | 24 (32.0) | 28 (37.3) |
Areas that the students liked (Pros) and disliked (Cons) regarding the assessment process.
| Themes | Subthemes | No. of responses | (%) N = 75 |
|---|---|---|---|
| Pros | | | |
| Questions and format of the assessment | The questions were practical and reasonable, covering common clinical and emergency problems. | 26 | (34.6) |
| | Students liked the online assessment as a new and safe solution. | 11 | (14.7) |
| Examiners' behavior | The examiners behaved professionally. | 18 | (24.0) |
| | | 18 | (24.0) |
| Cons | | | |
| Administration issues | Long waiting time and delay | 17 | (22.7) |
| | Connection problems | 15 | (20.0) |
| | Poor communication with the examiners (voice and face were not clear) | 6 | (8.0) |
| | Unfair distribution of questions and marks, as only 20% was dedicated to the final exam | 17 | (22.7) |
| | The online assessment process | 7 | (9.3) |
| | Preparedness and organization for the assessment process | 2 | (2.7) |
| Questions and format of the assessment | The questions were not clear, and many were not emergency cases. | 9 | (12.0) |
| | Clinical skills could not be assessed | 5 | (6.7) |
| | Short time allocated for the assessment process | 5 | (6.7) |
| Nothing specific | | 6 | (8.0) |
*Students may have more than one disliked area.
Students’ suggestions for improvement.
| Suggestions | No. | (%) |
|---|---|---|
| No specific suggestion | 18 | (24.00) |
| Find a method of assessment other than online | 13 | (17.33) |
| Better training for the students and examiners before the exam | 10 | (13.33) |
| Accurate timing, and to give more time | 10 | (13.33) |
| Questions should focus on emergency conditions | 8 | (10.67) |
| Same closed-ended questions for all the students, more questions, and more examiners | 8 | (10.67) |
| Better internet and website | 5 | (6.67) |
| Others | 3 | (4.00) |
| Total | 75 | (100.00) |
Examiners’ perception about the examination process.
| Disagree and Strongly Disagree | Neutral | Agree and Strongly Agree | |
|---|---|---|---|
| The items | No. (%) | No. (%) | No. (%) |
| You were clearly informed about the assessment process. | 3 (5.6) | 11 (20.4) | 40 (74.1) |
| During the assessment, the technical support provided by the cohost was adequate. | 3 (5.6) | 12 (22.2) | 39 (72.2) |
| Health regulations (physical distancing and mask wearing) on the day of the exam at the college were adequate. | 6 (11.1) | 11 (20.4) | 37 (68.5) |
| Exam duration was acceptable. | 1 (1.9) | 7 (13.0) | 46 (85.2) |
| Assessment reflected real clinical practice. | 11 (20.4) | 28 (51.9) | 15 (27.8) |
| The questions reflected proper sampling from the curriculum. | 2 (3.7) | 18 (33.3) | 34 (63.0) |
| You were satisfied with the provided assessment checklist. | 8 (14.8) | 9 (16.7) | 37 (68.5) |
| The online assessment could assess clinical competence. | 17 (31.5) | 14 (25.9) | 23 (42.6) |
| Organization of the whole process met your expectations. | 3 (5.6) | 12 (22.2) | 39 (72.2) |
Areas that the examiners liked (Pros) and disliked (Cons) regarding the assessment process.
| Themes | Subthemes | No. of responses | (%) |
|---|---|---|---|
| Pros | | | |
| | Organization of the examination process | 21 | (38.9) |
| | Teamwork | 15 | (27.8) |
| | The time of the exam was known to the students | 3 | (5.6) |
| | A new experience in assessing the students | 1 | (1.9) |
| | Fair | 4 | (7.4) |
| | The questions were clinical, covering the curriculum | 4 | (7.4) |
| | Being online, so no chance of getting COVID-19 | 3 | (5.6) |
| | | 3 | (5.6) |
| Cons | | | |
| | Poor organization | 8 | (14.8) |
| | Connection problems | 10 | (18.5) |
| | No social distancing | 1 | (1.9) |
| | It did not reflect the clinical skills of the students | 12 | (22.2) |
| | Direct questions that could not differentiate between students | 3 | (5.6) |
| | Absence of a detailed checklist for each case | 4 | (7.4) |
| | The appearance of the answer key for some questions | 3 | (5.6) |
| | Not every examiner was involved in preparing the questions | 2 | (3.7) |
| | Delay of the examiners in attending on time | 1 | (1.9) |
| | | 10 | (18.5) |
| Total | | 54 | (100) |
Examiners’ suggestions for improvement.
| Suggestions | No. | (%) |
|---|---|---|
| Better organization, better preparation for the exam, and more questions. | 20 | (36.4) |
| Training of the examiners before the exam | 9 | (16.4) |
| On-campus assessment with the use of safety measures | 8 | (14.5) |
| Nothing specific | 7 | (12.7) |
| Ongoing assessment should have a role in the assessment process | 3 | (5.5) |
| Better internet connection | 2 | (3.6) |
| Involve more examiners in the process | 2 | (3.6) |
| Use of a detailed checklist for one of the departments | 1 | (1.8) |
| Use the same process of assessment in the future | 1 | (1.8) |
| Assessment by an individual examiner (not in committee) | 1 | (1.8) |
| Share the exam experience with other colleges | 1 | (1.8) |
| Total | 55 | (100.0) |
Note. *More than one suggestion is possible for each examiner.