Manish Sreenivasa, Lucy Armitage, Winson C C Lee.
Abstract
The COVID-19 pandemic has caused a shift from on-campus to remote online examinations, which are difficult to invigilate. Closed-ended question formats, such as true–false (TF), are particularly suited to these examination conditions, as they allow automatic marking by computer software. While previous studies have reported score characteristics of TF questions in conventional supervised examinations, this study investigates the efficacy of using TF questions in online, unsupervised examinations at the undergraduate level in Biomedical Engineering. We examined the TF and other question-type scores of 57 students across three examinations held in 2020 under online, unsupervised conditions. Our analysis shows a significantly larger coefficient of variance (CV) in scores for TF questions (42.7%) than for other question types (22.3%). The high CV in TF questions may be explained by different answering strategies among students, with 13.3 ± 17.2% of TF questions left unanswered (zero marks) and 16.4 ± 11.5% of TF questions guessed incorrectly (negative marks awarded). In unsupervised, open-book examinations, where sharing of answers among students is a potential risk, questions that induce a larger variation in responses may be desirable to differentiate among students. We also observed a significant relationship (r = 0.64, p < 0.05) between TF scores and overall subject scores, indicating that TF questions are an effective predictor of overall student performance. Our results from this initial analysis suggest that TF questions are useful for assessing biomedical-themed content in online, unsupervised examinations, and are encouraging for their ongoing use in future assessments.
Keywords: Closed-ended questions; Proctoring; Remote learning; Student outcomes
Year: 2022 PMID: 35048334 PMCID: PMC8769792 DOI: 10.1007/s13246-021-01088-x
Source DB: PubMed Journal: Phys Eng Sci Med ISSN: 2662-4729
Fig. 1 Sample TF question with a figure from the BMEG303 examination
Details of the three subjects and the question types within each examination
| Subject code | Question types (percentage of total examination score in brackets) | Examination duration (h) |
|---|---|---|
| BMEG201 | 10 true–false questions (25%), 10 MC* questions (25%), 5 free-response questions (50%) | 2 |
| BMEG301 | 10 true–false questions (20%), 6 free-response questions (80%) | 2 |
| BMEG303 | 20 true–false questions (25%), 5 free-response questions (75%) | 3 |
BMEG201 was a 2nd-year subject; the other two were 3rd-year biomedical engineering subjects
*Multiple-choice (MC) questions with four choices per question
Score characteristics in each academic subject
| Academic subjects | True–false: percentage scores (%) | True–false: CV (%) | Other types: percentage scores (%) | Other types: CV (%) | Significant differencesa? | Correlation coefficientb |
|---|---|---|---|---|---|---|
| BMEG201 | 53.3 ± 24.6 | 46.1 | 71.2 ± 12.3 | 17.2 | Yes, p < 0.05 | 0.68* |
| BMEG301 | 66.9 ± 19.3 | 28.9 | 62.1 ± 15.2 | 24.4 | No, p > 0.05 | 0.70* |
| BMEG303 | 46.0 ± 20.2 | 44.0 | 64.5 ± 11.0 | 17.1 | Yes, p < 0.05 | 0.37 |
| All three subjects | 53.9 ± 23.0 | 42.7 | 63.6 ± 14.2 | 22.3 | Yes, p < 0.05 | 0.64* |
*Indicates a significant relationship at p < 0.05
aDifferences in percentage scores between true–false questions and other question types
bRelationship between success rates in true–false questions and overall subject scores
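The CV values reported above follow the usual definition, CV = (SD / mean) × 100, and can be recovered from the published means and standard deviations to within rounding (small discrepancies arise because the table's means and SDs are themselves rounded). A minimal sketch, using the true–false figures from the table above as inputs:

```python
def cv(mean: float, sd: float) -> float:
    """Coefficient of variance as a percentage: (SD / mean) * 100."""
    return sd / mean * 100.0

# True–false score means and SDs per subject, taken from the table above
tf_scores = {
    "BMEG201": (53.3, 24.6),
    "BMEG301": (66.9, 19.3),
    "BMEG303": (46.0, 20.2),
    "All three subjects": (53.9, 23.0),
}

for subject, (mean, sd) in tf_scores.items():
    print(f"{subject}: CV = {cv(mean, sd):.1f}%")
```

For example, the pooled true–false CV is 23.0 / 53.9 × 100 ≈ 42.7%, matching the reported value; the larger CV for true–false questions relative to other question types reflects their wider spread of scores relative to the mean.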
Percentages of TF questions with wrong answers and those left unanswered
| Academic subjects | % questions with wrong answers | % unanswered questions | Significant differences? |
|---|---|---|---|
| BMEG201 | 17.7 ± 12.6 | 12.3 ± 18.7 | No, p > 0.05 |
| BMEG301 | 10.8 ± 9.5 | 11.5 ± 14.6 | No, p > 0.05 |
| BMEG303 | 18.5 ± 10.4 | 17.0 ± 17.7 | No, p > 0.05 |
| All three subjects | 16.4 ± 11.5 | 13.3 ± 17.2 | No, p > 0.05 |