Karl Scheeres, Niruj Agrawal, Stephanie Ewen, Ian Hall.
Abstract
Many examinations are now delivered online using digital formats, the migration to which has been accelerated by the COVID-19 pandemic. The MRCPsych theory examinations have been delivered in this way since Autumn 2020. The multiple choice question formats currently in use are highly reliable, but other formats enabled by the digital platform, such as very short answer questions (VSAQs), may promote deeper learning. Trainees often ask for a focus on core knowledge, and the absence of cueing with VSAQs could help achieve this. This paper describes the background and evidence base for VSAQs, and how they might be introduced. Any new question formats would be thoroughly piloted before appearing in the examinations and are likely to have a phased introduction alongside existing formats.
Keywords: Education and training; cost-effectiveness; history of psychiatry; information technologies; supervision
Year: 2022 PMID: 33752773 PMCID: PMC8914921 DOI: 10.1192/bjb.2021.23
Source DB: PubMed Journal: BJPsych Bull ISSN: 2056-4694
Key factors to be considered when assessing the utility of an assessment (adapted with permission from reference [2])
| Factor | Questions asked |
|---|---|
| Validity | Does the examination test what we want it to test? |
| Reliability | Are the results repeatable and accurate? Are external sources of error other than candidate ability accounted for and reduced? |
| Educational impact | What is the impact of the examination upon trainees’ learning? Does it lead to deeper learning and long-term retention? |
| Acceptability | Is the examination acceptable to those sitting it and other stakeholders? |
| Cost | Are costs reasonable? |
Common themes of trainees’ concerns and responses
| Concern | Responses |
|---|---|
| Technical issues, e.g. internet connectivity | The College partners with third-party software providers who have both expertise and a track record in high-stakes online examination delivery. Trainees are encouraged to test the resilience of their internet and device in advance, using provided software. Software developers design software to account for brief interruptions, and protocols exist for more significant technical issues. |
| Cheating, proctoring and false accusations | All alerts from the artificial intelligence proctoring software are reviewed by a live proctor. Final decisions about cheating are made following rigorous review by the Examinations Sub-committee, and are subject to the normal appeals process. |
| Unsuitable home environment | Candidates can choose any suitable workstation with reliable internet on which to take the examination, e.g. at a family member's or friend's house, or a work or university computer. |
| Examination should not be reduced to a ‘spelling test’ in very short answer questions | Variations in answers will be accounted for, and examiners will review answers marked incorrect, including those containing typos and spelling errors. |