
Overcoming pitfalls: Results from a mandatory peer review process for written examinations.

Kyle John Wilby, Maguy S El Hajj, Marwa El-Bashir, Fatima Mraiche.

Abstract

PROBLEM DESCRIPTION: Written assessments are essential components of higher education practice. However, faculty members encounter common pitfalls when designing questions intended to evaluate student learning outcomes. The objective of this project was to determine the impact of a mandatory examination peer review process on question accuracy, alignment with learning objectives, use of best practices in question design, and language/grammar.
QUALITY IMPROVEMENT METHODS: A mandatory peer review process was implemented for all midterm (before phase) and final (after phase) examinations. Each examination was reviewed by two peer reviewers following a pre-defined guidance document. Non-punitive feedback given to faculty members served as the intervention. Frequencies of flagged questions according to guidance categories were compared between phases.
RESULTS OF CQI INQUIRY: A total of 21 midterm and 21 final exam reviews were included in the analysis. In all, 637 questions were reviewed across the midterms and 1003 across the finals. Few questions were flagged for accuracy or alignment with learning outcomes. The median proportion of questions flagged for best practices was significantly lower for final exams than for midterm exams (15.8% for midterms vs. 6.45% for finals, p = 0.014). The intervention did not influence language and grammar errors (9.68% vs. 10.0% of questions flagged before and after, respectively, p = 0.305).
CONCLUSIONS: A non-punitive peer review process for written examinations can overcome pitfalls in exam creation and improve adherence to best practices in question writing. The peer review process flagged a substantial proportion of language/grammar errors, but the error rate did not differ between midterm and final exams.
Copyright © 2017 Elsevier Inc. All rights reserved.

Keywords:  Assessment; Examination; Peer-review; Pharmacy; Quality

Year:  2018        PMID: 29793702     DOI: 10.1016/j.cptl.2017.12.015

Source DB:  PubMed          Journal:  Curr Pharm Teach Learn        ISSN: 1877-1297



