Literature DB >> 30314537

A psychometric analysis of a newly developed summative, multiple choice question assessment adapted from Canada to a Middle Eastern context.

Shane Ashley Pawluk1, Kieran Shah2, Rajwant Minhas3, Daniel Rainkie4, Kyle John Wilby5.   

Abstract

INTRODUCTION: Accreditation necessitates that assessment methods reflect the standards established by the accrediting body. The process of adapting assessments to a new context can present unique challenges with uncertainty around psychometric defensibility of the adapted exam.
METHODS: A psychometric analysis was conducted of a summative multiple-choice question (MCQ) assessment, adapted from Canada, for graduating pharmacy students from a Canadian-accredited program in Qatar. Rates of difficult items, item discrimination measured by the point-biserial correlation (rpb), and non-functioning distractors (NFDs) were calculated to identify deficiencies and challenges with the adapted exam. Challenges encountered throughout the adaptation process, along with recommendations, were documented.
RESULTS: Overall scores on the 90-item, four-option MCQ exam ranged from 46.7% to 78.9% (mean 61.9%). For difficulty, 17 items were answered correctly by fewer than 30% of students, while 29 items had unacceptable or poor discrimination (rpb < 0.1). NFDs occurred in 78 items, with 49 items containing at least two NFDs.
DISCUSSION AND CONCLUSIONS: The difficulty of the exam was deemed acceptable, yet discriminatory ability requires improvement. The high frequency of questions with NFDs suggests that faculty have difficulty developing plausible distractors for an adapted MCQ exam. This could be due to a lack of training or a requirement to include too many distractor options. While it is feasible to implement an assessment adapted from a different learning environment, measures need to be taken to improve psychometric defensibility. The high number of questions with NFDs indicates that the current method of exam development does not encourage the incorporation of functional distractors.
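The item statistics reported above can be illustrated with a minimal sketch. This is not the authors' analysis code: the response matrix, function names, and the 5% NFD threshold (a common convention in the MCQ literature, not stated in the abstract) are all assumptions for illustration.

```python
# Illustrative item analysis for a four-option MCQ exam (hypothetical data layout):
# `responses` is an examinee-by-item matrix of chosen options ('A'..'D'),
# `key` holds the correct option per item.

def item_difficulty(responses, key, item):
    """Proportion of examinees answering the item correctly (p-value)."""
    correct = sum(1 for r in responses if r[item] == key[item])
    return correct / len(responses)

def point_biserial(responses, key, item):
    """Correlation (rpb) between item correctness (0/1) and total score."""
    n = len(responses)
    scores = [sum(r[i] == key[i] for i in range(len(key))) for r in responses]
    flags = [1 if r[item] == key[item] else 0 for r in responses]
    mean_s = sum(scores) / n
    mean_f = sum(flags) / n
    cov = sum((s - mean_s) * (f - mean_f) for s, f in zip(scores, flags)) / n
    var_s = sum((s - mean_s) ** 2 for s in scores) / n
    var_f = sum((f - mean_f) ** 2 for f in flags) / n
    if var_s == 0 or var_f == 0:
        return 0.0  # undefined when all scores or all flags are identical
    return cov / (var_s ** 0.5 * var_f ** 0.5)

def nonfunctioning_distractors(responses, key, item, threshold=0.05):
    """Distractors selected by fewer than `threshold` of examinees (assumed 5% cutoff)."""
    n = len(responses)
    counts = {}
    for r in responses:
        counts[r[item]] = counts.get(r[item], 0) + 1
    return [opt for opt in "ABCD"
            if opt != key[item] and counts.get(opt, 0) / n < threshold]
```

Under this scheme, the paper's flags correspond to `item_difficulty(...) < 0.30` for difficult items and `point_biserial(...) < 0.1` for poorly discriminating items.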
Copyright © 2018 Elsevier Inc. All rights reserved.

Keywords:  Assessment; Multiple choice; Psychometric; Student performance; Test performance

Mesh:

Year:  2018        PMID: 30314537     DOI: 10.1016/j.cptl.2018.05.003

Source DB:  PubMed          Journal:  Curr Pharm Teach Learn        ISSN: 1877-1297


  1 in total

1.  Quality of multiple-choice questions in medical internship qualification examination determined by item response theory at Debre Tabor University, Ethiopia.

Authors:  Lalem Menber Belay; Tegbar Yigzaw Sendekie; Fantu Abebe Eyowas
Journal:  BMC Med Educ       Date:  2022-08-22       Impact factor: 3.263

