Literature DB >> 21038082

Pick-N multiple choice-exams: a comparison of scoring algorithms.

Daniel Bauer, Matthias Holzer, Veronika Kopp, Martin R Fischer.

Abstract

To compare different scoring algorithms for Pick-N multiple correct answer multiple-choice (MC) exams with regard to test reliability, student performance, total item discrimination and item difficulty.

Data from six end-of-term exams in internal medicine, taken by 3rd-year medical students at Munich University between 2005 and 2008, were analysed (1,255 students, 180 Pick-N items in total). Each question scored a maximum of one point. Three scoring algorithms were compared:

(a) Dichotomous scoring (DS): one point if all true and no wrong answers were chosen.
(b) Partial credit algorithm 1 (PS(50)): one point for identifying 100% of the true answers; 0.5 points for identifying 50% or more; zero points for less than 50%. No point deduction for wrong choices.
(c) Partial credit algorithm 2 (PS(1/m)): a fraction of one point for each correct answer identified, the fraction depending on the total number of true answers. No point deduction for wrong choices.

Partial crediting yielded psychometric results superior to dichotomous scoring (DS). The two partial-credit algorithms produced similar psychometric data, with PS(50) only slightly exceeding PS(1/m) in reliability coefficients. The Pick-N MC format, scored with the PS(50) or PS(1/m) algorithm, is suited to undergraduate medical examinations; partial knowledge should be awarded in Pick-N MC exams.
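The three algorithms described in the abstract can be sketched in a few lines of Python. This is a minimal illustration based only on the abstract's wording, assuming each item is represented by a set of true options and a set of options the examinee selected; in particular, it assumes (per the abstract) that wrong choices are simply ignored rather than penalised under the partial-credit schemes, and that the function names are illustrative, not from the paper.

```python
def score_ds(true_opts, chosen):
    """Dichotomous scoring (DS): one point only if all true answers
    and no wrong answers were chosen, i.e. an exact match."""
    return 1.0 if set(chosen) == set(true_opts) else 0.0

def score_ps50(true_opts, chosen):
    """PS(50): 1 point for identifying all true answers, 0.5 points for
    identifying at least half of them, 0 otherwise. Wrong choices are
    not penalised (per the abstract)."""
    hits = len(set(chosen) & set(true_opts))
    frac = hits / len(true_opts)
    if frac == 1.0:
        return 1.0
    if frac >= 0.5:
        return 0.5
    return 0.0

def score_ps_1m(true_opts, chosen):
    """PS(1/m): 1/m of a point per correct answer identified, where m is
    the total number of true answers. Wrong choices are not penalised."""
    hits = len(set(chosen) & set(true_opts))
    return hits / len(true_opts)
```

For example, on an item with true answers {A, B, C, D}, an examinee who marks only A and B scores 0 under DS, 0.5 under PS(50), and 0.5 under PS(1/m), which illustrates how the partial-credit schemes reward partial knowledge that dichotomous scoring discards.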


Year:  2010        PMID: 21038082     DOI: 10.1007/s10459-010-9256-1

Source DB:  PubMed          Journal:  Adv Health Sci Educ Theory Pract        ISSN: 1382-4996            Impact factor:   3.853


Similar articles (8 in total)

1.  Evaluating Different Scoring Methods for Multiple Response Items Providing Partial Credit.

Authors:  Joe Betts; William Muntean; Doyoung Kim; Shu-Chuan Kao
Journal:  Educ Psychol Meas       Date:  2021-02-22       Impact factor: 2.821

2.  National Board of Medical Examiners and Curriculum Change: What Do Scores Tell Us? A Case Study at the University of Balamand Medical School.

Authors:  Mode Al Ojaimi; Megan Khairallah; Rayya Younes; Sara Salloum; Ghania Zgheib
Journal:  J Med Educ Curric Dev       Date:  2020-07-24

3.  Standardizing assessment practices of undergraduate medical competencies across medical schools: challenges, opportunities and lessons learned from a consortium of medical schools in Uganda.

Authors:  Aloysius Gonzaga Mubuuke; Catherine Mwesigwa; Samuel Maling; Godfrey Rukundo; Mike Kagawa; David Lagoro Kitara; Sarah Kiguli
Journal:  Pan Afr Med J       Date:  2014-12-16

4.  Impact of different scoring algorithms applied to multiple-mark survey items on outcome assessment: an in-field study on health-related knowledge.

Authors:  A Domnich; D Panatto; L Arata; I Bevilacqua; L Apprato; R Gasparini; D Amicizia
Journal:  J Prev Med Hyg       Date:  2015

5.  How to assess? Perceptions and preferences of undergraduate medical students concerning traditional assessment methods.

Authors:  Anita Holzinger; Stefan Lettner; Verena Steiner-Hofbauer; Meskuere Capan Melser
Journal:  BMC Med Educ       Date:  2020-09-17       Impact factor: 2.463

6.  Assessing declarative and procedural knowledge using multiple-choice questions.

Authors:  Ahmed Abu-Zaid; Tehreem A Khan
Journal:  Med Educ Online       Date:  2013-05-22

7.  Assessment in undergraduate medical education: a review of course exams.

Authors:  Allison A Vanderbilt; Moshe Feldman; Isaac K Wood
Journal:  Med Educ Online       Date:  2013-03-06

8.  Effectiveness of longitudinal faculty development programs on MCQs items writing skills: A follow-up study.

Authors:  Hamza Mohammad Abdulghani; Mohammad Irshad; Shafiul Haque; Tauseef Ahmad; Kamran Sattar; Mahmoud Salah Khalil
Journal:  PLoS One       Date:  2017-10-10       Impact factor: 3.240

