
A closer look at cueing effects in multiple-choice questions.

L W Schuwirth, C P van der Vleuten, H H Donkers.

Abstract

This study investigates cueing effects in multiple-choice questions. Two parallel tests with matching content were administered: by means of a computer program, examinees of different training levels and professional expertise were presented with the same set of 35 cases (derived from patient problems in general practice) twice, first linked to open-ended questions and then linked to multiple-choice questions. The examinees were 75 medical students from three different years of training, 25 residents in training for general practice and 25 experienced general practitioners. Across groups, total test scores showed a difference in mean scores between the two formats and a high inter-test correlation. Within each level of expertise, differences in mean scores and high correlations were also found. The data were further explored per group of examinees. Two types of cueing effect were found: positive cueing (examinees were cued towards the correct answer) and negative cueing (examinees were cued towards an incorrect answer). These effects occurred at all levels of expertise and in almost all items; however, both declined with increasing expertise. Positive cueing mainly occurred in difficult items, whereas negative cueing mainly occurred in easy items.


Year:  1996        PMID: 8736188     DOI: 10.1111/j.1365-2923.1996.tb00716.x

Source DB:  PubMed          Journal:  Med Educ        ISSN: 0308-0110            Impact factor:   6.251


Related articles:  12 in total

1.  Successful use of a competency step exam in a perfusion education program.

Authors:  Jeffrey B Riley; Philip D Beckley; Richard D Tallman; Allison S Spiwak
Journal:  J Extra Corpor Technol       Date:  2006-03

2.  Programmatic assessment of competency-based workplace learning: when theory meets practice.

Authors:  Harold G J Bok; Pim W Teunissen; Robert P Favier; Nancy J Rietbroek; Lars F H Theyse; Harold Brommer; Jan C M Haarhuis; Peter van Beukelen; Cees P M van der Vleuten; Debbie A D C Jaarsma
Journal:  BMC Med Educ       Date:  2013-09-11       Impact factor: 2.463

3.  Characterization of medical students recall of factual knowledge using learning objects and repeated testing in a novel e-learning system.

Authors:  Tiago Taveira-Gomes; Rui Prado-Costa; Milton Severo; Maria Amélia Ferreira
Journal:  BMC Med Educ       Date:  2015-01-24       Impact factor: 2.463

4.  Comparing single-best-answer and very-short-answer questions for the assessment of applied medical knowledge in 20 UK medical schools: Cross-sectional study.

Authors:  Amir H Sam; Rachel Westacott; Mark Gurnell; Rebecca Wilson; Karim Meeran; Celia Brown
Journal:  BMJ Open       Date:  2019-09-26       Impact factor: 2.692

5.  Do you hear what you see? Utilizing phonocardiography to enhance proficiency in cardiac auscultation.

Authors:  Bjorn Watsjold; Jonathan Ilgen; Sandra Monteiro; Matthew Sibbald; Zachary D Goldberger; W Reid Thompson; Geoff Norman
Journal:  Perspect Med Educ       Date:  2021-01-12

6.  Assessment of higher order cognitive skills in undergraduate education: modified essay or multiple choice questions? Research paper.

Authors:  Edward J Palmer; Peter G Devitt
Journal:  BMC Med Educ       Date:  2007-11-28       Impact factor: 2.463

7.  Technical flaws in multiple-choice questions in the access exam to medical specialties ("examen MIR") in Spain (2009-2013).

Authors:  María Cristina Rodríguez-Díez; Manuel Alegre; Nieves Díez; Leire Arbea; Marta Ferrer
Journal:  BMC Med Educ       Date:  2016-02-03       Impact factor: 2.463

8.  Is a picture worth a thousand words: an analysis of the difficulty and discrimination parameters of illustrated vs. text-alone vignettes in histology multiple choice questions.

Authors:  Jane Holland; Robin O'Sullivan; Richard Arnett
Journal:  BMC Med Educ       Date:  2015-10-26       Impact factor: 2.463

9.  Validation and perception of a key feature problem examination in neurology.

Authors:  Meike Grumer; Peter Brüstle; Johann Lambeck; Silke Biller; Jochen Brich
Journal:  PLoS One       Date:  2019-10-18       Impact factor: 3.240

10.  Do different response formats affect how test takers approach a clinical reasoning task? An experimental study on antecedents of diagnostic accuracy using a constructed response and a selected response format.

Authors:  Stefan K Schauber; Stefanie C Hautz; Juliane E Kämmer; Fabian Stroben; Wolf E Hautz
Journal:  Adv Health Sci Educ Theory Pract       Date:  2021-05-11       Impact factor: 3.853

