
Exploring examinee behaviours as validity evidence for multiple-choice question examinations.

Luke T Surry, Dario Torre, Steven J Durning.

Abstract

CONTEXT: Clinical-vignette multiple choice question (MCQ) examinations are used widely in medical education. Standardised MCQ examinations are used by licensure and certification bodies to award credentials that are meant to assure stakeholders of the quality of physicians. Such uses rest on interpreting MCQ examination performance as giving meaningful information about the quality of clinical reasoning. Several assumptions are foundational to these interpretations and uses of standardised MCQ examinations. This study explores the implicit assumption that the cognitive processes elicited by clinical-vignette MCQ items resemble the processes thought to occur in 'real-world' clinical reasoning as theorised by dual-process theory.
METHODS: Fourteen participants (three medical students, five residents and six staff physicians) completed three sets of five timed MCQ items (total 15) from the Medical Knowledge Self-Assessment Program (MKSAP). Upon answering a set of MCQs, each participant completed a retrospective think aloud (TA) protocol. Using constant comparative analysis (CCA) methods sensitised by dual-process theory, we performed a qualitative thematic analysis.
RESULTS: Examinee behaviours fell into three categories: clinical reasoning behaviours, test-taking behaviours and reactions to the MCQ. Consistent with dual-process theory, statements about clinical reasoning behaviours were divided into two sub-categories: analytical reasoning and non-analytical reasoning. Each of these categories included several themes.
CONCLUSIONS: Our study provides some validity evidence that test-takers' descriptions of their cognitive processes during completion of high-quality clinical-vignette MCQs align with processes expected in real-world clinical reasoning. This supports one of the assumptions important for interpretations of MCQ examination scores as meaningful measures of clinical reasoning. Our observations also suggest that MCQs elicit other cognitive processes, including certain test-taking behaviours, that seem 'inauthentic' to real-world clinical reasoning. Further research is needed to explore if similar themes arise in other contexts (e.g. simulated patient encounters) and how observed behaviours relate to performance on MCQ-based assessments. Published 2017. This article is a U.S. Government work and is in the public domain in the USA.


Year: 2017    PMID: 28758233    DOI: 10.1111/medu.13367

Source DB: PubMed    Journal: Med Educ    ISSN: 0308-0110    Impact factor: 6.251


  10 in total

1.  A review of web-based application of online learning in pathology and laboratory medicine.

Authors:  Cullen D Smith; Neel Atawala; Carolyn A Klatt; Edward C Klatt
Journal:  J Pathol Inform       Date:  2022-08-09

2.  Examining Bloom's Taxonomy in Multiple Choice Questions: Students' Approach to Questions.

Authors:  J K Stringer; Sally A Santen; Eun Lee; Meagan Rawls; Jean Bailey; Alicia Richards; Robert A Perera; Diane Biskobing
Journal:  Med Sci Educ       Date:  2021-05-25

3.  A mixed-methods exploration of cognitive dispositions to respond and clinical reasoning errors with multiple choice questions.

Authors:  Luke T Surry; Dario Torre; Robert L Trowbridge; Steven J Durning
Journal:  BMC Med Educ       Date:  2018-11-23       Impact factor: 2.463

4.  Clinical Reasoning in the Primary Care Setting: Two Scenario-Based Simulations for Residents and Attendings.

Authors:  Alexis Battista; Abigail Konopasky; Divya Ramani; Megan Ohmer; Jeffrey Mikita; Anna Howle; Sarah Krajnik; Dario Torre; Steven J Durning
Journal:  MedEdPORTAL       Date:  2018-11-16

5.  Comparing single-best-answer and very-short-answer questions for the assessment of applied medical knowledge in 20 UK medical schools: Cross-sectional study.

Authors:  Amir H Sam; Rachel Westacott; Mark Gurnell; Rebecca Wilson; Karim Meeran; Celia Brown
Journal:  BMJ Open       Date:  2019-09-26       Impact factor: 2.692

6.  Very Short Answer Questions: A Novel Approach To Summative Assessments In Pathology.

Authors:  Amir H Sam; Emilia Peleva; Chee Yeen Fung; Nicki Cohen; Emyr W Benbow; Karim Meeran
Journal:  Adv Med Educ Pract       Date:  2019-11-04

7.  Does burnout affect clinical reasoning? An observational study among residents in general practice.

Authors:  Philippe Guillou; Thierry Pelaccia; Marie-Frédérique Bacqué; Mathieu Lorenzo
Journal:  BMC Med Educ       Date:  2021-01-07       Impact factor: 2.463

8.  Providing a model for validation of the assessment system of internal medicine residents based on Kane's framework.

Authors:  Mostafa Dehghani Poudeh; Aeen Mohammadi; Rita Mojtahedzadeh; Nikoo Yamani; Ali Delavar
Journal:  J Educ Health Promot       Date:  2021-10-29

9.  Using very short answer errors to guide teaching.

Authors:  Oliver Putt; Rachel Westacott; Amir H Sam; Mark Gurnell; Celia A Brown
Journal:  Clin Teach       Date:  2022-01-25

10.  Transforming MRCPsych theory examinations: digitisation and very short answer questions (VSAQs).

Authors:  Karl Scheeres; Niruj Agrawal; Stephanie Ewen; Ian Hall
Journal:  BJPsych Bull       Date:  2022-02

Beijing Coyote Bioscience Co., Ltd. (北京卡尤迪生物科技股份有限公司) © 2022-2023.