
Student-written single-best answer questions predict performance in finals.

Jason Walsh, Benjamin Harris, Saadia Tayyaba, David Harris, Phil Smith.

Abstract

BACKGROUND: Single-best answer (SBA) questions are widely used for assessment in medical schools; however, clinical staff often have neither the time nor the incentive to develop high-quality material for revision purposes. A student-led approach to producing formative SBA questions offers a potential solution.
METHODS: Cardiff University School of Medicine students created a bank of SBA questions through a previously described staged approach involving student question-writing, peer review and targeted senior clinician input. We arranged the questions into discrete tests and posted these online. The performance of student volunteers from the 2012/13 cohort of final-year medical students on these tests was recorded and compared with their performance in medical school finals (knowledge examinations and objective structured clinical examinations, OSCEs). In addition, we compared the performance of students who participated in question-writing groups with that of the rest of the cohort on the summative SBA assessment.
RESULTS: Performance in the end-of-year summative clinical knowledge SBA paper correlated strongly with performance in the formative student-written SBA test (r ≈ 0.60, p < 0.01). There was no significant correlation between summative OSCE scores and formative student-written SBA test scores. Students who wrote and reviewed questions scored higher than average in the end-of-year summative clinical knowledge SBA paper.
CONCLUSION: Student-written SBAs predict performance in end-of-year SBA examinations, and therefore can provide a potentially valuable revision resource. There is potential for student-written questions to be incorporated into summative examinations.
© 2015 John Wiley & Sons Ltd.
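
The correlation reported in the RESULTS pairs each volunteer's score on the formative student-written tests with the same student's score on the summative clinical knowledge SBA paper. A minimal sketch of how such a Pearson correlation could be computed is given below; the score arrays are hypothetical illustrations, not data from the study, and this is not the authors' analysis code.

from scipy.stats import pearsonr

# Hypothetical paired scores (%) for ten volunteers; not data from the study.
formative_sba = [62, 71, 55, 80, 68, 74, 59, 66, 77, 63]  # student-written online tests
summative_sba = [65, 75, 58, 83, 70, 72, 61, 69, 81, 60]  # end-of-year clinical knowledge SBA paper

# Pearson correlation coefficient and two-tailed p-value.
r, p = pearsonr(formative_sba, summative_sba)
print(f"r = {r:.2f}, p = {p:.3g}")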

Mesh:

Year:  2015        PMID: 26450346     DOI: 10.1111/tct.12445

Source DB:  PubMed          Journal:  Clin Teach        ISSN: 1743-4971


  7 in total

1.  Admissions Is Not Enough: The Racial Achievement Gap in Medical Education.

Authors:  Alana C Jones; Alana C Nichols; Carmel M McNicholas; Fatima C Stanford
Journal:  Acad Med       Date:  2021-02-01       Impact factor: 7.840

2.  Question-Based Collaborative Learning for Constructive Curricular Alignment.

Authors:  Laura S Wynn-Lawrence; Laksha Bala; Rebekah J Fletcher; Rebecca K Wilson; Amir H Sam
Journal:  Adv Med Educ Pract       Date:  2021-01-05

3.  Does Developing Multiple-Choice Questions Improve Medical Students' Learning? A Systematic Review.

Authors:  Youness Touissi; Ghita Hjiej; Abderrazak Hajjioui; Azeddine Ibrahimi; Maryam Fourtassi
Journal:  Med Educ Online       Date:  2022-12

4.  Assessment as Learning in Medical Education: Feasibility and Perceived Impact of Student-Generated Formative Assessments.

Authors:  Farah Otaki; Ritu Lakhtakia; Laila Alsuwaidi; Nabil Zary
Journal:  JMIR Med Educ       Date:  2022-07-22

5.  Formative student-authored question bank: perceptions, question quality and association with summative performance.

Authors:  Jason L Walsh; Benjamin H L Harris; Paul Denny; Phil Smith
Journal:  Postgrad Med J       Date:  2017-09-02       Impact factor: 2.401

6.  Evaluation of an Intervention to Improve Quality of Single-best Answer Multiple-choice Questions.

Authors:  Kevin R Scott; Andrew M King; Molly K Estes; Lauren W Conlon; Jonathan S Jones; Andrew W Phillips
Journal:  West J Emerg Med       Date:  2018-12-03

7.  Training Medical Students to Create and Collaboratively Review Multiple-Choice Questions: A Comprehensive Workshop.

Authors:  Josh Kurtz; Beth Holman; Seetha U Monrad
Journal:  MedEdPORTAL       Date:  2020-10-06
