
A comparison of multiple-choice tests and free-response tests in examinations of clinical competence.

D I Newble, A Baxter, R G Elmslie.   

Abstract

This paper reports a study comparing the performance of different groups of students and doctors on identical and equivalent tests set in an objective-type format and in a free-response format. The tests were designed to ensure that the content was relevant to clinical practice at the hospital intern level. In all test situations, candidates' scores were significantly higher on the objective tests than on the free-response tests, and this difference was greater for the more junior and less competent students than for the more competent doctors. The cueing effect of the options was thought to be the main factor responsible for the difference in performance. A questionnaire survey showed that students were aware of the deficiencies of multiple-choice tests, and a large majority believed that the free-response tests gave a more accurate assessment of their clinical ability. In these tests, aimed at measuring aspects of clinical competence, multiple-choice questions appeared to overestimate candidates' ability to an extent that made them less suitable than free-response questions for this purpose, while free-response tests of the type used in this study provided a suitable alternative for the written section of clinical examinations. It was concluded that the written component of the final examination in the medical course should contain a preponderance of free-response items over multiple-choice items.

Year:  1979        PMID: 470647     DOI: 10.1111/j.1365-2923.1979.tb01511.x

Source DB:  PubMed          Journal:  Med Educ        ISSN: 0308-0110            Impact factor:   6.251


Related articles:  7 in total

1.  Organization of a Comprehensive Anesthesiology Oral Practice Examination Program: Planning, Structure, Startup, Administration, Growth and Evaluation.

Authors:  A Schubert; J Tetzlaff; M Licina; E Mascha; M P Smith
Journal:  J Educ Perioper Med       Date:  1999-05-01

2.  The assessment of professional competence: Developments, research and practical implications.

Authors:  C P Van Der Vleuten
Journal:  Adv Health Sci Educ Theory Pract       Date:  1996-01       Impact factor: 3.853

3.  The impact of two multiple-choice question formats on the problem-solving strategies used by novices and experts.

Authors:  Sylvain P Coderre; Peter Harasym; Henry Mandin; Gordon Fick
Journal:  BMC Med Educ       Date:  2004-11-05       Impact factor: 2.463

4.  The impact of item flaws, testing at low cognitive level, and low distractor functioning on multiple-choice question quality.

Authors:  Syed Haris Ali; Kenneth G Ruit
Journal:  Perspect Med Educ       Date:  2015-10

5.  How to teach psychiatry to medical undergraduates in India?: a model.

Authors:  S M Manohari; Pradeep R Johnson; Ravindra Baburao Galgali
Journal:  Indian J Psychol Med       Date:  2013-01

6.  Impact of different scoring algorithms applied to multiple-mark survey items on outcome assessment: an in-field study on health-related knowledge.

Authors:  A Domnich; D Panatto; L Arata; I Bevilacqua; L Apprato; R Gasparini; D Amicizia
Journal:  J Prev Med Hyg       Date:  2015

7.  Does the Concept of the "Flipped Classroom" Extend to the Emergency Medicine Clinical Clerkship?

Authors:  Corey Heitz; Melanie Prusakowski; George Willis; Christopher Franck
Journal:  West J Emerg Med       Date:  2015-10-22
