
Comparison of an aggregate scoring method with a consensus scoring method in a measure of clinical reasoning capacity.

Bernard Charlin, Martin Desaulniers, Robert Gagnon, Daniel Blouin, Cees van der Vleuten.

Abstract

BACKGROUND: It is well known that experts' clinical reasoning paths are diverse. Nevertheless, in written clinical reasoning assessment, the common practice is to ask experts to reach a consensus on each item and to score students against a single "good answer." PURPOSE: To explore the effects of taking the variability of experts' answers into account in a method of clinical reasoning assessment based on authentic tasks: the Script Concordance Test.
METHODS: Two different methods were used to build answer keys. The first incorporated the variability within a group of experts (criterion experts) through an aggregate scoring method. The second used the consensus answer obtained from the group of criterion experts for each item. Scores obtained with the two methods by students and by another group of experts (tested experts) were compared. The domain of assessment was obstetric-gynecologic clinical knowledge. The sample consisted of 150 clerkship students and seven other experts (tested experts).
RESULTS: In a context of authentic tasks, experts' answers on items varied substantially. Notably, 59% of the answers given individually by criterion group experts differed from the answer they provided when asked, as a group, to supply the "good answer" required of students. The aggregate scoring method showed several advantages and was more sensitive in detecting expertise.
CONCLUSIONS: The findings suggest that, in the assessment of complex performance in ill-defined situations, the usual practice of asking experts to reach a consensus on each item discards legitimate variability and hinders the detection of expertise. If these results are confirmed by further research, this practice should be reconsidered.
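The two answer-key constructions compared in the abstract can be illustrated with a minimal sketch. This is a hedged illustration with hypothetical panel data, not the authors' actual scoring code: it assumes the standard Script Concordance Test convention in which an answer's partial credit is its frequency among criterion experts divided by the modal frequency, while the consensus method gives all-or-nothing credit for the single agreed "good answer."

```python
from collections import Counter

def aggregate_key(expert_answers):
    """Partial-credit key: each answer scores its expert frequency
    divided by the modal frequency (most popular answer = 1.0)."""
    counts = Counter(expert_answers)
    modal = max(counts.values())
    return {ans: n / modal for ans, n in counts.items()}

def consensus_key(consensus_answer):
    """All-or-nothing key: only the consensus answer earns credit."""
    return {consensus_answer: 1.0}

def score(key, answer):
    return key.get(answer, 0.0)

# Hypothetical panel of 10 criterion experts answering one item on the
# -2..+2 Likert scale typically used in Script Concordance Tests.
panel = [+1, +1, +1, +1, +1, 0, 0, 0, +2, +2]

agg = aggregate_key(panel)   # {+1: 1.0, 0: 0.6, +2: 0.4}
con = consensus_key(+1)      # {+1: 1.0}

print(score(agg, 0))  # 0.6 -> a minority expert view earns partial credit
print(score(con, 0))  # 0.0 -> the same answer earns nothing under consensus
```

Under the aggregate key, a tested expert who sides with a legitimate minority of the panel still earns partial credit, which is consistent with the paper's finding that aggregate scoring is more sensitive in detecting expertise.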

Year:  2002        PMID: 12189634     DOI: 10.1207/S15328015TLM1403_3

Source DB:  PubMed          Journal:  Teach Learn Med        ISSN: 1040-1334            Impact factor:   2.414


Related articles: 11 in total

1.  Brief report: beyond clinical experience: features of data collection and interpretation that contribute to diagnostic accuracy.

Authors:  Mathieu R Nendaz; Anne M Gut; Arnaud Perrier; Martine Louis-Simonet; Katherine Blondon-Choa; François R Herrmann; Alain F Junod; Nu V Vu
Journal:  J Gen Intern Med       Date:  2006-12       Impact factor: 5.128

2.  Online clinical reasoning assessment with the Script Concordance test: a feasibility study.

Authors:  Louis Sibert; Stefan J Darmoni; Badisse Dahamna; Jacques Weber; Bernard Charlin
Journal:  BMC Med Inform Decis Mak       Date:  2005-06-20       Impact factor: 2.796

3.  Evaluation of clinical reasoning in basic emergencies using a script concordance test.

Authors:  Caroline Boulouffe; Bernard Charlin; Dominique Vanpee
Journal:  Am J Pharm Educ       Date:  2010-12-15       Impact factor: 2.047

4.  On line clinical reasoning assessment with Script Concordance test in urology: results of a French pilot study.

Authors:  Louis Sibert; Stefan J Darmoni; Badisse Dahamna; Marie-France Hellot; Jacques Weber; Bernard Charlin
Journal:  BMC Med Educ       Date:  2006-08-28       Impact factor: 2.463

5.  Developing a viva exam to assess clinical reasoning in pre-registration osteopathy students.

Authors:  Paul Orrock; Sandra Grace; Brett Vaughan; Rosanne Coutts
Journal:  BMC Med Educ       Date:  2014-09-19       Impact factor: 2.463

6.  Validation of undergraduate medical student script concordance test (SCT) scores on the clinical assessment of the acute abdomen.

Authors:  Matthias Goos; Fabian Schubach; Gabriel Seifert; Martin Boeker
Journal:  BMC Surg       Date:  2016-08-17       Impact factor: 2.102

7.  Virtual patients in the acquisition of clinical reasoning skills: does presentation mode matter? A quasi-randomized controlled trial.

Authors:  Fabian Schubach; Matthias Goos; Götz Fabry; Werner Vach; Martin Boeker
Journal:  BMC Med Educ       Date:  2017-09-15       Impact factor: 2.463

8.  Constructing a question bank based on script concordance approach as a novel assessment methodology in surgical education.

Authors:  Salah A Aldekhayel; Nahar A Alselaim; Mohi Eldin Magzoub; Mohammad M Al-Qattan; Abdullah M Al-Namlah; Hani Tamim; Abdullah Al-Khayal; Sultan I Al-Habdan; Mohammed F Zamakhshary
Journal:  BMC Med Educ       Date:  2012-10-24       Impact factor: 2.463

9.  Understanding clinical reasoning in osteopathy: a qualitative research approach.

Authors:  Sandra Grace; Paul Orrock; Brett Vaughan; Raymond Blaich; Rosanne Coutts
Journal:  Chiropr Man Therap       Date:  2016-03-08

10.  Criterion scores, construct validity and reliability of a web-based instrument to assess physiotherapists' clinical reasoning focused on behaviour change: 'Reasoning 4 Change'.

Authors:  Maria Elvén; Jacek Hochwälder; Elizabeth Dean; Olle Hällman; Anne Söderlund
Journal:  AIMS Public Health       Date:  2018-07-06
