
Poorly performing physicians: does the Script Concordance Test detect bad clinical reasoning?

François Goulet, André Jacques, Robert Gagnon, Bernard Charlin, Abdo Shabah

Abstract

INTRODUCTION: Evaluation of poorly performing physicians is a worldwide concern for licensing bodies. The Collège des Médecins du Québec currently assesses the clinical competence of physicians previously identified as having potential clinical competence difficulties through a day-long procedure called the Structured Oral Interview (SOI). Two peer physicians produce a qualitative report. In view of remediation activities and the potential for legal consequences, more information on the clinical reasoning process (CRP), together with quantitative data on the quality of that process, is needed. This study examines the Script Concordance Test (SCT), a tool that provides a standardized and objective measure of a specific dimension of the CRP, clinical data interpretation (CDI), to determine whether it could be useful in that endeavor.
METHODS: Over a 2-year period, 20 family physicians took, in addition to the SOI, a 1-hour paper-and-pencil SCT. Three evaluators, blinded to the purpose of the experiment, retrospectively reviewed the SOI reports and were asked to estimate clinical reasoning quality. Subjects were classified into 2 groups (below or above the median of the score distribution) for each of the 2 assessment methods. Agreement between the classifications was estimated with the Kappa coefficient.
RESULTS: The intraclass correlation for the SOI was 0.89. The Cronbach alpha coefficient for the SCT was 0.90. Agreement between the methods was found for 13 participants (Kappa: 0.30, P = 0.18), but 7 of the 20 participants were classified differently by the two methods. All participants but 1 had SCT scores more than 2 SD below the panel mean, indicating serious deficiencies in CDI.
DISCUSSION: The finding that most of the referred group did so poorly on CDI tasks is of great interest for assessment as well as for remediation. In remediation of prescribing skills, adding the SCT to the SOI is useful for assessing cognitive reasoning in poorly performing physicians. The Structured Oral Interview should be improved with more precise reporting by those who assess the clinical reasoning process of examinees, and caution is recommended in interpreting SCT scores: they reflect only a part of the reasoning process.
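As an illustration of the agreement statistic reported above, Cohen's Kappa for two binary classifications can be computed from a 2x2 agreement table. The cell counts below are hypothetical (the paper does not publish the table); they are chosen only so that 13 of 20 classifications agree, which happens to reproduce the reported Kappa of 0.30 under these assumed marginals.

```python
def cohens_kappa(table):
    """Cohen's kappa for a square agreement table (rows: method 1, cols: method 2)."""
    n = sum(sum(row) for row in table)
    # Observed agreement: proportion of cases on the diagonal
    po = sum(table[i][i] for i in range(len(table))) / n
    # Expected agreement by chance, from the row and column marginals
    pe = sum(
        (sum(table[i]) / n) * (sum(row[i] for row in table) / n)
        for i in range(len(table))
    )
    return (po - pe) / (1 - pe)

# Hypothetical SOI-vs-SCT table: 13 agreements (7 + 6), 7 disagreements (3 + 4)
table = [[7, 3], [4, 6]]
print(round(cohens_kappa(table), 2))  # 0.3
```

With a chance-expected agreement of 0.50 under these marginals, the observed 65% agreement yields only a modest Kappa, which is why 13/20 matching classifications can still correspond to Kappa = 0.30.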


Year:  2010        PMID: 20872770     DOI: 10.1002/chp.20076

Source DB:  PubMed          Journal:  J Contin Educ Health Prof        ISSN: 0894-1912            Impact factor:   1.355


Related articles:  5 in total

1.  The interrater reliability of an objective structured practical examination in measuring the clinical reasoning ability of chiropractic students.

Authors:  Kevin A Rose; Jesika Babajanian
Journal:  J Chiropr Educ       Date:  2016-04-26

2.  Case-Based Teaching: Does the Addition of High-Fidelity Simulation Make a Difference in Medical Students' Clinical Reasoning Skills?

Authors:  Mary Kathryn Mutter; James R Martindale; Neeral Shah; Maryellen E Gusic; Stephen J Wolf
Journal:  Med Sci Educ       Date:  2020-01-10

3.  [Review] Clinical reasoning assessment through medical expertise theories: past, present and future directions.

Authors:  Elham Boushehri; Kamran Soltani Arabshahi; Alireza Monajemi
Journal:  Med J Islam Repub Iran       Date:  2015-06-15

4.  Preferred question types for computer-based assessment of clinical reasoning: a literature study.

Authors:  Lisette van Bruggen; Margreet Manrique-van Woudenbergh; Emely Spierenburg; Jacqueline Vos
Journal:  Perspect Med Educ       Date:  2012-10-02

5.  Effects of two different instructional formats on scores and reliability of a script concordance test.

Authors:  W E Sjoukje van den Broek; Marianne V van Asperen; Eugène Custers; Gerlof D Valk; Olle Th J Ten Cate
Journal:  Perspect Med Educ       Date:  2012-08-21
