Literature DB >> 25647834

Does Objective Structured Clinical Examinations Score Reflect the Clinical Reasoning Ability of Medical Students?

Wan Beom Park, Seok Hoon Kang, Yoon-Seong Lee, Sun Jung Myung.

Abstract

BACKGROUND: Clinical reasoning ability is an important factor in a physician's competence and thus should be taught and tested in medical schools. Medical schools generally use objective structured clinical examinations (OSCE) to measure the clinical competency of medical students. However, it is unknown whether OSCE can also evaluate clinical reasoning ability. In this study, the authors investigated whether OSCE scores reflected students' clinical reasoning abilities.
METHODS: Sixty-five fourth-year medical students participated in this study. Medical students completed the OSCE with 4 cases using standardized patients. For assessment of clinical reasoning, students were asked to list differential diagnoses and the findings that were compatible or not compatible with each diagnosis. The OSCE score (score of patient encounter), diagnostic accuracy score, clinical reasoning score, clinical knowledge score and grade point average (GPA) were obtained for each student, and correlation analysis was performed.
RESULTS: Clinical reasoning score was significantly correlated with diagnostic accuracy and GPA (correlation coefficient = 0.258 and 0.380; P = 0.038 and 0.002, respectively) but not with OSCE score or clinical knowledge score (correlation coefficient = 0.137 and 0.242; P = 0.276 and 0.052, respectively). Total OSCE score was not significantly correlated with clinical knowledge test score, clinical reasoning score, diagnostic accuracy score or GPA.
CONCLUSIONS: OSCE score from patient encounters did not reflect the clinical reasoning abilities of the medical students in this study. The evaluation of medical students' clinical reasoning abilities through OSCE should be strengthened.


Year:  2015        PMID: 25647834      PMCID: PMC4495861          DOI: 10.1097/MAJ.0000000000000420

Source DB:  PubMed          Journal:  Am J Med Sci        ISSN: 0002-9629            Impact factor:   2.378


The objective structured clinical examination (OSCE) is a useful tool for clinical performance assessment and is used worldwide as part of medical licensing examinations, for example, in the U.S. Medical Licensing Examination Step 2 Clinical Skills section. Mounting evidence validating the OSCE has led to increasingly widespread use of this tool to measure the clinical competency of medical students.1–3 In 2009, this clinical skills examination was introduced in Korea as an independent examination to be performed as part of the Korean Medical Licensing Examination. During an OSCE, the examinee takes a history from and performs a physical examination on a standardized patient (SP) and documents the relevant findings, differential diagnosis and plan of action in a structured patient note. The OSCE score usually comprises the following evaluation categories: history taking (30%–40%), physical examination (20%–40%), patient education (0%–10%), physician-patient interaction (20%–40%) and patient note (∼5%). Clinical reasoning ability is regarded as an important factor determining a physician's competency and thus should be taught and tested in medical schools. The OSCE is a useful tool for assessing clinical performance, but it remains unknown whether the OSCE score from a patient encounter reflects a student's clinical reasoning ability. Few studies have investigated the efficacy of an OSCE for evaluating medical students' clinical reasoning ability.4–6 Therefore, the authors conducted an analysis to determine whether a high OSCE score from a patient encounter was significantly correlated with good clinical reasoning ability.

METHODS

In March 2011, 65 fourth-year students at Seoul National University College of Medicine (Seoul, South Korea) voluntarily participated in this study. The institutional review board approved this study and waived the requirement for written consent. The OSCE consisted of 4 stations presenting the following clinical cases: IgA nephropathy, neurogenic diabetes insipidus, acute pyelonephritis and exercise-induced asthma. Each station involved a 10-minute student-SP encounter, followed by a 5-minute interstation examination. Performances were evaluated by trained SPs using a checklist. The major evaluation components were overall assessment, history taking, physical examination, physician's manner, patient education and physician-patient interaction. After encountering patients, the students received an answer sheet presenting a table designed to evaluate the students' clinical reasoning process. During the 5-minute interstation examination, students were asked to complete the table with the differential diagnoses and the symptoms or signs that were compatible with or differed from each diagnosis. Then, the patient note was independently rated by 2 physician raters, who assessed the table and counted the number of correct findings within each section. Each correct finding counted as 1 point, and the student's clinical reasoning score was calculated as the total sum of points. Diagnostic accuracy score was calculated as the total number of correct diagnoses among 4 cases. This patient note form was considered to have content validity because its components were consistent with the literature regarding clinical reasoning skills7–11 and were chosen based on what physicians write on patient notes in clinical practice. The diagnostic accuracy score was literally a numeric index of diagnostic correctness. 
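The scoring rules above can be sketched as follows. This is a minimal Python illustration; the function names and input layout are assumptions for clarity, not the authors' actual scoring tooling.

```python
# Sketch of the two scores described above (names and inputs are assumed):
# - clinical reasoning score: 1 point per correct compatible/incompatible
#   finding on the interstation table, summed over all cases
# - diagnostic accuracy score: number of cases (4 in this study) with a
#   correct diagnosis

def clinical_reasoning_score(correct_findings_per_case):
    """Total points across all cases, 1 point per correct finding."""
    return sum(correct_findings_per_case)

def diagnostic_accuracy_score(listed_diagnoses, answer_key):
    """Count of cases where the student's diagnosis matches the key."""
    return sum(1 for given, correct in zip(listed_diagnoses, answer_key)
               if given == correct)
```

For example, a student with 3, 2, 4 and 1 correct findings across the 4 cases would receive 10 clinical reasoning points.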
For each student, the authors recorded the OSCE scores from SP encounters, including history taking, physical examination and patient-physician interaction, as well as the diagnostic accuracy scores from the 4 cases. The authors also obtained the students' demographic characteristics, grade point average (GPA) and clinical knowledge test score. The clinical knowledge test investigated clinical knowledge (eg, internal medicine, surgery, psychiatry) using the same format as the Korean Medical Licensing Examination, comprising a 6-section 400-multiple-choice question examination. These collected data were subjected to correlation analyses using Pearson's correlation coefficient. Statistical analysis was performed using SPSS software (version 19.0; SPSS Inc, Chicago, IL). A P value of <0.05 was considered statistically significant.
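The correlation analysis described above can be sketched as follows. The authors used SPSS 19.0; `scipy.stats.pearsonr` is used here purely for illustration, and the sample data are made up.

```python
# Hedged sketch of the study's analysis: Pearson correlation between two
# paired score lists, with significance judged at the study's P < 0.05
# threshold. (The authors used SPSS; scipy is an assumption here.)
from scipy.stats import pearsonr

def correlate(scores_a, scores_b, alpha=0.05):
    """Return (r, p, significant) for two paired score lists."""
    r, p = pearsonr(scores_a, scores_b)
    return r, p, p < alpha

# Illustrative (made-up) paired scores for seven students:
reasoning_scores = [12, 15, 9, 20, 17, 11, 14]
gpas = [3.1, 3.6, 2.8, 3.9, 3.7, 3.0, 3.4]
r, p, significant = correlate(reasoning_scores, gpas)
```

Each pair of measures in the study (e.g., clinical reasoning score vs. GPA, OSCE score vs. diagnostic accuracy) would be run through such a pairwise test.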

RESULTS

A total of 65 fourth-year students underwent OSCE examination. Table 1 presents the students' demographic data and examination scores, including GPA and OSCE. Clinical reasoning score was not statistically significantly correlated with OSCE score or clinical knowledge test score (correlation coefficient = 0.137 and 0.091; P = 0.276 and 0.472, respectively) but was significantly correlated with GPA and diagnostic accuracy score (correlation coefficient = 0.380 and 0.258; P = 0.002 and 0.038, respectively) (Table 2). The total OSCE score was not significantly correlated with clinical knowledge test score, clinical reasoning score, diagnostic accuracy score or GPA. Among the components of the OSCE score, neither history taking score nor physical examination score was correlated with clinical reasoning score (correlation coefficient = 0.199 and 0.045; P = 0.112 and 0.722, respectively) (Table 2).
TABLE 1

Characteristics and examination scores of 65 medical students

TABLE 2

Pearson correlations between clinical reasoning score, GPA, clinical knowledge test score, diagnostic accuracy score and each component of OSCE score


DISCUSSION

Clinical reasoning skills may help students to better focus on the efficient history taking and physical examinations that are required for making a correct diagnosis.12 Appropriate clinical reasoning is more likely to result in appropriate history taking and physical examination, which comprise a large percentage of OSCE checklists. However, the present results showed that clinical reasoning score was not correlated with OSCE score, meaning that the OSCE (especially with a checklist scoring system) could not differentiate students who asked appropriate history questions based on sound clinical reasoning from those who asked history questions with insufficient clinical reasoning. This further suggests that some students could receive a high OSCE score simply by asking and checking memorized items without adequate reasoning. Additionally, the authors found that diagnostic accuracy was positively correlated only with clinical reasoning score and not with OSCE score or clinical knowledge score. This suggests that clinical information obtained from the patient may not be the only factor that raises the probability of a correct diagnosis and that clinical reasoning may be more vital for correct diagnosis than the amount of clinical information. Some students who gained limited clinical information (low OSCE score) during an encounter still made the right diagnosis through good clinical reasoning ability. The authors also found that GPA was significantly correlated with clinical knowledge score and clinical reasoning score. GPA is regarded as a global indicator of a student's performance across the spectrum.13,14 The results showed that GPA was positively correlated with clinical reasoning ability (correlation coefficient = 0.380; P = 0.002).
The clinical knowledge test was originally designed to assess problem solving and clinical decision-making abilities; however, here the authors found that the clinical knowledge test score was not correlated with the diagnostic accuracy score or the clinical reasoning score. In contrast to the SP encounters during the OSCE, which required students to actively gather the clinical information needed for making an appropriate clinical decision, the clinical knowledge test was a paper examination in which students were passively given all clinical information in a paragraph so that they could interpret laboratory or radiological data and make a diagnostic or therapeutic assumption. The practice situation presented in the OSCE is closer to real-world clinical reasoning than the paper examination, as physicians can obtain clinical information only by asking the right questions. Overall, the data underline that the evaluation of clinical competency should include not only clinical knowledge and clinical performance but also clinical reasoning ability, which was not adequately reflected by the results of the presently analyzed methods. Many efforts have been made to develop a valid and reliable measure of clinical reasoning ability. These have included the use of patient management problems, modified essay questions, script concordance tests and other methods.15–18 Although clinical reasoning must be assessed in educational programs and certification processes, no study to date has uncovered a single best tool. In this study, the authors tried to assess clinical reasoning ability by simply modifying the existing OSCE examination. There may be some shortcomings to performing the OSCE examination using SPs as evaluators.
It is possible that OSCE score and clinical reasoning score would be positively correlated if the authors evaluated the students' clinical performances using a global rating based on holistic traits (eg, clinical data interpretation, thought process and logic) instead of a checklist system for each item of history taking or physical examination.19 However, performance evaluation by trained SPs using a checklist is regarded as a reliable method and has been verified by many previous reports.20–22 For more precise evaluation of a student's clinical performance, a physician's observation of a student-patient encounter for a sufficient duration would be ideal. However, this could be very difficult to accomplish, especially when evaluating a large number of students. Thus, many medical schools and license examinations use checklists to evaluate clinical performance. The interstation examination is already used to evaluate a student's ability to interpret the information obtained. However, simply entering an assessment and plan was insufficient for evaluating clinical reasoning ability. The present results indicated that the checklist system for clinical performance in the OSCE was very limited in its reflection of clinical reasoning ability. Therefore, the authors created a new patient note form for the interstation examination. Although global rating by experts is regarded as the "gold standard" for clinical reasoning assessment,23 here the authors used analytic scoring to evaluate clinical reasoning ability. Compared with global rating, analytic scoring is known to be an effective method of giving feedback and to have greater reliability.4 Furthermore, because the analytic score was rated by physicians, the scoring system to evaluate clinical reasoning might be more reliable than other analytic scoring systems. This study has several limitations. First, this study was performed in a single institution and only included 65 fourth-year students.
Therefore, these results may not be generalizable to other institutions, which might have different clinical clerkship programs and student evaluation systems. Second, the authors tested just 4 cases, which may not be enough to widely generalize these conclusions. Third, the OSCE score used for the present analysis was only based on the patient encounter and did not include the score for the patient note during the interstation examination. It is possible that a different format or proportion of interstation examination could have changed the influence of clinical reasoning ability on OSCE score. Finally, the study analyzed OSCE examination data from early in the fourth year. Students' OSCE scores are not usually consistent throughout the year, and the timing of the OSCE might influence the data. Although a patient note focused on assessment and planning could be considered to reflect clinical reasoning ability, the data showed that the correlation coefficient between clinical reasoning score and diagnostic accuracy (assessment) was too low to suggest that the patient note was an indicator of clinical reasoning ability (r = 0.258, P = 0.038). Furthermore, because only ∼5% of the total score is usually allotted to the patient note in the OSCE, the correlation between OSCE score and clinical reasoning score would not be significantly changed if the patient encounter score were substituted with the total composite score including the patient note. For assessing a student's clinical reasoning ability, the authors suggest using the presented table, which includes symptoms or signs that are compatible with or differ from each diagnosis. In addition, a structured short essay or schematic visualization that describes the student's clinical reasoning process could be an effective tool for assessing clinical reasoning ability. Furthermore, research is warranted for the development of valid methods to properly evaluate clinical reasoning ability with an OSCE.
In conclusion, the present results suggest that the OSCE score from a patient encounter may not reflect a medical student's clinical reasoning ability. Efforts should be made to improve the evaluation of the clinical reasoning abilities of medical students using OSCE.

1.  Assessment of clinical competence using objective structured examination.

Authors:  R M Harden; M Stevenson; W W Downie; G M Wilson
Journal:  Br Med J       Date:  1975-02-22

2.  The relationship between the National Board of Medical Examiners' prototype of the Step 2 clinical skills exam and interns' performance.

Authors:  Marcia L Taylor; Amy V Blue; Arch G Mainous; Mark E Geesey; William T Basco
Journal:  Acad Med       Date:  2005-05       Impact factor: 6.893

3.  What predicts USMLE Step 3 performance?

Authors:  Dorothy A Andriole; Donna B Jeffe; Heather L Hageman; Alison J Whelan
Journal:  Acad Med       Date:  2005-10       Impact factor: 6.893

4.  Is assessment of clinical reasoning still the Holy Grail?

Authors:  Lambert Schuwirth
Journal:  Med Educ       Date:  2009-04       Impact factor: 6.251

5.  The validity of performance-based measures of clinical reasoning and alternative approaches.

Authors:  Clarence D Kreiter; George Bergus
Journal:  Med Educ       Date:  2009-04       Impact factor: 6.251

6.  Comparing the psychometric properties of checklists and global rating scales for assessing performance on an OSCE-format examination.

Authors:  G Regehr; H MacRae; R K Reznick; D Szalay
Journal:  Acad Med       Date:  1998-09       Impact factor: 6.893

7.  Assessing clinical reasoning skills in scenarios of uncertainty: convergent validity for a Script Concordance Test in an emergency medicine clerkship and residency.

Authors:  Aloysius J Humbert; Bart Besinger; Edward J Miech
Journal:  Acad Emerg Med       Date:  2011-06       Impact factor: 3.451

8.  Issues of validity and reliability concerning who scores the post-encounter patient-progress note.

Authors:  M F Ben-David; J R Boulet; W P Burdick; A Ziv; R K Hambleton; N E Gary
Journal:  Acad Med       Date:  1997-10       Impact factor: 6.893

9.  Measurement of physician performance by standardized patients. Refining techniques for undetected entry in physicians' offices.

Authors:  C A Woodward; G A McConvey; V Neufeld; G R Norman; A Walsh
Journal:  Med Care       Date:  1985-08       Impact factor: 2.983

10.  Evaluation of physical examination skills. Reliability of faculty observers and patient instructors.

Authors:  D L Elliot; D H Hickam
Journal:  JAMA       Date:  1987-12-18       Impact factor: 56.272

