
Student and Faculty Perception of Objective Structured Clinical Examination: A Teaching Hospital Experience.

Abir H Alsaid1, Mona Al-Sheikh2.   

Abstract

INTRODUCTION: The primary objective of this study was to explore student and faculty perception of the objective structured clinical examination (OSCE) as a tool to assess the clinical competence of 5th year medical students.
METHODS: Two validated tools were used to survey students' and faculty members' perception of the OSCE as an assessment tool. The questionnaires were self-administered and handed to the students immediately after the OSCE was conducted. Subjects were 29 female students who had completed their 3-week Internal Medicine rotation and 15 faculty members who had participated in evaluating the students. The response rate was 100%. The OSCE comprised 21 active stations assessing skills in history taking, physical examination and data interpretation. Standardized or real patients were used in 16 stations.
RESULTS: The majority of students (63.2%) indicated that the OSCE assessed their skills fairly. Similarly, 80% of faculty members thought the OSCE was a fair method of assessing students' skills as well as a better assessment tool than the traditional long/short case exams.
CONCLUSION: The OSCE was positively perceived by 5th year medical students and faculty members as a tool that can fairly assess students' clinical skills.

Keywords:  Faculty survey; objective structured clinical examination; student survey

Year:  2016        PMID: 30787752      PMCID: PMC6298287          DOI: 10.4103/1658-631X.194250

Source DB:  PubMed          Journal:  Saudi J Med Med Sci        ISSN: 2321-4856


INTRODUCTION

There are three main intersecting areas of medical education: curriculum design, instructional methods and assessment measures.[1] The objective structured clinical examination (OSCE) is an examination method that was developed by Harden and first reported in the British Medical Journal in 1975.[2] It has since been used as a valuable tool to evaluate students' clinical skills in medical, dentistry, nursing and pharmacy schools worldwide. It was also evaluated in a comparative study that assessed two groups of final year medical students in two British medical schools. The results of that study highlighted that the OSCE is a valid tool for assessing clinical competence among medical students and is also able to determine areas where teaching methods and/or curriculum content might have contributed to students' performance in each group.[3] According to the concept of Miller's pyramid, the OSCE is an assessment method designed to evaluate clinical competency at the level of “shows how.”[4] OSCEs are conducted by rotating students through successive stations that assess their skills in history taking, physical examination, communication, patient management, diagnosis and data interpretation. The stations are organized so that students rotate smoothly within a predetermined time while being observed by an examiner on a one-to-one basis, using standardized patients. The standardized OSCE is a fairly new method of assessment in Saudi Arabia and is being conducted in many universities across the Kingdom. A study conducted at King Saud University in 2006 on 95 students found the OSCE to be a highly reliable method of student assessment.[5] The Department of Internal Medicine at the University of Dammam has used this method since 2013 to assess the clinical competency of 5th year medical students.
The Accreditation Council for Graduate Medical Education (ACGME) classifies medical competence into six domains: medical knowledge (MK), patient care (PC), professionalism, interpersonal and communication skills (ICS), systems-based practice (SBP), and practice-based learning and improvement (PBLI).[6] The OSCE is considered to be a valid tool for assessing PC, ICS and professionalism. It is also a reliable method for the evaluation of PBLI and SBP, but not MK.[6,7] Thus, OSCEs are considered by some medical educators to be the gold standard of assessment methods.[8] The OSCE allows different aspects of clinical competence to be assessed in a comprehensive, consistent, controlled and objective manner. Quality assessment methods need to be dynamically improved to achieve excellence in conducting standardized OSCEs. The advantages of the OSCE are that it ensures a uniform marking scheme and consistent examination set-ups for both examiners and students. A formative OSCE allows immediate feedback, which enhances the student's learning experience and improves his/her proficiency in the following stations. The OSCE, unlike long case exams, eliminates prejudice and subjects all students to the same assessment criteria. It objectively assesses necessary facets of clinical competence, such as physical examination and history-taking skills, as well as problem-solving, decision-making, patient treatment and interactive competencies, and thus has a powerful educational influence.[9] The OSCE was used for two consecutive years to measure the performance of 117 second year medical students at the end of introductory courses at the Bowman Gray School of Medicine, testing their knowledge of differential and physical diagnosis. Its benefits were assessed through a questionnaire with an 80% response rate from the faculty involved.
Despite the high expenses involved in conducting the OSCE, the faculty reported that it was worth the time invested in evaluating the students and encouraged its future use. The majority of the examinees were satisfied with the examination and thought that it should continue to be used as an assessment tool.[10] Although the OSCE is undisputedly a potent assessment tool in medical education, it also has some reported disadvantages. While reported to be reliable, valid and objective, it is expensive and time-consuming, which are two major drawbacks.[11] The high costs are primarily related to manpower (examiners, patients, coordinators), resources, time and space, as well as the extensive organization required. The non-integrated type of OSCE assesses students on discrete medical tasks, which compartmentalizes their skills and may encourage them to evaluate patients partially rather than as a whole. Although students are assessed on different areas of knowledge and skills, the scope is still considered narrow, especially in terms of history taking and physical examination.[12]

METHODS

In April 2014, the Department of Internal Medicine at King Fahd Hospital of the University held its second OSCE for 5th year medical students. Twenty-nine female students were examined in the short-stay ward. The OSCE included 21 active stations, comprising 10 history-taking stations, 6 physical examination stations and 5 data interpretation stations [Table 1]. The examination lasted 174 min.
Table 1

Blueprint Objective Structured Clinical Examination April 2014

Content area            Skills assessed
                        History taking    Physical examination    Data interpretation
Cardiology              X                 X                       X
Gastroenterology        X                 X                       X
Pulmonology             X                                         X
Rheumatology            X                 X
Endocrinology           X                 X
Hematology              X                                         X
Infectious disease      X                                         X
General medicine        X                 X
Nephrology              X                 X
Communication skills    X
After the exam, the students were handed the questionnaire [Table 2]. A different survey was given to the 15 participating faculty members [Table 3].
Table 2

Participant survey evaluation of the objective structured clinical examination experience

Please indicate your response to each of the questions listed in the columns

Number | Case | How much did you learn from doing this case? (Nothing / Some / A lot) | How would you rate your overall performance in this case? (Poor / Fair / Good / Excellent)

1. Cardiology history
2. Cardiology physical examination
3. Cardiology data interpretation
4. GI history
5. GI physical examination
6. GI data interpretation
7. Rheumatology history
8. Rheumatology physical examination
9. Pulmonary data interpretation
10. Hematology history
11. Hematology data interpretation
12. ID history
13. ID data interpretation
14. General examination
15. General medicine
16. Communication skills
17. Endocrinology history
18. Endocrinology physical examination
19. Nephrology history
20. Nephrology physical examination
21. Pulmonary history

Please indicate how much you agree or disagree with each of the items below

Number | In general this OSCE… | Strongly disagree / Somewhat disagree / Somewhat agree / Strongly agree

1. Helped me identify my strengths and weaknesses
2. Stimulated me so that I will go and learn more about some of the topics covered
3. Taught me something new
4. Provided me with valuable feedback
5. Was a lot like real life clinical encounters
6. Evaluated my skills fairly
7. Was enjoyable
8. Provided a good cross section of general medicine
9. Was an experience I would like to have again

Adapted from “Objective Structured Clinical Examinations: 10 Steps to Planning and Implementing OSCEs and Other Standardized Patient Exercises.” OSCEs – Objective structured clinical examinations; GI – Gastrointestinal

Table 3

Faculty survey evaluation of the objective structured clinical examination experience

Please indicate your response to each of the three questions listed in the columns

Name of faculty member | Case | How hard was this case for the students? (Too easy / Just right / Too hard) | How much will residents learn from this case? (Nothing / Some / A lot) | How would you rate the overall student performance in this case? (Poor / Fair / Good / Excellent)

Please indicate how much you agree or disagree with each of the items below

Number | In general this OSCE… | Strongly disagree / Somewhat disagree / Somewhat agree / Strongly agree

1. Helped students identify their strengths and weaknesses
2. Stimulated students to go and learn more about some of the topics covered
3. Taught students something new
4. Provided students with valuable feedback
5. Provided me with new information about students’ performance level
6. Gave me some new ideas for teaching
7. Was a lot like real life clinical encounters
8. Evaluated students’ skills fairly
9. Was enjoyable
10. Provided a good cross section of general medicine
11. Was an experience I, as faculty, would like to have again

Adapted from “Objective Structured Clinical Examinations: 10 Steps to Planning and Implementing OSCEs and Other Standardized Patient Exercises.” OSCEs – Objective structured clinical examinations

The students' surveys were based on a self-assessment of their learning experience, their opinion on the level of exposure they had to similar cases during their rotation, and their perception of their performance. The faculty surveys captured their thoughts on the educational value the OSCE provided to the students and whether it offered any added value in the evaluation of students' knowledge levels. Both surveys included an item to assess the comfort level during the OSCE. This research was approved by the Medical Ethics Committee.

RESULTS

The surveys were collected from the 29 students and 15 faculty members and analyzed, with no other variables being considered. Two main observations were made. First, 63.2% of students and 80% of faculty concurred that the OSCE was a fair assessment of clinical skills. The OSCE was also perceived as a better assessment tool than the traditional long/short case exams by 80% of faculty [Figures 1 and 2].
Figure 1

Students' feedback

Figure 2

Faculty feedback

Another finding was the amount of clinical exposure and its impact on students' perception of their performance in the exam. The majority of students had been exposed to General Medicine and Gastroenterology examinations (68.4% and 57.9%, respectively). The results showed that 89.5% of students had no exposure to pulmonary data interpretation (chest X-ray) and 84.2% had no exposure to nephrology physical examination and hematology history-taking [Figure 3]. The rating scale of performance was divided into excellent, good, fair and poor. When asked about their perception of their performance in the exam, 52.6% thought they performed well in the gastrointestinal examination station, whereas 47.4% reported a fair performance in the general medical examination.
Figure 3

Students' feedback on their pre-examination clinical exposure to similar cases

About 31.6% of students thought they did well, and another 31.6% reported a fair performance in the hematology history-taking station, whereas 52.6% ranked their performance as fair in the pulmonary data interpretation station [Figure 4].
Figure 4

Students' feedback on their performance in objective structured clinical examination stations


DISCUSSION

Traditional clinical and written examinations test a limited range of cognitive and clinical skills. The traditional clinical exam, in which two examiners observe and test mainly a student's history-taking, physical examination and clinical reasoning skills, has been deemed unreliable due to the margin of variability between the two examiners. Therefore, it was necessary to change this approach in order to reliably assess the other essential skills a medical student ought to have. A study published in the Archives of Disease in Childhood, which assessed 229 final year medical students, found a more positive correlation between the OSCE and other forms of assessment and little correlation between the OSCE and viva voce results. This supports the OSCE as an acceptable, if not superior, alternative to traditional assessment tools.[13] A study published in the Saudi Medical Journal, which assessed 64 students undergoing their final year surgical clerkship, found the OSCE to be a reliable and valid format for testing clinical skills.[14] A comparison between the performance of 3rd year medical students in the OSCE and their subsequent performance in clinical examinations in years 4 and 5 of the course, published in Medical Education, revealed that the OSCE predicted the students' performance in a subsequent clinical examination, supporting the OSCE as a valid assessment tool.[15] Our study showed that the OSCE is perceived as a fair assessment tool by both students and faculty members [Figure 1]. This study was useful in shedding light on clinical training, the level of students' exposure to certain cases during their clinical rotation prior to the examination and their perception of their performance. The majority of students concurred that they were most exposed to general medical (68.4%) and gastroenterology examinations (57.9%) during their clinical rotation.
There was an obvious lack of exposure to pulmonary data interpretation (chest X-ray), nephrology examination and hematology history-taking [Figure 3]. A valid explanation may be that the general examination is covered in every bedside teaching session and is, therefore, a technique that the students have mastered, whereas the nephrology examination is not covered by all academic staff. Another reason is that patients with renal disease are usually too sick to tolerate being examined by large groups of medical students, and the physical signs of renal disease are sparse. The lack of exposure to hematology history-taking is surprising, since a considerable number of sickle cell disease patients are admitted to the internal medicine ward. Pulmonary data interpretation, namely chest X-ray, is usually not covered during bedside teaching and is mainly discussed with groups of students taught by a pulmonologist or a general medicine faculty member. The students' perceptions of their performance in the examination stations followed the same trend as their exposure to the relevant subjects. One exception stood out: the general medicine station, for which 68% of the students had been exposed to general medicine cases, yet their perception of their performance scored a satisfaction rate of only 37% [Figure 5]. This raises questions about the design of the general medicine station, the clarity of its instructions and the individual variability among examiners.
Figure 5

Exposure prior to objective structured clinical examination and students' perception of their performance

For OSCEs to be valid and reliable, a careful review of test content and design, rater training and implementation factors must be made.[16] A study at Mashhad University of Medical Sciences concluded that the majority of the students (94.5%) had a positive perception of the OSCE, and that the OSCE can be recommended as a method of assessment if standardization can be achieved and maintained.[17] Overall, surveys used to evaluate the OSCE showed that undergraduate students from various medical schools perceived it positively, though certain negative observations, such as stress and difficulty, were repeatedly stated.[18] A study from King Khalid University in Abha explored students' acceptance of the OSCE as a method of assessing clinical competence in internal medicine using self-administered surveys. The majority perceived the OSCE in internal medicine as fair (53%) and comprehensive (56%), albeit stressful.[19]

CONCLUSION

The significance of students' feedback regarding assessment tools in undergraduate medical education is being increasingly recognized, and students' views on the methods used to assess their skills and on their understanding of the curriculum are regarded as valuable input toward a more successful teaching approach.[20] The results of this study revealed that students assessed by the OSCE were generally satisfied, as indicated by their positive feedback, which can be utilized to improve our performance in setting a standardized OSCE. Our study was limited by its sample size, which can be increased in future studies to improve the generalizability of the results. This is still a growing field at the University of Dammam and a promising area of research. The University of Dammam intends to build on the results of this study to further improve its OSCEs, which will be implemented as an assessment tool in all clinical years.

Financial support and sponsorship

Nil.

Conflicts of interest

There are no conflicts of interest.
References (19 in total)

1.  Predictive validity and estimated cut score of an objective structured clinical examination (OSCE) used as an assessment of clinical skills at the end of the first clinical year.

Authors:  Iain G Martin; Brian Jolly
Journal:  Med Educ       Date:  2002-05       Impact factor: 6.251

2.  Assessing the ACGME general competencies: general considerations and assessment methods.

Authors:  Susan R Swing
Journal:  Acad Emerg Med       Date:  2002-11       Impact factor: 3.451

3.  Assessment of clinical competence using objective structured examination.

Authors:  R M Harden; M Stevenson; W W Downie; G M Wilson
Journal:  Br Med J       Date:  1975-02-22

Review 4.  Critiques on the Objective Structured Clinical Examination.

Authors:  A Barman
Journal:  Ann Acad Med Singapore       Date:  2005-09       Impact factor: 2.473

5.  Reliability, validity, and feasibility of the Objective Structured Clinical Examination in assessing clinical skills of final year surgical clerkship.

Authors:  Mohammed Y Al-Naami
Journal:  Saudi Med J       Date:  2008-12       Impact factor: 1.484

6.  Validity evidence for an OSCE to assess competency in systems-based practice and practice-based learning and improvement: a preliminary investigation.

Authors:  Prathibha Varkey; Neena Natt; Timothy Lesnick; Steven Downing; Rachel Yudkowsky
Journal:  Acad Med       Date:  2008-08       Impact factor: 6.893

7.  Objective structured clinical exams: a critical review.

Authors:  John L Turner; Mary E Dankoski
Journal:  Fam Med       Date:  2008-09       Impact factor: 1.756

8.  Is OSCE valid for evaluation of the six ACGME general competencies?

Authors:  Cho-Yu Chan
Journal:  J Chin Med Assoc       Date:  2011-04-08       Impact factor: 2.743

9.  Improvement of psychometric properties of the objective structured clinical examination when assessing problem solving skills of surgical clerkship.

Authors:  Mohammed Y Al-Naami; Omer F El-Tinay; Gamal A Khairy; Safdar S Mofti; Muhammad N Anjum
Journal:  Saudi Med J       Date:  2011-03       Impact factor: 1.484

10.  The need for national medical licensing examination in Saudi Arabia.

Authors:  Sohail Bajammal; Rania Zaini; Wesam Abuznadah; Mohammad Al-Rukban; Syed Moyn Aly; Abdulaziz Boker; Abdulmohsen Al-Zalabani; Mohammad Al-Omran; Amro Al-Habib; Mona Al-Sheikh; Mohammad Al-Sultan; Nadia Fida; Khalid Alzahrani; Bashir Hamad; Mohammad Al Shehri; Khalid Bin Abdulrahman; Saleh Al-Damegh; Mansour M Al-Nozha; Tyrone Donnon
Journal:  BMC Med Educ       Date:  2008-11-25       Impact factor: 2.463

Related articles (5 in total)

1.  Perceptions of Physical Therapy Students on their Psychomotor Examinations: a Qualitative Study.

Authors:  Kelly Macauley; Stephanie Laprino; Tracy Brudvig
Journal:  Med Sci Educ       Date:  2022-02-04

2.  Perception of Students and Examiners about Objective Structured Clinical Examination in a Teaching Hospital in Ethiopia.

Authors:  Henok Fisseha; Hailemichael Desalegn
Journal:  Adv Med Educ Pract       Date:  2021-12-11

3.  Objective structured clinical examination: Challenges and opportunities from students' perspective.

Authors:  Nazdar Alkhateeb; Abubakir Majeed Salih; Nazar Shabila; Ali Al-Dabbagh
Journal:  PLoS One       Date:  2022-09-02       Impact factor: 3.752

4.  Assessment of Postgraduate Online Medical Education During the COVID-19 Pandemic in Saudi Arabia: A Cross-Sectional Study.

Authors:  Khalid AlMatham; Adnan AlWadie; Omar Kasule; Sara AlFadil; Osama Al-Shaya
Journal:  Adv Med Educ Pract       Date:  2022-09-23

5.  Experience and Challenges of Objective Structured Clinical Examination (OSCE): Perspective of Students and Examiners in a Clinical Department of Ethiopian University.

Authors:  Getu Ataro; Solomon Worku; Tsedeke Asaminew
Journal:  Ethiop J Health Sci       Date:  2020-05
