Literature DB >> 31435304

Dental students' perception of the Objective Structured Clinical Examination (OSCE): The Taibah University experience, Almadinah Almunawwarah, KSA.

Ahmad A Al Nazzawi

Abstract

OBJECTIVE: The Objective Structured Clinical Examination (OSCE) is an assessment tool used to objectively evaluate clinical competence in medical schools and, more recently, in the nursing profession as well. However, few studies have been conducted to elicit the views of dental students in KSA regarding the OSCE. The present study was designed to explore, evaluate, and analyse students' perceptions of the OSCE and to explore the strengths and weaknesses of this assessment tool as perceived by dental students.
METHODS: This was a cross-sectional analytical observational study using a survey design. It took place in the College of Dentistry, Taibah University, KSA. The study sample consisted of third, fourth, and fifth-year students who took the OSCE assessment during their studies.
RESULTS: Of 138 invited students, 119 responded (response rate of 86.2%). Almost half the students (47.1%) agreed that the OSCE exam was fair, while only 19.3% agreed that a broad range of clinical skills was included in the OSCE exam. Only a low percentage of students believed to a great extent that the scores were standardised, and the practicality and usefulness of the OSCE were questioned. Students perceived that issues pertaining to personality, ethnicity, and gender would not impact the scores.
CONCLUSION: The findings of this study indicated that the OSCE is a meaningful and fair assessment tool for clinical skills.


Keywords:  Clinical skills; Dental education; KSA; OSCE; Student perception

Year:  2017        PMID: 31435304      PMCID: PMC6695011          DOI: 10.1016/j.jtumed.2017.09.002

Source DB:  PubMed          Journal:  J Taibah Univ Med Sci        ISSN: 1658-3612


Introduction

Judging a student's clinical competencies, namely the ability to do something successfully and efficiently, is a necessity in the process of education. However, students gain varied, non-uniform clinical experiences, which makes it difficult to gauge the outcomes of an individual student and of the programme itself. For that reason, it is necessary to use multiple evaluation methods to better understand dental students' clinical competence and to place greater importance on methods that motivate the learning of clinical skills while simultaneously providing a fitting mechanism for appraising them. One such mechanism is the Objective Structured Clinical Examination (OSCE),2, 3 which uses simulated clinical situations to conduct summative evaluations of trainee skills.4, 5 It was created to improve the analysis and estimation of students' acquisition of clinical skills. The OSCE is becoming more prevalent in healthcare education programmes because it is regarded as a useful method for assessing the skills and underpinning knowledge required for practice. Utilising the OSCE as an evaluation strategy for dental students' clinical ability has become an essential part of the overall evaluation method. The OSCE has been used in the evaluation of students in medical schools for more than 20 years, and over the past 10 years, interest in this type of evaluation has grown in other health professions such as physiotherapy, dentistry, and nursing. The OSCE provides an innovative learning experience for students and offers a valid means to holistically evaluate students' clinical performance.1, 9 Pierre et al. added that OSCE sessions helped students recognise strengths, weaknesses, and challenges in their clinical competence, fostered self-assessment skills, and provided direction for programme training needs. The College of Dentistry, Taibah University in Almadinah Almunawwarah, KSA was established in 1426 H (2005 G).
It accepted its first group of students in the academic year 1429–1430 H (2008–2009 G). Since then, the OSCE has been used as a method of formative assessment of students in the clinical years. The college offers high-quality dental services to its patients, a true addition to the services Taibah University provides to the community. This study aimed to assess and analyse dental students' views of the OSCE and to investigate the strengths and shortcomings of this evaluation instrument as perceived by the subjects. The objectives included an evaluation of students' impressions of the following properties of the examination: organisational and instructional quality; performance quality; and the validity of the OSCE as an evaluation instrument compared to other evaluation methods.

Materials and Methods

This was a cross-sectional analytical observational study using a survey design to obtain data related to the current research; this design provides a basis for further improving programmes and interventions. The study was conducted at the College of Dentistry, Taibah University, KSA. The study sample consisted of students who had taken the OSCE exams, namely third, fourth, and fifth-year students. In total, 138 male and female students were enrolled in these academic years, and the sample included all students registered for the last three clinical years of study. For the purpose of this study, the questionnaires were delivered in the Arabic language. The questionnaire consisted of a section on respondents' demographic data followed by a modified self-administered version of the questionnaire from the study by Pierre, Wierenga, Barton, Branday, and Christie (2004). This is a standardised, valid, and reliable questionnaire, which was translated and validated by Al Zeftawy et al. Its principal outcome measures were students' perceptions of the examination, covering the quality of organisation and instructions, the quality of performance, and the efficiency of the OSCE as an assessment instrument compared to other formats. A four-point agreement scale was used for most dimensions of the questionnaire, while ratings of difficulty, fairness, degree of learning, and preferred frequency of use of the OSCE relative to other assessment formats were measured on a three-point scale. Once ethical approval to carry out this study was granted by the Taibah University College of Dentistry Research Ethics Committee (TUCD-REC), the investigator contacted participants to explain its aim and purpose.

Ethical consideration

This study was reviewed and approved by the TUCD-REC, with approval number TUCDREC/20170404/AlNazzawi. The students were assured that participation was voluntary, that answers would be kept confidential and anonymous, and that those who declined to participate would not be penalised. Students who agreed to participate were given the demographic data sheet and the self-administered questionnaire to evaluate their perception of the OSCE as an assessment strategy.

Data analysis

The data were analysed using the SPSS statistical package, version 18. The acquired data were coded, analysed, and tabulated, and descriptive, parametric, and nonparametric statistical analyses were carried out accordingly. A qualitative analysis was conducted through a form of content analysis by identifying themes in participants' responses and grouping responses according to thematic content.
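The paper does not name the specific test behind the male–female P values reported later in Table 4, so it is an assumption here that a chi-squared test of independence was used; as a minimal sketch, the statistic can be reproduced from the "easiest format" counts in Table 4 (males: 39/7/9/8; females: 56/0/0/0) with nothing beyond the standard library:

```python
# Chi-squared test of independence, computed by hand.
# Assumption (not stated in the paper): the P values in Table 4 come from
# a chi-squared test on the male-vs-female counts.

def chi_squared(table):
    """table: list of rows of observed counts. Returns (chi2, df)."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            chi2 += (observed - expected) ** 2 / expected
    df = (len(table) - 1) * (len(table[0]) - 1)
    return chi2, df

# "Easiest format" counts from Table 4:
# rows = MCQ, Essay/SAQ, OSCE, clerkship rating; columns = male, female.
observed = [[39, 56], [7, 0], [9, 0], [8, 0]]
chi2, df = chi_squared(observed)
print(round(chi2, 2), df)  # chi2 well above the df = 3, p = 0.001 cutoff of 16.27
```

The resulting statistic exceeds the p = 0.001 critical value for 3 degrees of freedom, consistent with the reported P < 0.001; note, however, that several cells here have small or zero counts, for which an exact test would ordinarily be preferred.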

Results

Of the 138 students targeted, responses were received from 119, representing a response rate of 86.2%. The mean number of OSCE exams taken was 11.82 ± 4.7. Of the 119 participants, 63 (52.9%) were male and 56 (47.1%) were female.

The OSCE evaluation

As shown in Table 1, almost half of the students agreed that the OSCE exam was fair (47.1%), but half disagreed that it covered a wide area of knowledge (50.4%). Around two-thirds of the students agreed that more time was needed at the stations (68.9%), and only 21% agreed that the exam was well administered. A majority found the exam very stressful (62.2%), only 22.7% agreed that it was less stressful than other types of exams, and 62.2% agreed that it was intimidating. More than half of the participants disagreed that the exam was well structured and sequenced.
Table 1

Dentistry students' evaluation of the attributes of the OSCE.

                                                           Agree      Disagree   Neutral    No comment
Question                                                   N    %     N    %     N    %     N    %
1.  Exam was fair                                          56  47.1   43  36.1   18  15.1    2   1.7
2.  Wide knowledge area covered                            33  27.7   60  50.4   25  21.0    1   0.8
3.  Needed more time at stations                           82  68.9   20  16.8   15  12.6    2   1.7
4.  Exams are well administered                            25  21.0   53  44.5   37  31.1    3   2.5
5.  Exams are very stressful                               74  62.2   15  12.6   21  17.6    6   5.0
6.  Exams are well structured and sequenced                25  21.0   67  56.3   26  21.8    1   0.8
7.  Exam minimised chance of failing                       24  20.2   63  52.9   29  24.4    2   1.7
8.  OSCE is less stressful than other exams                27  22.7   59  49.6   32  26.9    0   0.0
9.  Allowed students to compensate in some areas           30  25.2   52  43.7   35  29.4    0   0.0
10. Highlighted areas of weakness                          25  21.0   50  42.0   43  36.1    1   0.8
11. Exam is intimidating                                   74  62.2   21  17.8   21  17.8    2   1.7
12. Students are aware of the level of information needed  22  18.5   60  50.4   35  29.4    2   1.7
13. Wide range of clinical skills covered                  23  19.3   53  44.5   36  30.3    7   5.9
It was noticed that 50.4% of the students were unaware of the level of information needed, and only 19.3% agreed that a broad range of clinical skills was included in the OSCE exam. Regarding the students' evaluation of the quality of OSCE performance, Table 2 shows that more than half (54.6%) were fully aware of the nature of the exam, but only around a quarter (23.5%) thought that the tasks reflected what was taught during the course. Likewise, only 26.9% agreed that the time spent at each station was adequate, and around one-third agreed that the setting, sequence, and context at each station felt authentic.
Table 2

Dentistry students' evaluation of the quality of performance of the OSCE.

                                                         To a great extent   Neutral    Not at all
Question                                                 N    %              N    %     N    %
1.  Fully aware of the nature of the exam                65  54.6            30  25.2   24  20.2
2.  Tasks reflected those taught                         28  23.5            68  57.1   23  19.3
3.  Time at each station was adequate                    32  26.9            32  26.9   54  45.4
4.  Setting and context at each station felt authentic   33  27.7            45  37.8   40  33.6
5.  Instructions were clear and unambiguous              34  28.8            45  38.1   39  33.1
6.  Tasks asked to perform were fair                     27  22.7            44  37.0   48  40.3
7.  Sequence of stations was logical and appropriate     33  27.7            43  36.1   43  36.1
8.  Exam provided opportunities to learn                 33  27.7            45  37.8   41  34.5
A similarly low percentage of participating students agreed that the instructions were clear, that the tasks they were asked to perform were fair, that the sequence of the stations was logical, and that the exam provided an opportunity to learn (percentages ranging from 22.7% to 28.8%).

Perception of validity and reliability

Table 3 shows that only a low percentage of students believed to a great extent that the scores were standardised (21.8%), that the OSCE provided a practical and useful experience (20.2%), and that personality, ethnicity, and gender would not affect the scores (23.5%).
Table 3

Dentistry students' perception of the validity and reliability of the scoring and objectivity of the OSCE.

                                                                             To a great extent   Neutral    Not at all
Question                                                                     N    %              N    %     N    %
1.  OSCE exam scores provide a true measure of essential clinical skills
    in dentistry                                                             35  29.4            46  38.7   38  31.9
2.  OSCE scores are standardised                                             26  21.8            69  58.0   24  20.2
3.  OSCE provides a practical and useful experience                          24  20.2            66  55.5   29  24.4
4.  Personality, ethnicity, and gender will not affect OSCE scores           28  23.5            54  45.4   37  31.1
Nearly one-third of the students believed to a great extent that the exam scores provided a true measure of essential clinical skills in dentistry.

Comparing assessment formats

Table 4 describes student responses when asked to rate the following assessment instruments to which they had been exposed: multiple-choice questions (MCQs), essays, short answer questions (SAQs), general clerkship rating, and the OSCE. The majority of students (79.8%) believed the MCQs to be the easiest, and only 7.6% considered the OSCE exam the easiest.
Table 4

Students' rating of assessment formats.

                                            Total       Male        Female       P-value
                                            N    %      N    %      N    %
1. Which of the following formats is the easiest?
   MCQ                                      95  79.8    39  61.9    56  100.0    <0.001
   Essay/SAQ                                 7   5.9     7  11.1     0    0.0
   OSCE                                      9   7.6     9  14.3     0    0.0
   Clerkship rating                          8   6.7     8  12.7     0    0.0
2. Which of the following formats is the fairest?
   MCQ                                      81  68.6    27  43.5    54   96.4    <0.001
   Essay/SAQ                                 7   5.9     7  11.3     0    0.0
   OSCE                                      6   5.1     6   9.7     0    0.0
   Clerkship rating                         24  20.3    22  35.5     2    3.6
3. From which of the following formats do you learn most?
   MCQ                                      77  65.8    24  39.3    53   94.6    <0.001
   Essay/SAQ                                 8   6.8     8  13.1     0    0.0
   OSCE                                     12  10.3    11  18.0     1    1.8
   Clerkship rating                         20  17.1    18  19.5     2    3.6
4. Which of the following formats should be used more often in the clinical years of the programme?
   MCQ                                      88  73.9    33  52.4    55   98.2    <0.001
   Essay/SAQ                                 2   1.7     2   3.2     0    0.0
   OSCE                                     14  11.8    14  22.2     0    0.0
   Clerkship rating                         15  12.6    14  22.2     1    1.8
A statistically significant difference was found between males and females: all females (100%) believed that MCQs are the easiest type of assessment, compared with 61.9% of males (P < 0.001). Furthermore, 68.6% of students believed that MCQs are the fairest format of assessment, followed by clerkship rating (20.3%); again, significantly more females than males believed that MCQs are the fairest format (96.4% of females versus 43.5% of males, P < 0.001). In addition, 65.8% of students thought that the MCQ format enables the most learning, with the responses of females (94.6%) differing significantly from those of males (39.3%, P < 0.001). Finally, the majority of students preferred the MCQ format for the clinical years of the programme, with females preferring MCQs (98.2%) significantly more than males (52.4%, P < 0.001).

Discussion

Nowadays, the OSCE is considered one of the most effective tools to assess clinical skills in most clinical specialities.15, 16 The findings of this study indicated that students' evaluations of the quality, validity, and reliability of the OSCE were lower than expected, unlike other studies, which revealed positive feedback from students on the attributes of the OSCE.10, 16, 17, 18 In the current study, almost half the students agreed that the OSCE exam was fair, in accordance with the results of other studies.14, 16, 18 Interestingly, only around a quarter of the students believed that the OSCE covers a wide area of knowledge, and a high percentage viewed it as very stressful. Similar findings were recorded by Mater et al. (2014), Eftekar et al. (2012), and Ali et al. (2012), who found that half, two-thirds, and the majority of their studied students, respectively, considered the OSCE a fair yet stressful exam.19, 20, 21 In this study, around two-thirds of the respondents reported that the OSCE exam is very stressful and intimidating (62.2% for each). These findings are aligned with those of many studies,14, 18 which found that a significant percentage of subjects perceived the OSCE as a stressful and intimidating experience; this was also reported by Pierre et al. (2004) and Ryan et al. (2007).10, 22 Anxiety and lack of confidence were associated with inadequate preparation for the examination, which may have influenced students' perception of the OSCE, especially among students who completed the questionnaire after undertaking the exam; stress and fatigue should therefore also be considered. Regarding the quality of OSCE performance, more than half the respondents reported being fully aware of the nature of the exam (54.6%), but scores for the remaining parameters were considerably low, ranging between 22.7% and 28.8%.
This accorded with the findings of Dharma (2014), and contrasted with the results of various other studies.18, 19, 24 Students' perceptions of the validity and reliability of OSCE scoring ranged between 'neutral' and 'agree to a great extent'. Only about one-third of the sample did not agree that the OSCE scores were true or standardised, or that the exam provided useful measures and practical experience of essential skills in dentistry. Nevertheless, Mitchell et al. (2009) reported that, despite the OSCE's many positive attributes, it requires further refinement to better assess clinical skills. Our findings are consistent with those of Delavar et al. (2013), namely that students' perception of the validity and reliability of the OSCE was low and unsatisfactory. In a study conducted in Egypt, more than one-third of the students were neutral about whether the exam tasks reflected what was taught and whether the sequence of stations was logical and appropriate; furthermore, nearly one-third were neutral about whether the setting and context at each station felt authentic and whether the exam provided an opportunity to learn. This is similar to what was found in the present study. Interestingly, the ratings for assessment format denoted that students preferred MCQs, which they considered the easiest and the format from which they learned the most, and they highlighted that MCQs should be used more often in the clinical years of the programme. This aligned with the results of Eswi et al. (2013), who found that more than half of the students reported MCQs as the easiest and fairest assessment format. This may be attributed to the fact that it was easier for students to obtain marks via this tool than via essays/SAQs, the OSCE, or clerkship rating.

Conclusion

To summarise, the findings of this study lead to the conclusion that the OSCE is a meaningful and fair assessment tool for clinical skills. Students' perceptions of the quality of the process and of its validity and reliability were largely neutral, which may be attributable to their results and scoring. It was also found that MCQs were the form of assessment most preferred by students, and that students did not perceive the OSCE as the fairest form of assessment.

Recommendation

Preparing students for the OSCE should be emphasised more, and the OSCE should be considered an integral part of the clinical evaluation system and the main method for evaluating clinical practice and skills. It should be ensured that clear instructions are provided and that all competencies and training materials are reviewed with staff before they begin preparing for the OSCE.

Conflict of interest

The author has no conflict of interest to declare.
References  (16 in total; first 10 shown)

1.  Assessing nurse practitioner students using a modified objective structured clinical examination (OSCE).

Authors:  A D Khattab; B Rawlings
Journal:  Nurse Educ Today       Date:  2001-10       Impact factor: 3.442

2.  Nursing students' and lecturers' perspectives of objective structured clinical examination incorporating simulation.

Authors:  Guillaume Alinier
Journal:  Nurse Educ Today       Date:  2003-08       Impact factor: 3.442

3.  How to evaluate the acquisition of clinical skills at medical school: a tough question.

Authors:  Isabela M Benseñor
Journal:  Sao Paulo Med J       Date:  2004-07-01       Impact factor: 1.044

4.  Student self-assessment in a paediatric objective structured clinical examination.

Authors:  R B Pierre; A Wierenga; M Barton; K Thame; J M Branday; C D C Christie
Journal:  West Indian Med J       Date:  2005-03       Impact factor: 0.171

5.  (Review) Objective structured clinical examination (OSCE): review of literature and implications for nursing education.

Authors:  Helen E Rushforth
Journal:  Nurse Educ Today       Date:  2006-10-27       Impact factor: 3.442

6.  Assessment of clinical nurse specialists in rheumatology using an OSCE.

Authors:  Sarah Ryan; Kay Stevenson; Andrew B Hassell
Journal:  Musculoskeletal Care       Date:  2007-09

7.  Evaluation of undergraduate students using Objective Structured Clinical Evaluation.

Authors:  Dorothy Devine Rentschler; Jeffrey Eaton; Joyce Cappiello; Sunny Fenn McNally; Paula McWilliam
Journal:  J Nurs Educ       Date:  2007-03       Impact factor: 1.726

8.  A model for integrated assessment of clinical competence.

Authors:  Karen J Panzarella; Andrea T Manyon
Journal:  J Allied Health       Date:  2007

9.  A child health nursing objective structured clinical examination (OSCE).

Authors:  Joan Walters; June Adams
Journal:  Nurse Educ Pract       Date:  2002-12       Impact factor: 2.281

10.  Student evaluation of an OSCE in paediatrics at the University of the West Indies, Jamaica.

Authors:  Russell B Pierre; Andrea Wierenga; Michelle Barton; J Michael Branday; Celia D C Christie
Journal:  BMC Med Educ       Date:  2004-10-16       Impact factor: 2.463
