
Conceptualizing workplace based assessment in Singapore: Undergraduate Mini-Clinical Evaluation Exercise experiences of students and teachers.

Sabrina Lau Yanting1, Annushkha Sinnathamby1, DaoBo Wang1, Moses Tan Mong Heng1, Justin Leong Wen Hao1, Shuh Shing Lee2, Su Ping Yeo2, Dujeepa D Samarasekera2.   

Abstract

OBJECTIVES: The Mini-Clinical Evaluation Exercise (mini-CEX) is one of the most commonly used clinical assessment tools to provide learner feedback to drive learning. High quality constructive feedback promotes development and improves clinical competency. However, the effectiveness of feedback has not been objectively evaluated from the learners' and assessors' points of view, especially in Asia, where the nature of the student-tutor relationship is relatively hierarchical. This study seeks to compare the strengths, limitations, and feedback of the mini-CEX between assessors and students.
MATERIALS AND METHODS: A cross-sectional study was conducted among 275 senior medical undergraduates at the National University of Singapore and 121 clinical tutors from seven restructured hospitals in Singapore. Data were collected via a self-administered questionnaire. Univariate analysis was used to determine the prevalence of responses and the differences between tutors and students.
RESULTS: The mini-CEX provided immediate feedback and timely correction of mistakes. However, effective administration was limited by inter-tutor variability and lack of time. Students reported being receptive to feedback, but tutors disagreed and felt that students were resistant to negative feedback. Additionally, students felt that their performance was compared unfairly against more senior students, although the tutors felt otherwise.
CONCLUSION: The mini-CEX is an effective assessment tool, but is limited by barriers to administration and evaluation. Differing opinions and expectations between tutors and students could provide an interesting focal point for future studies.

Keywords:  Mini-CEX; Perspectives on clinical assessment; Workplace-based assessment

Year:  2016        PMID: 28757737      PMCID: PMC5442920          DOI: 10.1016/j.tcmj.2016.06.001

Source DB:  PubMed          Journal:  Ci Ji Yi Xue Za Zhi


1. Introduction

The concept of formal assessment in medical education was first developed in the 18th century in French and Viennese medical schools. Since then, assessment has played a pivotal role in medical education, incorporating multiple components such as “testing, measuring, collecting, combining information and providing feedback” to stimulate learning and provide data on educational efficiency [1]. Assessment processes and formats in medical schools have undergone many reforms based on changing institutional needs and future practice. In addition, the focus of assessment has shifted from “assessment of learning” to “assessment for learning” in recent years [2]. The former evaluates students at the end of a predetermined study period based on acquisition of required skills and content knowledge. For “assessment for learning”, tools like multidomain feedback are employed to identify the strengths and weaknesses of an individual student, thereafter using this information to enhance learning and maximize competency. Subsequently, a variety of evidence-based tools have been developed to assess medical students in various competencies, especially in the clinical learning environment. An example would be the Mini-Clinical Evaluation Exercise (mini-CEX), which begins with an examiner directly observing a student in a clinical encounter. The examiner then evaluates the individual's performance on a rating form and provides the student with immediate feedback, often in a formative manner. Many studies have identified the benefits of the tool, which has been found to be reliable and valid [3]. Apart from having higher fidelity, it is also more time-efficient compared with other clinical assessment tools [3,4]. Nevertheless, the mini-CEX can only be a useful assessment tool with high-quality, constructive feedback.
Various studies from different parts of the world have highlighted several issues, such as inter-assessor variability and reliability, poor assessor understanding of their role, and limited feasibility, which impede productive feedback and influence students’ learning. To address these issues, an understanding of feedback from the learners’ and assessors’ perspectives is crucial, but research in this area is scarce. Although the mini-CEX has been widely used and explored in postgraduate training, it is relatively under-investigated in the undergraduate setting. Furthermore, in Singapore, where the student–tutor relationship is relatively hierarchical, the rapid integration of the mini-CEX into medical school assessments can make real-time feedback between clinical teachers and students, and students’ acceptance of that feedback, challenging. Therefore, this study aims to compare undergraduate medical students’ and assessors’ perspectives of feedback on the mini-CEX, identify potential issues that would be of interest to the wider community, and provide a novel and more holistic perspective of the subject matter. It is hoped that the findings will contribute to the existing literature by suggesting possible underlying reasons for mini-CEX-related issues, and thus inform future improvements in planning and organizing the mini-CEX.

2. Materials and methods

2.1. Background

At the National University of Singapore (NUS) Yong Loo Lin School of Medicine (NUS Medicine), a similar shift in assessment focus has occurred over the past 5 years to better prepare medical undergraduates for clinical work. Tools such as skills-based objective structured clinical examinations (OSCEs) and practice-based written assessments have replaced traditional theory-based assessment formats. The different tools serve dissimilar purposes. For instance, OSCEs and the mini-CEX were designed specifically to assess healthcare professionals in the clinical workplace. The mini-CEX is a recently incorporated assessment tool at NUS Medicine for both formative and summative assessments during postings in the clinical years, particularly in Phase III (Year 3). The number of sessions and the weightage for each posting in Phase III to Phase V (Years 3–5) differ slightly, and they form part of the students’ posting scores. Overall, the mini-CEX contributes 18% in Phase III, 14% in Phase IV, and 8% in Phase V of the total grades from clinical postings, except for all surgery postings where the mini-CEX is formative. There are a larger number of assessment points and hence more sessions in Phase III as most of the postings are long (6–8 weeks) and students are more involved in clinical work. Presently, students are briefed about the assessment processes, including the mini-CEX, by the phase coordinator at the start of each phase, and at the beginning of each posting by an undergraduate education director. Mini-CEX training is offered to all the tutors involved, but because of the short time frame, not all of them have completed it. Additionally, there are frequently new assessors, and trained tutors often move to other hospitals or the private sector, all of which leads to a significant number of non-trained assessors. To circumvent this, an online module was introduced for assessors to learn more about the mini-CEX at their own time and pace.

2.2. Study design

This cross-sectional study was conducted in May 2013. Two hundred and seventy-five senior medical undergraduates in NUS Medicine and 121 clinical tutors from all seven major clinical teaching hospitals in Singapore (Alexandra Hospital, Changi General Hospital, Khoo Teck Puat Hospital, KK Women and Children's Hospital, National University Hospital, Singapore General Hospital, and Tan Tock Seng Hospital) took part in the study.

2.3. Inclusion and exclusion criteria

NUS medical students in Phase III and Phase IV (academic year 2012/2013) were included in the study. The inclusion criteria for tutors were that they were doctors of registrar rank and above who were involved in conducting the mini-CEX for NUS Medicine students. Tutors had to be working in one of the aforementioned restructured hospitals, and be employed in a department through which Phase III and Phase IV students rotated (i.e., anesthesia, emergency medicine, general surgery, internal medicine, obstetrics and gynecology, ophthalmology, orthopedic surgery, otolaryngology, pediatrics, and psychological medicine). Clinical tutors who had no experience in conducting the mini-CEX in a clinical learning environment were excluded.

2.4. Questionnaire design

An original questionnaire was developed based on the contemporary published literature on workplace-based assessment. Although similar studies have been done abroad, none of the questionnaires have been cross-validated across nations [5,6]. Questionnaires were self-administered and completely anonymous. There were no identifiers except for three questions for the students (gender, phase of study, and academic grade) and two questions for the tutors (institution and department) to ensure that the inclusion criteria were fulfilled. The main body of the questionnaire comprised 26 and 24 close-ended questions on the perceived strengths and limitations of the mini-CEX for tutors and students, respectively. Questions were divided into four sections: (1) assessment format; (2) clinical tutors/assessors; (3) medical students; and (4) clinical learning environment. Participants rated each item using a four-point Likert-type scale (1 = “strongly disagree”, 2 = “disagree”, 3 = “agree”, 4 = “strongly agree”). There were also three open-ended questions on the strengths, limitations, and suggested improvements for the mini-CEX.
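For readers who wish to reproduce the descriptive statistics reported later (the collapsed “strongly disagree + disagree” and “strongly agree + agree” percentages and the item means), they can be derived mechanically from the raw four-point ratings. The sketch below is illustrative only: the function name and the sample responses are invented, not the study's data.

```python
def summarize_likert(responses):
    """Summarize 4-point Likert responses (1 = SD, 2 = D, 3 = A, 4 = SA).

    Returns the percentage of collapsed disagreement (ratings 1-2),
    the percentage of collapsed agreement (ratings 3-4), and the mean,
    rounded as in the study's tables (one and two decimals).
    """
    n = len(responses)
    disagree = sum(1 for r in responses if r <= 2)  # SD + D count
    agree = n - disagree                            # SA + A count
    return {
        "SD+D (%)": round(100 * disagree / n, 1),
        "SA+A (%)": round(100 * agree / n, 1),
        "mean": round(sum(responses) / n, 2),
    }

# Example with made-up ratings for one questionnaire item:
summary = summarize_likert([1, 2, 2, 3, 3, 3, 4, 4])
```

Collapsing the four points into two bins is exactly how the appendix tables report agreement, so each table row is one call to a function like this per respondent group.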

2.5. Pilot study

A pilot study was conducted using an online survey tool (Survey Monkey). Thirty-four Phase III students, 36 Phase IV students, and 26 clinical tutors were surveyed with the same inclusion criteria. The initial questionnaire was refined based on results and feedback from the pilot study. The final questionnaire was administered in hard copy, as the pilot study revealed a poor response rate to the online survey.

2.6. Data collection

Student questionnaires were administered immediately after a teaching–learning activity to facilitate questionnaire distribution and maximize the response rate. Tutor questionnaires were disseminated via the Associate Dean's Office (ADO) at the seven restructured hospitals. The hospital ADOs distributed the questionnaires to the various department secretaries, who then passed them on to the clinical tutors. The completed tutor questionnaires were returned to the ADOs and subsequently mailed back to the Centre for Medical Education (CenMED), NUS Medicine.

2.7. Ethical considerations

The NUS Institutional Review Board approved the study in January 2013.

2.8. Data analysis

The results were analyzed using SPSS version 20.0 (IBM Corporation, Armonk, NY, USA). Univariate analysis was used to determine the prevalence and proportions of responses for the student markers (gender, year of study, and academic grade). Comparisons between the student and tutor responses were made using univariate analysis on each item via the chi-square test, and a p value < 0.05 was considered significant. For each of the three open-ended questions, free-text responses were coded, followed by a thematic analysis. Quotations which best illustrated the key consensus were selected and presented.
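The per-item student-versus-tutor comparison described above reduces, once responses are collapsed into disagree/agree bins, to a chi-square test on a 2x2 contingency table. As a minimal sketch (this is an illustrative Python reimplementation, not the study's SPSS procedure, and the counts below are approximations invented from the reported percentages):

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

def differs_significantly(student_counts, tutor_counts):
    """True if the student/tutor response split differs at p < 0.05.

    Each argument is a (n_disagree, n_agree) tuple; 3.841 is the
    chi-square critical value for 1 degree of freedom at alpha = 0.05.
    """
    a, b = student_counts
    c, d = tutor_counts
    return chi_square_2x2(a, b, c, d) > 3.841

# Illustrative counts only: roughly 47.8% of 275 students vs 83.5% of
# 121 tutors disagreeing with an item.
result = differs_significantly((131, 144), (101, 20))
```

Note that SPSS applies Yates' continuity correction by default for 2x2 tables, so exact statistics may differ slightly from this uncorrected form.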

3. Results

3.1. Study population and participation rate

Of the 523 NUS Medicine students deemed eligible to participate in the study (having been assessed via mini-CEX), 302 attempted the questionnaire. Only 275 completed questionnaires were included in our analysis. The overall student participation rate was 52.6%. Phase III and Phase IV students completed 73.5% and 26.5% of the questionnaires, respectively. One hundred and twenty-one clinical tutors completed and returned the questionnaire (response rate approximately 70%). The breakdown was Alexandra Hospital (4.1%), Changi General Hospital (20.7%), Khoo Teck Puat Hospital (17.4%), KK Women and Children's Hospital (9.9%), National University Hospital (19.0%), Singapore General Hospital (10.7%), and Tan Tock Seng Hospital (18.2%). The full results can be found in Appendix 1.
Appendix 1
Assessment format

No. | Item (same question for students and tutors unless otherwise stated) | SD+D, Students (%) | SA+A, Students (%) | SD+D, Tutors (%) | SA+A, Tutors (%) | Mean, Students | Mean, Tutors
1 | I feel that there are too many clinical assessments. | 59.4 | 40.5 | 63.0 | 37.0 | 2.45 | 2.39
2 | The purpose and objectives of the assessment are clear to both the teacher and the student. | 32.2 | 67.7 | 28.1 | 71.9 | 2.71 | 2.77
3 | There is enough feedback from teacher to student. | 39.1 | 61.0 | 24.0 | 76.0 | 2.64 | 2.82
4 | Feedback is inconsistent when given by multiple sources. | 40.5 | 59.5 | 51.7 | 48.3 | 2.73 | 2.49
5 | The assessment promotes selective, test-oriented behavior (instead of habitual performance). | 26.7 | 73.3 | 29.4 | 70.6 | 2.83 | 2.79
6 | Performing well in the clinical assessment will correlate with future clinical competency. | 40.4 | 59.6 | 44.5 | 55.4 | 2.60 | 2.56
7 | I feel that the following method of assessment is effective for medical undergraduates for learning in advanced years, or to prepare myself for future practice as a doctor. (students) / I feel that the following method of assessment is effective for medical undergraduates for learning in advanced years, or to prepare them for future practice as a doctor. (tutors) | 18.7 | 81.4 | 20.0 | 80.0 | 2.88 | 2.88

Clinical tutors/assessors

No. | Item (same question for students and tutors unless otherwise stated) | SD+D, Students (%) | SA+A, Students (%) | SD+D, Tutors (%) | SA+A, Tutors (%) | Mean, Students | Mean, Tutors
8 | I feel that my tutors have received adequate training to administer the assessment. (students) / I have received adequate training to administer the assessment. (tutors) | 21.6 | 78.4 | 28.9 | 71.1 | 2.89 | 2.75
9 | My tutors’ assessment of me will be influenced by previous interactions with them (e.g., ward work, tutorials). (students) / My assessment of the student will be influenced by previous interactions with him (e.g., ward work, tutorials). (tutors) | 26.4 | 73.5 | 40.8 | 59.2 | 2.89 | 2.49
10 | My tutors tend to pitch my performance against that of a senior student or resident. (students) / I tend to pitch the student's performance against that of a senior student or resident. (tutors) | 47.8 | 52.2 | 83.5 | 16.6 | 2.61 | 2.01
11 | I am afraid of upsetting the student and/or damaging the student-doctor relationship. (tutors only) | n/a | n/a | 77.7 | 22.4 | n/a | 2.04
12 | My tutors give specific and actionable feedback (i.e., specific tasks that I should undertake to improve). (students) / I find it difficult to give specific and actionable feedback (i.e., specific tasks that the student should undertake to improve). (tutors) | 28.7 | 71.3 | 75.0 | 25.0 | 2.75 | 2.11
13 | My tutors have enough time in their schedule to conduct clinical assessments with me. (students) / There is enough time in my schedule to conduct clinical assessments with the students. (tutors) | 50.2 | 49.8 | 62.0 | 38.0 | 2.45 | 2.25
14 | My tutors are confident in assessing undergraduate medical students. (students) / I feel confident assessing undergraduate medical students. (tutors) | 10.0 | 90.0 | 5.8 | 94.2 | 3.04 | 3.06
15 | I would like to give feedback to my tutors on how they have conducted the assessment process. (students) / I would like to receive feedback from the students on how I have conducted the assessment process. (tutors) | 34.8 | 65.2 | 13.3 | 86.8 | 2.71 | 3.10

16 | Please select the most commonly used method of assessment for you: | Students (%) | Tutors (%)
 | Solely based on the numerical scale, with 1 being the lowest and 9 being the highest mark. | 36.6 | 22.7
 | First based on the categorical grading (e.g., “Meets Expectations”), then awarding a numerical grade within that category (e.g., 4–6). | 63.4 | 77.3

No. | Item | Yes, Students (%) | No, Students (%) | Yes, Tutors (%) | No, Tutors (%)
17 | A score of 7 out of 9 directly translates to a numerical score of 77.8%. | 66.0 | 34.0 | 50.9 | 49.1
18 | My tutors are comfortable with awarding me “Exceeds Expectations” if they feel that I deserve it. (students) / I am comfortable with awarding my students “Exceeds Expectations” if I feel that they deserve it. (tutors) | 54.0 | 46.0 | 96.7 | 3.3

Medical students

No. | Item (same question for students and tutors unless otherwise stated) | SD+D, Students (%) | SA+A, Students (%) | SD+D, Tutors (%) | SA+A, Tutors (%) | Mean, Students | Mean, Tutors
19 | I am able to evaluate my own performance. (students) / I feel that students are able to self-reflect and evaluate their own performance. (tutors) | 29.8 | 70.2 | 44.2 | 55.9 | 2.74 | 2.56
20 | I am comfortable with asking for feedback from my tutors. (students) / I feel that students are comfortable with asking for feedback. (tutors) | 28.9 | 71.2 | 45.4 | 54.6 | 2.75 | 2.55
21 | I find that I may be resistant or defensive when receiving negative feedback. (students) / The student may be resistant or defensive when receiving negative feedback. (tutors) | 70.8 | 29.2 | 37.7 | 62.4 | 2.19 | 2.67
22 | (Phase IV and Phase V students only) I find that I am more receptive towards feedback now as compared to when I was a junior medical student. (students) / Senior medical students (e.g., Phase IV and Phase V) are more receptive towards clinical feedback as compared to junior medical students. (tutors) | 16.9 | 83.1 | 61.4 | 38.6 | 2.96 | 2.35
23 | I feel that I tend to “put on a show” for the clinical assessment (i.e., this performance is not reflective of my daily clinical behavior). (students) / I feel that students tend to “put on a show” for the clinical assessment (i.e., this performance is not reflective of their daily clinical behavior). (tutors) | 42.1 | 57.8 | 41.7 | 58.3 | 2.70 | 2.62

Clinical learning environment

No. | Item (same question for students and tutors unless otherwise stated) | SD+D, Students (%) | SA+A, Students (%) | SD+D, Tutors (%) | SA+A, Tutors (%) | Mean, Students | Mean, Tutors
24 | My tutors are able to find an appropriate time and place for a formal feedback session. (students) / I can find an appropriate time and place for a formal feedback session. (tutors) | 42.9 | 57.1 | 40.0 | 60.0 | 2.54 | 2.57
25 | I tend to avoid appearing critical, especially in the presence of patients or medical colleagues. (tutors only) | n/a | n/a | 26.0 | 73.9 | n/a | 2.80

26 | My tutors normally conduct the mini-CEX in the: (students) / I normally conduct the mini-CEX in the: (tutors) | Ward (%) | Clinic (%) | Tutorial room (%) | Ward + clinic (%) | All 3 venues (%)
 | (students) | 73.6 | 13.6 | 2.3 | 6.0 | 4.5
 | (tutors) | 63.2 | 29.1 | 2.6 | 3.3 | 1.7

A = agree; D = disagree; SA = strongly agree; SD = strongly disagree.

3.1.1. Mini-CEX: strengths

Table 1 shows the items which both tutors and students agreed were strengths of the mini-CEX.
Table 1

Comparison between tutors and students on the strengths in Mini-Clinical Evaluation Exercise (mini-CEX).

Items (same question for students and tutors unless otherwise stated) | SD+D, Students (%) | SA+A, Students (%) | SD+D, Tutors (%) | SA+A, Tutors (%) | Mean, Students | Mean, Tutors
The purpose and objectives of the assessment are clear to both the teacher and the student | 32.2 | 67.7 | 28.1 | 71.9 | 2.71 | 2.77
I feel that there are too many clinical assessments | 59.4 | 40.5 | 63.0 | 37.0 | 2.45 | 2.39
The assessment promotes selective, test-oriented behavior instead of habitual performance | 26.7 | 73.3 | 29.4 | 70.6 | 2.83 | 2.79
My tutors are confident in assessing undergraduate medical students (students) / I feel confident assessing undergraduate medical students (tutors) | 10.0 | 90.0 | 5.8 | 94.2 | 3.04 | 3.06
I feel that the following method of assessment is effective for medical undergraduates for learning in advanced years, or to prepare myself for future practice as a doctor | 18.7 | 81.4 | 20.0 | 80.0 | 2.88 | 2.88

A = agree; D = disagree; SA = strongly agree; SD = strongly disagree.

3.1.1.1. The mini-CEX is an effective assessment tool

Both students (81.3%) and tutors (80.0%) felt that the mini-CEX was an effective assessment tool to evaluate medical undergraduates, especially in their senior clinical years, as well as to prepare them for future clinical practice. Additionally, the free text responses showed that both students and tutors agreed that the mini-CEX assessed essential clinical skills and provided an avenue for immediate feedback and correction of mistakes. Below are some of the responses given by tutors and students:

The mini-CEX is "an efficient method to assess the student in terms of history taking and physical examination, and to detect weaknesses [in these domains]"

[The mini-CEX] serves as an "opportunity for mistakes to be picked (up) easily and knowledge gaps (to be) addressed earlier"

[The mini-CEX] is "good for evaluation and feedback, and is an avenue to correct mistakes"

"[As a student], the mini-CEX is good practice for picking up signs"

"[The mini-CEX] is exam-oriented and requires a good range of overall knowledge and clinical skills"

3.1.1.2. Feedback is a key component of the mini-CEX

Both tutors and students agreed that one of the greatest strengths of the mini-CEX was its ability to provide students with immediate feedback and timely correction of mistakes. In addition, 83.1% of Phase IV students, who were questioned separately, felt that they were now more receptive to feedback compared with when they were junior medical students.

3.1.1.3. Tutors are confident in conducting the mini-CEX

A total of 90.1% of students and 94.2% of tutors felt that the clinical tutors were confident in assessing undergraduate medical students.
Among the tutors, 96.7% were comfortable with awarding the “exceeds expectations” grade if they felt that the student deserved it.

3.1.2. Mini-CEX: Limitations and challenges

Table 2 shows the items which both tutors and students agreed were limitations of the mini-CEX.
Table 2

Comparison between tutors and students on the limitations of Mini-Clinical Evaluation Exercise (mini-CEX).

Items (same question for students and tutors unless otherwise stated) | SD+D, Students (%) | SA+A, Students (%) | SD+D, Tutors (%) | SA+A, Tutors (%) | Mean, Students | Mean, Tutors
My tutors are able to find an appropriate time and place for a formal feedback session | 42.9 | 57.1 | 40.0 | 60.0 | 2.54 | 2.57
I feel that I tend to “put on a show” for the clinical assessment. (students) / I feel that students tend to “put on a show” for the clinical assessment. (tutors) | 42.1 | 57.8 | 41.7 | 58.3 | 2.70 | 2.62

A = agree; D = disagree; SA = strongly agree; SD = strongly disagree.

The most commonly stated limitations of the mini-CEX, as seen from both the questionnaire and free-text responses, were inter-tutor variability, a general lack of time to conduct the assessment, and inappropriate interpretation of the mini-CEX grade.

3.1.2.1. Inter-tutor variability

Inter-tutor variability was the most pressing problem noted with the assessment process, and was the top response in the free text analysis (independently quoted by 86 students and 10 tutors). This was further supported by the fact that the most frequently cited suggestion for improvements to the mini-CEX was to reduce inter-tutor variability. In addition, although the mini-CEX was designed to be an objective assessment tool for use only at a specified time and context, both students (73.5%) and tutors (59.2%) felt that previous student–tutor interactions influenced a student's mini-CEX grade.

3.1.2.2. Time available to conduct the mini-CEX

Only 48.0% of students and 38.0% of tutors found that tutors had sufficient time to conduct the mini-CEX. Lack of time was also the second most common limitation in the free text analysis (independently quoted by 68 participants). It was thus unsurprising that 18 students and 16 tutors (34 participants) suggested allocating protected time for the tutors to conduct the mini-CEX assessment outside of their regular clinical duties and ward work.

3.1.2.3. Interpretation of the mini-CEX grade

A considerable percentage of students (60.9%) and tutors (50.9%) inappropriately perceived that the mini-CEX was graded solely based on the numerical score. Thus, a student who met expectations and scored 4 out of 9 could potentially misinterpret his performance as a numerical failure. In addition, 52.5% of students felt that tutors compared their performance against either a senior medical student or a junior doctor, although 83.5% of tutors disagreed with this.

3.1.2.4. Mini-CEX performance translating to future clinical practice

Among students, 57.9% admitted that they “put on a show” during their mini-CEX assessment. Similarly, 73.4% of students agreed that the mini-CEX promotes selective, test-oriented behavior instead of habitual performance.

3.1.3. Tutors’ and students’ perceptions of mini-CEX feedback

Interestingly, there are some inconsistencies in the ratings of certain items between tutors and learners. These items are shown in Table 3 and mainly focus on feedback in the mini-CEX.
Table 3

Items with wide mean differences between tutors and students in Mini-Clinical Evaluation Exercise (mini-CEX).

Items (same question for students and tutors unless otherwise stated) | SD+D, Students (%) | SA+A, Students (%) | SD+D, Tutors (%) | SA+A, Tutors (%) | Mean, Students | Mean, Tutors | Mean difference
I feel that I am more receptive towards feedback now as compared to when I was a junior medical student. (students) / Senior medical students (e.g., Phase IV and Phase V) are more receptive towards clinical feedback as compared to junior medical students. (tutors) | 16.9 | 83.1 | 61.4 | 38.6 | 2.96 | 2.35 | −0.60
My tutors tend to pitch my performance against that of a senior student or resident. (students) / I tend to pitch the student's performance against that of a senior student or resident. (tutors) | 47.8 | 52.2 | 83.5 | 16.6 | 2.61 | 2.01 | −0.60
My tutor's assessment of me will be influenced by previous interactions with them. (students) / My assessment of the student will be influenced by previous interactions with him. (tutors) | 26.4 | 73.5 | 40.8 | 59.2 | 2.89 | 2.49 | −0.39
I am comfortable with asking for feedback from my tutors. (students) / I feel that students are comfortable with asking for feedback. (tutors) | 28.9 | 71.2 | 45.4 | 54.6 | 2.75 | 2.55 | −0.21
I am able to evaluate my own performance. (students) / I feel that students are able to self-reflect and evaluate their own performance. (tutors) | 29.8 | 70.2 | 44.2 | 55.9 | 2.74 | 2.56 | −0.18
I find that I may be resistant or defensive when receiving negative feedback. (students) / The student may be resistant or defensive when receiving negative feedback. (tutors) | 70.8 | 29.2 | 37.7 | 62.4 | 2.19 | 2.67 | +0.48
I would like to give feedback to my tutors on how they have conducted the assessment process. (students) / I would like to receive feedback from the students on how I have conducted the assessment process. (tutors) | 34.8 | 65.2 | 13.3 | 86.8 | 2.71 | 3.10 | +0.39
The relevance of the mini-CEX as a formative assessment tool hinges on the opportunity for tutors to provide students with timely feedback. However, while 71.2% of students felt that they were comfortable asking for feedback, only 54.6% of tutors observed this in their students.

There were also differences in the reactions to the feedback given. While 70.2% of students felt that they were able to reflect on their mini-CEX performance, only 55.8% of tutors agreed with this opinion. Also, 63.4% of tutors felt that students were defensive or resistant to receiving negative feedback, although 70.8% of students denied harboring such resentment. It was surprising that 76.0% of students felt that they were receiving specific and actionable feedback, when only 60.9% of tutors felt that they were giving such feedback. Additionally, 59.5% of students felt that feedback was inconsistent across multiple sources, although only 48.3% of tutors agreed with this.

Finally, feedback should be a two-way process in clinical assessment. Among tutors, 86.8% would have liked to receive feedback from their students on how the assessment was conducted, and how they could have improved the learning experience for the next student. However, only 65.2% of students felt willing to give such feedback.

4. Discussion

This study systematically evaluated the strengths, limitations, and feedback processes of the mini-CEX from both the students’ and tutors’ perspectives in an undergraduate setting in Singapore. Besides supporting literature findings on the strengths of the mini-CEX, we identified several potential limitations associated with its use, which may be prevalent at other institutions, and we discuss how these issues can be circumvented in ways that may benefit other schools.

Our results validated the usefulness of the mini-CEX, which can be applied in a broad variety of clinical settings and thus mirrors real clinical situations closely [7]. Both students and teachers perceived the mini-CEX to be an effective assessment tool that prepares medical students for future practice as doctors. Both groups agreed that it assesses key clinical skills, similar to what other authors have found, as it evaluates the trainee's clinical competence in areas such as attitudes, clinical skills, and behaviors [8].

However, we found major limitations in two areas: feedback and the grading process. There were several misconceptions among both tutors and students with respect to the theoretical construct of the mini-CEX, particularly the grading process. For instance, there was the issue of inter-tutor variability, although this (especially in the case of very strict examiners) may be partly alleviated by the fact that students were evaluated on multiple occasions by different examiners in varying settings [9]. This limitation could also be due to the increased use of the mini-CEX as a summative assessment tool without ensuring satisfactory rigor in assessor training and performance discussion. Another notable challenge we found is the halo effect, where a tutor's previous interactions with a student influence the mini-CEX score, as agreed on by both students and tutors.
This introduces an element of subjectivity which further contributes to inter-tutor variability. Indeed, Hill et al. [10] reported that tutors found it hard to award a poor rating to those with whom they had worked closely.

The mini-CEX was originally developed to assess students based on three distinct categories: (1) does not meet expectations (1–3 points); (2) meets expectations (4–6 points); and (3) exceeds expectations (7–9 points). Tutors were instructed to first decide on the category, and then award a corresponding numerical score. However, our findings suggested that most of the tutors and students still inappropriately perceived that the mini-CEX was graded solely based on the numerical score. Overly focusing on the numerical score may have contributed to the use of the mini-CEX as a summative tool. This drives both students’ and clinical teachers’ behaviors toward achieving a higher score rather than toward identifying gaps and providing effective feedback to improve students’ training. This is evident from the results showing that students were “putting on a show” and that the mini-CEX promoted test-oriented behavior, which has restricted its usefulness as an educational tool, especially with regard to “assessment for learning” [10]. A significant number of students could be “putting on a show” because these assessments were summative and hence contributed to their examination scores. This defeats the purpose of the mini-CEX as a tool that assesses daily clinical practice, and could possibly be overcome by implementing the mini-CEX primarily as a formative rather than a summative assessment tool.

The other key challenge is feedback. Feedback provided in a formative manner has a positive educational impact on trainees’ performance and learning, and examiners felt that this is a key component of clinical assessment [11,12,13].
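The intended two-step grading described above can be made concrete with a short sketch. This Python fragment is illustrative only (the function and constants are not part of the study); it shows why a 4 out of 9 is a "meets expectations" grade rather than a 44% mark.

```python
# The 1-9 mini-CEX score only locates performance within a category;
# it is not a percentage. Mapping assumed from the scheme described
# in the text: 1-3, 4-6, and 7-9 points.
CATEGORIES = {
    "does not meet expectations": range(1, 4),   # 1-3 points
    "meets expectations": range(4, 7),           # 4-6 points
    "exceeds expectations": range(7, 10),        # 7-9 points
}

def category_of(score):
    """Map a 1-9 mini-CEX score to its grading category."""
    for category, scores in CATEGORIES.items():
        if score in scores:
            return category
    raise ValueError("mini-CEX scores run from 1 to 9")

# A score of 4 sits in the middle band: "meets expectations".
# Reading it as 4/9 = 44% is the misinterpretation the study reports.
grade = category_of(4)  # "meets expectations"
```

Under this scheme the assessor picks the category first and only then chooses a number inside that band, so the numerical score carries no meaning outside its category.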
However, our findings revealed that students may not be receptive to tutors' advice, particularly negative comments. In addition, tutors may withhold negative feedback for fear of damaging the student–tutor relationship and instilling resentment in the student [14]. Good quality feedback may also be lacking, with most tutor-to-student feedback being too generalized to help a learner seeking to improve performance [15]. This is consistent with other reports that feedback is often provided at less than the desired level; even when it was provided, action plans were often lacking, particularly among clinicians who teach less than the faculty [4,16]. Notably, the largest difference in opinion between tutors and students was seen in questions related to the feedback process. The ability to handle negative feedback and constructive criticism requires a certain amount of maturity [17], which suggests that it may be more beneficial to implement the mini-CEX only in the senior clinical years. The ability to self-reflect via Pendleton's model may also be lacking in local students [18], which may limit the effectiveness of feedback to them. As students mature through their medical journey, they become more familiar with the clinical learning environment and with the clinical teachers in the system; this builds their confidence in engaging clinical teachers with more feedback and questions. This research highlights the considerable gap in the understanding of feedback between learners and assessors in the mini-CEX. For feedback to benefit learners in improving their performance, it is crucial to understand the perspectives of both assessors and learners so that more constructive feedback can be provided.
This article fills a gap in the existing literature on the issues to be considered when giving feedback to learners. Having noted these challenges, we believe that proper training of clinical teachers in the effective use of the mini-CEX as a tool for "assessment for learning", together with clear, purposeful discussions with students, could resolve most of the limitations. Currently, mini-CEX assessment training programs are conducted regularly by CenMED, in addition to ad hoc on-site training for faculty at the different training institutions. However, these sessions are voluntary and not all tutors have attended mini-CEX training, especially as not all postings assign a mini-CEX assessor, leaving students free to pick whichever senior doctor they are most comfortable with. Next, our students are briefed on the rationale and on how students from previous cohorts have performed in general, which puts them more at ease. For greater transparency, they are also assured that tutors assess them at the appropriate academic level and compare their performance only against that of their peers, which could reduce the perception that they are graded against more senior learners. At the institutional level, the mini-CEX should be incorporated as a formative tool, and structured programmes as well as more opportunities could be provided for the training of clinical teachers. One student requested that the school "please brief tutors on the expectations or requirements of students and standardize the scoring range and feedback"; similarly, a tutor felt that "the mini-CEX is probably the best tool, but as a teacher I will need more briefing and more time to conduct it properly".
In light of this, a core group of "clinician-educators" who will eventually become mentors could be trained, while the rest could undergo an interactive training program similar to the "direct observation of competence training" proposed by Holmboe [19] and Holmboe et al [20]. Focus could be placed on feedback and on standardization of the grading process; these improvements could go a long way toward reducing the limitations encountered. Considering these challenges, the Faculty Assessment Committee is moving to re-examine the mini-CEX as a more formative tool, or one with a smaller percentage of marks allocated, in future. Additionally, to address the perception among students and tutors that there is insufficient time to conduct the mini-CEX, higher management such as the Dean and department heads could give higher priority to the evaluation of clinical skills by faculty [19]. Insufficient time was the second most common limitation, as the clinicians involved in the mini-CEX are often engaged in clinical work and research and do not have enough protected time to conduct mini-CEX sessions. This study did have limitations. We included only Phase III and Phase IV students, and the latter's response rate was considerably lower than the former's, which might be attributed to the fact that most mini-CEX sessions are conducted in Phase III. The cross-sectional design may also be less robust at a time when undergraduate medical assessment is changing constantly and rapidly. Further studies could obtain longitudinal data and assess how perceptions change over time, as well as whether mini-CEX administration improves with increased tutor training and more protected time. Students' performance data could also be studied.
In conclusion, this study not only further validates the usefulness of the mini-CEX as a core clinical assessment tool, but also identifies key challenges that limit its effectiveness in the clinical learning environment. We have proposed suggestions to circumvent these challenges, and we hope that these observations can serve as a platform from which further improvements to formal assessment can be made in undergraduate medical education. Medical and health professional schools across the globe are increasingly shifting their focus to incorporate newer assessment formats such as workplace-based assessment. Identifying and recognizing these challenges is important so that effective assessment resources are not misused in driving students' learning behaviors.
References

1.  Faculty and the observation of trainees' clinical skills: problems and opportunities.

Authors:  Eric S Holmboe
Journal:  Acad Med       Date:  2004-01       Impact factor: 6.893

2.  Developing the teaching instinct, 1: feedback.

Authors:  E A Hesketh; J M Laidlaw
Journal:  Med Teach       Date:  2002-05       Impact factor: 3.650

3.  Assessing the reliability and validity of the mini-clinical evaluation exercise for internal medicine residency training.

Authors:  Steven J Durning; Lannie J Cation; Ronald J Markert; Louis N Pangaro
Journal:  Acad Med       Date:  2002-09       Impact factor: 6.893

4.  Maturational differences in undergraduate medical students' perceptions about feedback.

Authors:  Deborah Murdoch-Eaton; Joan Sargeant
Journal:  Med Educ       Date:  2012-07       Impact factor: 6.251

5.  Identifying the factors that determine feedback given to undergraduate medical students following formative mini-CEX assessments.

Authors:  Nishan Fernando; Jennifer Cleland; Hamish McKenzie; Kevin Cassar
Journal:  Med Educ       Date:  2007-11-22       Impact factor: 6.251

6.  Investigation of trainee and specialist reactions to the mini-Clinical Evaluation Exercise in anaesthesia: implications for implementation.

Authors:  J M Weller; A Jones; A F Merry; B Jolly; D Saunders
Journal:  Br J Anaesth       Date:  2009-08-17       Impact factor: 9.166

7.  What's the problem with the mini-CEX?

Authors:  Marja Dijksterhuis; Lambert Schuwirth; Didi Braat; Fedde Scheele
Journal:  Med Educ       Date:  2011-03       Impact factor: 6.251

8.  Criteria for good assessment: consensus statement and recommendations from the Ottawa 2010 Conference.

Authors:  John Norcini; Brownell Anderson; Valdes Bollela; Vanessa Burch; Manuel João Costa; Robbert Duvivier; Robert Galbraith; Richard Hays; Athol Kent; Vanessa Perrott; Trudie Roberts
Journal:  Med Teach       Date:  2011       Impact factor: 3.650

9.  The mini clinical evaluation exercise (mini-CEX) for assessing clinical performance of international medical graduates.

Authors:  Balakrishnan R Nair; Heather G Alexander; Barry P McGrath; Mulavana S Parvathy; Eve C Kilsby; Johannes Wenzel; Ian B Frank; George S Pachev; Gordon G Page
Journal:  Med J Aust       Date:  2008-08-04       Impact factor: 7.738

10.  Effects of training in direct observation of medical residents' clinical competence: a randomized trial.

Authors:  Eric S Holmboe; Richard E Hawkins; Stephen J Huot
Journal:  Ann Intern Med       Date:  2004-06-01       Impact factor: 25.391
