Implementing the undergraduate mini-CEX: a tailored approach at Southampton University.

Faith Hill, Kathleen Kendall, Kevin Galbraith, Jim Crossley.

Abstract

OBJECTIVES: The mini-clinical evaluation exercise (mini-CEX) is widely used in the UK to assess clinical competence, but there is little evidence regarding its implementation in the undergraduate setting. This study aimed to estimate the validity and reliability of the undergraduate mini-CEX and discuss the challenges involved in its implementation.
METHODS: A total of 3499 mini-CEX forms were completed. Validity was assessed by estimating associations between mini-CEX score and a number of external variables, examining the internal structure of the instrument, checking competency domain response rates and profiles against expectations, and by qualitative evaluation of stakeholder interviews. Reliability was evaluated by overall reliability coefficient (R), estimation of the standard error of measurement (SEM), and from stakeholders' perceptions. Variance component analysis examined the contribution of relevant factors to students' scores.
RESULTS: Validity was threatened by various confounding variables, including: examiner status; case complexity; attachment specialty; patient gender; and case focus. Factor analysis suggested that competency domains reflect a single latent variable. Maximum reliability can be achieved by aggregating scores over 15 encounters (R = 0.73; 95% confidence interval [CI] ±0.28 based on a 6-point assessment scale). Examiner stringency contributed 29% of score variation and student attachment aptitude 13%. Stakeholder interviews revealed staff development needs, but the majority perceived the mini-CEX as more reliable and valid than the previous long case.
CONCLUSIONS: The mini-CEX has good overall utility for assessing aspects of the clinical encounter in an undergraduate setting. Strengths include fidelity, wide sampling, perceived validity, and formative observation and feedback. Reliability is limited by variable examiner stringency, and validity by confounding variables, but these should be viewed within the context of overall assessment strategies.
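The reliability figures above follow standard generalizability logic: averaging scores over more encounters shrinks the error variance, so reliability rises with the number of encounters. A minimal sketch of that aggregation, assuming the 13% student-aptitude component is treated as true-score variance and all remaining variance (including the 29% examiner-stringency share) as encounter-level error — an illustrative simplification, not the study's exact variance-component model:

```python
def reliability(n_encounters, var_true=0.13, var_error=0.87):
    """Reliability of a score averaged over n encounters
    (Spearman-Brown-style aggregation of variance components).

    var_true:  proportion of variance attributable to the student
               (the 13% 'attachment aptitude' share from the abstract).
    var_error: all other variance, treated here as encounter-level
               error that averages down across encounters -- an
               illustrative assumption, not the paper's full model.
    """
    return var_true / (var_true + var_error / n_encounters)

for n in (1, 5, 10, 15):
    print(f"n = {n:2d}  R = {reliability(n):.2f}")
```

Under these assumptions the coefficient at 15 encounters comes out near 0.69, in the neighbourhood of the reported R = 0.73; the gap reflects the study's more detailed partition of variance components.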

Year:  2009        PMID: 19335574     DOI: 10.1111/j.1365-2923.2008.03275.x

Source DB:  PubMed          Journal:  Med Educ        ISSN: 0308-0110            Impact factor:   6.251


Citing articles: 21 in total

Review 1.  The modern surgeon and competency assessment: are the workplace-based assessments evidence-based?

Authors:  K M Torsney; D M Cocker; A A P Slesser
Journal:  World J Surg       Date:  2015-03       Impact factor: 3.352

Review 2.  Workplace-based Assessment: Applications and Educational Impact.

Authors:  Salman Yousuf Guraya
Journal:  Malays J Med Sci       Date:  2015-11

3.  Comparing Entrustable Professional Activity Scores Given by Faculty Physicians and Senior Trainees to First-Year Residents.

Authors:  Steven J Katz; Dennis Wang
Journal:  Cureus       Date:  2022-06-09

4.  The implementation of a mobile problem-specific electronic CEX for assessing directly observed student-patient encounters.

Authors:  Gary S Ferenchick; Jami Foreback; Basim Towfiq; Kevin Kavanaugh; David Solomon; Asad Mohmand
Journal:  Med Educ Online       Date:  2010-01-29

5.  Using systematically observed clinical encounters (SOCEs) to assess medical students' skills in clinical settings.

Authors:  George R Bergus; Jerold C Woodhead; Clarence D Kreiter
Journal:  Adv Med Educ Pract       Date:  2010-11-19

6.  A laboratory study on the reliability estimations of the mini-CEX.

Authors:  Alberto Alves de Lima; Diego Conde; Juan Costabel; Juan Corso; Cees Van der Vleuten
Journal:  Adv Health Sci Educ Theory Pract       Date:  2011-12-23       Impact factor: 3.853

Review 7.  The construct and criterion validity of the multi-source feedback process to assess physician performance: a meta-analysis.

Authors:  Ahmed Al Ansari; Tyrone Donnon; Khalid Al Khalifa; Abdulla Darwish; Claudio Violato
Journal:  Adv Med Educ Pract       Date:  2014-02-27

8.  Exploration of a possible relationship between examiner stringency and personality factors in clinical assessments: a pilot study.

Authors:  Yvonne Finn; Peter Cantillon; Gerard Flaherty
Journal:  BMC Med Educ       Date:  2014-12-31       Impact factor: 2.463

9.  Using cloud-based mobile technology for assessment of competencies among medical students.

Authors:  Gary S Ferenchick; David Solomon
Journal:  PeerJ       Date:  2013-09-17       Impact factor: 2.984

10.  Team Objective Structured Bedside Assessment (TOSBA) as formative assessment in undergraduate Obstetrics and Gynaecology: a cohort study.

Authors:  Richard P Deane; Pauline Joyce; Deirdre J Murphy
Journal:  BMC Med Educ       Date:  2015-10-09       Impact factor: 2.463

