
Workplace-based assessments of junior doctors: do scores predict training difficulties?

Colin Mitchell, Sarita Bhat, Anne Herbert, Paul Baker.

Abstract

OBJECTIVES: Workplace-based assessment (WPBA) is an increasingly important part of postgraduate medical training and its results may be used as evidence of professional competence. This study evaluates the ability of WPBA to distinguish UK Foundation Programme (FP) doctors with training difficulties and its effectiveness as a surrogate marker for deficiencies in professional competence.
METHODS: We conducted a retrospective observational study using anonymised records for 1646 trainees in a single UK postgraduate deanery. Data for WPBAs conducted from August 2005 to April 2009 were extracted from the e-portfolio database. These data included all scores submitted by trainees in FP years 1 and 2 on mini-clinical evaluation exercise (mini-CEX), case-based discussion (CbD), direct observation of procedural skills (DOPS) and mini-peer assessment tool (mini-PAT) assessments. Records of trainees in difficulty, as identified by their educational supervisors, were tagged as index cases. Main outcome measures were odds ratios (ORs) for associations between mean WPBA scores and training difficulties. Further analyses by the reported aetiology of the training difficulty (health-, conduct- or performance-related) were performed.
RESULTS: Of the 1646 trainees, 92 had been identified as being in difficulty. Mean CbD and mini-CEX scores were lower for trainees in difficulty and an association was found between identified training difficulties and average scores on the mini-CEX (OR = 0.54; p = 0.034) and CbD (OR = 0.39; p = 0.002). A receiver operating characteristic curve analysis of mean WPBA scores for diagnosing 'in difficulty' status yielded an area under the curve of 0.64, indicating weak predictive value. There was no statistical evidence that mean scores on DOPS and mini-PAT assessments differed between the two groups.
CONCLUSIONS: Analysis of a large dataset of WPBA scores revealed significant associations between training difficulties and lower mean scores on both the mini-CEX and CbD. Models show that using WPBA scores is, however, not a valid way of screening for trainees in difficulty. Workplace-based assessments have value as formative assessments that prompt supervision, feedback and reflection. They should not be relied upon to certify competence and their use for such ends may reduce their effectiveness in training. Their results should be interpreted in the context of multiple other methods of assessment, with the aim of achieving a genuinely holistic and representative assessment of professional competence. © Blackwell Publishing Ltd 2011.
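The AUC of 0.64 reported above can be read as the probability that a randomly chosen trainee in difficulty has a lower mean WPBA score than a randomly chosen trainee not in difficulty. As an illustration of why that figure implies weak screening power, the following is a minimal sketch of this rank-based AUC computation on synthetic scores; the group sizes mirror the study (92 vs. the remainder), but the score distributions are hypothetical, not study data.

```python
import random

def auc(scores_pos, scores_neg):
    """Probability that a randomly chosen positive (in-difficulty) case
    outranks a randomly chosen negative case; ties count half.
    This is the Mann-Whitney U statistic scaled to [0, 1]."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Synthetic illustration (hypothetical distributions, not study data):
# trainees in difficulty tend to have slightly LOWER mean WPBA scores,
# so we rank on the negated score when "in difficulty" is the positive class.
random.seed(0)
in_difficulty = [random.gauss(3.8, 0.5) for _ in range(92)]
others = [random.gauss(4.1, 0.5) for _ in range(200)]

a = auc([-s for s in in_difficulty], [-s for s in others])
print(round(a, 2))  # modest AUC: heavy overlap between the two groups
```

An AUC in this range means the score distributions overlap heavily: any cut-off that flags most trainees in difficulty also flags many who are not, which is why the authors conclude WPBA scores cannot serve as a screening instrument.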

Year:  2011        PMID: 21995509     DOI: 10.1111/j.1365-2923.2011.04056.x

Source DB:  PubMed          Journal:  Med Educ        ISSN: 0308-0110            Impact factor:   6.251


Related articles: 14 in total

1.  Validity and Feasibility of the Minicard Direct Observation Tool in 1 Training Program.

Authors:  Anthony A Donato; Yoon Soo Park; David L George; Alan Schwartz; Rachel Yudkowsky
Journal:  J Grad Med Educ       Date:  2015-06

2.  The modern surgeon and competency assessment: are the workplace-based assessments evidence-based? [Review]

Authors:  K M Torsney; D M Cocker; A A P Slesser
Journal:  World J Surg       Date:  2015-03       Impact factor: 3.352

3.  Evaluation of feedback given to trainees in medical specialties.

Authors:  Tony Ck Tham; Bill Burr; Mairead Boohan
Journal:  Clin Med (Lond)       Date:  2017-07       Impact factor: 2.659

4.  Foundation doctors' experience of their training: a questionnaire study.

Authors:  Benjamin J F Dean; Philip Michael Duggleby
Journal:  JRSM Short Rep       Date:  2013-01-14

5.  Sustained effect of simulation-based ultrasound training on clinical performance: a randomized trial.

Authors:  M G Tolsgaard; C Ringsted; E Dreisler; L N Nørgaard; J H Petersen; M E Madsen; N L C Freiesleben; J L Sørensen; A Tabor
Journal:  Ultrasound Obstet Gynecol       Date:  2015-08-06       Impact factor: 7.299

6.  DOPS (Direct Observation of Procedural Skills) in undergraduate skills-lab: Does it work? Analysis of skills-performance and curricular side effects.

Authors:  Christoph Profanter; Alexander Perathoner
Journal:  GMS Z Med Ausbild       Date:  2015-10-15

7.  A preliminary investigation to explore the cognitive resources of physicians experiencing difficulty in training.

Authors:  Fiona Patterson; Fran Cousans; Iain Coyne; Jo Jones; Sheona Macleod; Lara Zibarras
Journal:  BMC Med Educ       Date:  2017-05-15       Impact factor: 2.463

8.  Direct observation of procedural skills (DOPS) evaluation method: Systematic review of evidence.

Authors:  Masoumeh Erfani Khanghahi; Farbod Ebadi Fard Azar
Journal:  Med J Islam Repub Iran       Date:  2018-06-03

9.  The Effect of Repeated Direct Observation of Procedural Skills (R-DOPS) Assessment Method on the Clinical Skills of Anesthesiology Residents.

Authors:  Shideh Dabir; Mohammad Hoseinzadeh; Faramarz Mosaffa; Behnam Hosseini; Mastaneh Dahi; Maryam Vosoughian; Mohammadreza Moshari; Soodeh Tabashi; Ali Dabbagh
Journal:  Anesth Pain Med       Date:  2021-01-24

10.  Assessment methods in surgical training in the United Kingdom.

Authors:  Evgenios Evgeniou; Loizou Peter; Maria Tsironi; Srinivasan Iyer
Journal:  J Educ Eval Health Prof       Date:  2013-02-05
