
Can knowledge tests and situational judgement tests predict selection centre performance?

Haroon Ahmed, Melody Rhydderch, Phil Matthews.

Abstract

OBJECTIVES: Written tests are an integral part of selection into general practice specialty training in the UK. Evidence supporting their validity and reliability as shortlisting tools has prompted their introduction into the selection processes of other medical specialties. This study explores whether candidate performance on two written tests predicts performance on subsequent workplace-based simulation exercises.
METHODS: A prospective analysis of candidate performance (n = 135) during the general practice selection process was undertaken. Candidates were shortlisted using their scores on two written tests, a clinical problem-solving test (CPST) and a situational judgement test (SJT). Successful candidates then undertook workplace-based simulation exercises at a selection centre (SC). Scores on the CPST and SJT were correlated with SC scores. Regression analysis was undertaken to explore the predictive validity of the CPST and SJT for SC performance.
RESULTS: The data show that the CPST and SJT are predictive of performance in workplace-based simulations (r = 0.598 for the CPST, r = 0.717 for the SJT). The SJT is a better predictor of SC performance than the CPST (R² = 0.51 versus R² = 0.35). However, the two tests together provide the greatest degree of predictive ability, accounting for 57% of the variance seen in mean scores across SC exercises.
CONCLUSIONS: The CPST and SJT play valuable roles in shortlisting and are predictive of performance in workplace-based SC exercises. This study provides evidence for their continued use in selection for general practice training and for their expansion to other medical specialties.
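As an aside, the variance-explained figures reported above come from ordinary least-squares regression, where R² measures the share of outcome variance accounted for by the predictors. A minimal sketch of that calculation with entirely synthetic, hypothetical scores (none of the study's actual data is reproduced here; the score scales and coefficients are invented for illustration):

```python
import numpy as np

# Synthetic example only: hypothetical CPST, SJT, and selection-centre (SC)
# scores, generated to illustrate how single- and two-predictor R^2 values
# are obtained from an OLS fit.
rng = np.random.default_rng(0)
n = 135                                             # candidate count, as in the study
cpst = rng.normal(60, 10, n)                        # hypothetical CPST scores
sjt = rng.normal(250, 20, n)                        # hypothetical SJT scores
sc = 0.3 * cpst + 0.2 * sjt + rng.normal(0, 5, n)   # simulated mean SC scores

def r_squared(X, y):
    """R^2 of an ordinary least-squares fit of y on X (intercept included)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

print(f"CPST alone: R^2 = {r_squared(cpst[:, None], sc):.2f}")
print(f"SJT alone:  R^2 = {r_squared(sjt[:, None], sc):.2f}")
print(f"Both tests: R^2 = {r_squared(np.column_stack([cpst, sjt]), sc):.2f}")
```

Because the two-predictor model nests each single-predictor model, its R² can never be lower than either one alone, which is why the combined CPST + SJT model explains the most variance in the study (57%).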

Year:  2012        PMID: 22803755     DOI: 10.1111/j.1365-2923.2012.04303.x

Source DB:  PubMed          Journal:  Med Educ        ISSN: 0308-0110            Impact factor:   6.251


Related articles: 5 in total

1.  A Mixed-Methods Analysis in Assessing Students' Professional Development by Applying an Assessment for Learning Approach.

Authors:  Michael J Peeters; Varun A Vaidya
Journal:  Am J Pharm Educ       Date:  2016-06-25       Impact factor: 2.047

2.  A survey-based cross-sectional study of doctors' expectations and experiences of non-technical skills for Out of Hours work.

Authors:  Michael Brown; Dominick Shaw; Sarah Sharples; Ivan Le Jeune; John Blakey
Journal:  BMJ Open       Date:  2015-02-16       Impact factor: 2.692

3.  The social validity of a national assessment centre for selection into general practice training.

Authors:  Annette Burgess; Chris Roberts; Tyler Clark; Karyn Mossman
Journal:  BMC Med Educ       Date:  2014-12-21       Impact factor: 2.463

4.  The validity of a behavioural multiple-mini-interview within an assessment centre for selection into specialty training.

Authors:  Chris Roberts; Tyler Clark; Annette Burgess; Michael Frommer; Marcia Grant; Karyn Mossman
Journal:  BMC Med Educ       Date:  2014-08-13       Impact factor: 2.463

5.  Multiple mini interview (MMI) for general practice training selection in Australia: interviewers' motivation.

Authors:  Annette Burgess; Chris Roberts; Premala Sureshkumar; Karyn Mossman
Journal:  BMC Med Educ       Date:  2018-01-25       Impact factor: 2.463

