
Exploring examiner judgement of professional competence in rater based assessment.

Fiona L Naumann1, Stephen Marshall2, Boaz Shulruf3, Philip D Jones3.   

Abstract

Exercise physiology courses have transitioned to competency-based curricula, forcing universities to rethink assessment to ensure students are competent to practise. This study built on earlier research to explore rater cognition, capturing the factors that contribute to assessors' decision making about students' competency. The aims were to determine the sources of variation in the examination process and to document the factors influencing examiner judgement. Examiner judgement was explored from both quantitative and qualitative perspectives. Twenty-three examiners viewed three video encounters of student performance on an OSCE. Once the performances were rated, analysis of variance was performed to determine where the variance was attributed. A semi-structured interview drew out the examiners' reasoning behind their ratings. Results highlighted variability in the process of observation, judgement and rating, with each examiner viewing student performance through a different lens. At a global level, however, analysis of variance indicated that the examiner had minimal impact on the variance, the majority of which was explained by student performance on the task. One anomaly was noted in the assessment of technical competency, where the examiner had a large impact on the rating, linked to assessing according to curriculum content. The thought processes behind judgement were diverse, and had the qualitative results been used in isolation, they may have led the researchers to conclude that the examined performances would yield widely different ratings. As a cohort, however, the examiners were able to distinguish good from poor levels of competency, with the majority of the variation in ratings linked to the varying ability of the students.

Keywords:  Assessment; Clinical competency; Examiner judgement; Exercise physiology

Year:  2016        PMID: 26796200     DOI: 10.1007/s10459-016-9665-x

Source DB:  PubMed          Journal:  Adv Health Sci Educ Theory Pract        ISSN: 1382-4996            Impact factor:   3.853


Related articles (6 in total)

1.  Enhancing the defensibility of examiners' marks in high stake OSCEs.

Authors:  Boaz Shulruf; Arvin Damodaran; Phil Jones; Sean Kennedy; George Mangos; Anthony J O'Sullivan; Joel Rhee; Silas Taylor; Gary Velan; Peter Harris
Journal:  BMC Med Educ       Date:  2018-01-06       Impact factor: 2.463

2.  Exploring the influence of cultural orientations on assessment of communication behaviours during patient-practitioner interactions.

Authors:  Kyle J Wilby; Marjan J B Govaerts; Zubin Austin; Diana H J M Dolmans
Journal:  BMC Med Educ       Date:  2017-03-21       Impact factor: 2.463

3.  Exploring assessor cognition as a source of score variability in a performance assessment of practice-based competencies.

Authors:  Mary Roduta Roberts; Megan Cook; Iris C I Chao
Journal:  BMC Med Educ       Date:  2020-05-25       Impact factor: 2.463

4.  Students' perception and scores in Paediatrics end-of-clerkship and final professional Objective Structured Clinical Examination (OSCE): A comparative study.

Authors:  Sabeen Abid Khan; Sahira Aaraj; Sidra Talat; Nismat Javed
Journal:  Pak J Med Sci       Date:  2021 Mar-Apr       Impact factor: 1.088

5.  Borderline grades in high stakes clinical examinations: resolving examiner uncertainty.

Authors:  Boaz Shulruf; Barbara-Ann Adelstein; Arvin Damodaran; Peter Harris; Sean Kennedy; Anthony O'Sullivan; Silas Taylor
Journal:  BMC Med Educ       Date:  2018-11-20       Impact factor: 2.463

6.  Insights into student assessment outcomes in rural clinical campuses.

Authors:  Boaz Shulruf; Gary Velan; Lesley Forster; Anthony O'Sullivan; Peter Harris; Silas Taylor
Journal:  BMC Med Educ       Date:  2019-10-18       Impact factor: 2.463

