
Using a sampling strategy to address psychometric challenges in tutorial-based assessments.

Kevin W Eva, Patty Solomon, Alan J Neville, Michael Ladouceur, Karyn Kaufman, Allyn Walsh, Geoffrey R Norman.

Abstract

INTRODUCTION: Tutorial-based assessment, despite matching the philosophy of educational programmes that emphasize small group learning, remains one of the greatest challenges for educators working in this context. The current study assessed the psychometric characteristics of tutorial-based evaluation using a multiple-sampling approach that requires minimal recording of observations.
METHOD: After reviewing the literature, a simple 3-item evaluation form was created. The items were "Professional Behaviour," "Contribution to Group Process," and "Contribution to Group Content," each explicitly defined on the form. Twenty-five tutors in five different programmes were asked to use the form to evaluate their students (N=169) after every tutorial over the course of an academic unit. Each item was rated on a 10-point scale.
RESULTS: Cronbach's alpha revealed appropriate internal consistency in all five programmes. Test-retest reliability of any single rating was low, but the reliability of the average rating was at least 0.75 in all cases. The construct validity of the tool was supported by the observation of increasing ratings over the course of the academic unit and by the finding that more senior students received higher ratings than more junior students.
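The results above (low single-rating reliability, but average-rating reliability of at least 0.75) reflect the standard psychometric relationship given by the Spearman-Brown prophecy formula. A minimal sketch, with illustrative numbers that are assumptions rather than values reported in the paper:

```python
def spearman_brown(single_rating_reliability: float, n_ratings: int) -> float:
    """Spearman-Brown prophecy: reliability of the mean of n parallel ratings."""
    r = single_rating_reliability
    return n_ratings * r / (1 + (n_ratings - 1) * r)

# Hypothetical example: even a modest single-rating reliability of 0.25,
# averaged over 9 tutorials, reaches the 0.75 level reported in the study.
print(round(spearman_brown(0.25, 9), 2))  # 0.75
```

This is why a "minimal observations often" strategy can work: many low-reliability snapshots, averaged, can yield a dependable score.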
CONCLUSION: Consistent with the context specificity phenomenon, the adoption of a "minimal observations often" approach to tutorial-based assessment appears to maintain better psychometric characteristics than do attempts to assess tutorial performance using more comprehensive measurement tools.

Year:  2006        PMID: 17077987     DOI: 10.1007/s10459-005-2327-z

Source DB:  PubMed          Journal:  Adv Health Sci Educ Theory Pract        ISSN: 1382-4996            Impact factor:   3.853


  4 in total

1.  Systems-based practice defined: taxonomy development and role identification for competency assessment of residents.

Authors:  Mark J Graham; Zoon Naqvi; John Encandela; Kelli J Harding; Madhabi Chatterji
Journal:  J Grad Med Educ       Date:  2009-09

2.  Students and tutors' social representations of assessment in problem-based learning tutorials supporting change.

Authors:  Valdes R Bollela; Manoel H C Gabarra; Caetano da Costa; Rita C P Lima
Journal:  BMC Med Educ       Date:  2009-06-07       Impact factor: 2.463

3.  OSCE rater cognition - an international multi-centre qualitative study.

Authors:  Sarah Hyde; Christine Fessey; Katharine Boursicot; Rhoda MacKenzie; Deirdre McGrath
Journal:  BMC Med Educ       Date:  2022-01-03       Impact factor: 2.463

4.  Multiple tutorial-based assessments: a generalizability study.

Authors:  Christina St-Onge; Eric Frenette; Daniel J Côté; André De Champlain
Journal:  BMC Med Educ       Date:  2014-02-15       Impact factor: 2.463

