
Assessment of CanMEDS roles in postgraduate training: the validation of the Compass.

Fred Tromp, Myrra Vernooij-Dassen, Richard Grol, Anneke Kramer, Ben Bottema.

Abstract

OBJECTIVE: In medical education, the focus has shifted from gaining knowledge to developing competencies. Effectively monitoring performance in practice throughout the entire training requires a new approach to assessment. This study aimed to evaluate an instrument that monitors the development of competencies during postgraduate training in general practice: the Competency Assessment List (Compass).
METHODS: The distribution of scores, reliability, validity, responsiveness and feasibility of the Compass were evaluated.
RESULTS: Compass scores ranged from 1 to 9 on a 10-point scale and showed excellent internal consistency (.89 to .94). Most trainees' ratings improved during training. Medium to large effect sizes (.31-1.41) were found when mean scores of three consecutive periods were compared. The content validity of the Compass was supported by the results of a qualitative study using the RAND modified Delphi method. The feasibility of the Compass was also demonstrated.
CONCLUSION: The Compass is a competency-based instrument that reliably and validly shows trainees' progress towards the standard of performance.
PRACTICE IMPLICATIONS: The programmatic approach of the Compass could be applied in other specialties, provided the instrument is tailored to the specific needs of that specialty.
Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

Year:  2012        PMID: 22796085     DOI: 10.1016/j.pec.2012.06.028

Source DB:  PubMed          Journal:  Patient Educ Couns        ISSN: 0738-3991


Related articles: 8 in total

1.  On Rating Angels: The Halo Effect and Straight Line Scoring.

Authors:  Jonathan Sherbino; Geoff Norman
Journal:  J Grad Med Educ       Date:  2017-12

2.  Development of a Postgraduate Community Pharmacist Specialization Program Using CanMEDS Competencies, and Entrustable Professional Activities.

Authors:  Marnix P D Westein; Harry de Vries; Annemieke Floor; Andries S Koster; Henk Buurma
Journal:  Am J Pharm Educ       Date:  2019-08       Impact factor: 2.047

3.  Assessing family medicine trainees--what can we learn from the European neighbours?

Authors:  Elisabeth Flum; Roar Maagaard; Maciek Godycki-Cwirko; Nigel Scarborough; Nynke Scherpbier; Thomas Ledig; Marco Roos; Jost Steinhäuser
Journal:  GMS Z Med Ausbild       Date:  2015-05-13

4.  Designing faculty development to support the evaluation of resident competency in the intrinsic CanMEDS roles: practical outcomes of an assessment of program director needs.

Authors:  Derek Puddester; Colla J MacDonald; Debbie Clements; Jane Gaffney; Lorne Wiesenfeld
Journal:  BMC Med Educ       Date:  2015-06-05       Impact factor: 2.463

5.  A practical approach to programmatic assessment design.

Authors:  A A Timmerman; J Dijkstra
Journal:  Adv Health Sci Educ Theory Pract       Date:  2017-01-24       Impact factor: 3.853

6.  [Review] Assessing medical professionalism: A systematic review of instruments and their measurement properties.

Authors:  Honghe Li; Ning Ding; Yuanyuan Zhang; Yang Liu; Deliang Wen
Journal:  PLoS One       Date:  2017-05-12       Impact factor: 3.240

7.  How is self-regulated learning documented in e-portfolios of trainees? A content analysis.

Authors:  R van der Gulden; S Heeneman; A W M Kramer; R F J M Laan; N D Scherpbier-de Haan; B P A Thoonen
Journal:  BMC Med Educ       Date:  2020-06-26       Impact factor: 2.463

8.  Unprofessional behaviour of GP residents and its remediation: a qualitative study among supervisors and faculty.

Authors:  Pieter C Barnhoorn; Vera Nierkens; Marianne C Mak-van der Vossen; Mattijs E Numans; Walther N K A van Mook; Anneke W M Kramer
Journal:  BMC Fam Pract       Date:  2021-12-20       Impact factor: 2.497

