Jamiu O Busari, Lorette A Stammen, Lokke M Gennissen, Rob M Moonen.
Abstract
INTRODUCTION: The increasing demands for effective and efficient health care delivery systems worldwide have resulted in an expansion of the desired competencies that physicians need to possess upon graduation. Presently, medical residents require additional professional competencies that can prepare them to practice adequately in a continuously changing health care environment. Recent studies show that despite the importance of competency-based training, the development and evaluation of management competencies in residents during residency training is inadequate. The aim of this literature review was to find out which assessment methods are currently being used to evaluate trainees' management competencies and which, if any, of these methods make use of valid and reliable instruments.
Keywords: ACGME; CanMEDs; care management; competency; management
Year: 2014 PMID: 24600299 PMCID: PMC3933344 DOI: 10.2147/AMEP.S58476
Source DB: PubMed Journal: Adv Med Educ Pract ISSN: 1179-7258
Figure 1. Flowchart illustrating the various stages of the inclusion process.
Summary of categories 1 and 2
| Study | Method | Target group (number) | Comments | Statements about assessment tool |
|---|---|---|---|---|
| Essex and Jackson | Pre- and post-test 65-item knowledge questionnaire (true/false/do not know). | N=40 | Highly significant increase in knowledge. | None |
| Junker et al | Pre- and post-test evaluation of perceived knowledge of PM (0–5 scale). | N=16 | 74% increase in perceived knowledge of insurance issues. | None |
| Kochar et al | Development of a 360° assessment tool based on the ACGME competencies. | N=685 residents and fellows in 84 programs | Ongoing redesign of the residency training environment. | None |
| Bayard et al | Pre- and post-test evaluation. | Not reported | (Strongly) agreed that the course was beneficial; increased interest and knowledge in PM. | None |
| Higgins et al | Developed a 360° evaluation tool (45 items plus three open-ended questions) based on the ACGME competencies. | N=6 residents | Offers a comprehensive picture of an individual’s performance and facilitates self-awareness. | None |
| Babitch and Chinsky | Pre- and post-test (MCQ) on executive leadership in health care management. | Not reported | Slight improvement in the post-test. | None |
| Babitch | Pre- and post-test of five questions on comprehension of the lecture content; ACGME- and AAP-based curriculum. | Not reported | 20%–40% improvement in comprehension. | None |
| Hemmer et al | 20-item closed-book pretest and 22- or 50-item open-book/note post-test on knowledge of key topics. Evaluation of curriculum content (scale 1–5). | N=16 | Short-term understanding of leadership and management concepts seems to improve. | Validity was not measured |
| Jones et al | Surgical coding compliance. | Not reported | Surgical coding compliance increased from 36% to 90% over a 12-month period. | None |
| LoPresti et al | Pre- and post-test 40-item knowledge test (25 MCQ + 15 Pick-N questions), constructed according to NBME guidelines, to assess application based on AAFP and ACGME. | N=28 tests by 17 residents | Overall PM knowledge scores improved in 7 of 14 areas in the intervention group. | Judged to have content validity |
| Frohna et al | OSCE, patient satisfaction, 360° evaluation, fishbowl evaluation. | Not reported | Features a multifaceted assessment system that includes elements of each of Miller’s four levels of competence. Critical for obtaining an accurate picture of learners’ competency and may allow finer gradations of competency. | None |
Abbreviations: AAFP, American Academy of Family Physicians; AAP, American Academy of Pediatrics; ACGME, Accreditation Council for Graduate Medical Education; MCQ, multiple-choice question; NBME, National Board of Medical Examiners; OSCE, objective structured clinical examination; PM, practice management; SP, standardized patient encounters.
Summary of categories 3 and 4
| Study | Assessment tool/method | Target group (number) | Comments |
|---|---|---|---|
| Reisdorff et al | Modified a current evaluation instrument based on the ACGME competencies. | Not reported | A practical method was proposed for modifying an existing evaluation tool for assessing residents’ GCs. |
| Swing | Development and implementation of assessment methods based on the ACGME framework: ratings, checklists, 360° evaluations, structured oral examinations, structured case discussions, simulators, models, simulations, and portfolios. | Not reported | Assessment approaches that tend to have higher reliability are OSCEs, SP exams, and checklists. OSCE and SP exams seem to be the best methods for assessment for high-stakes decisions, and complementing them with snapshot assessment methods is advised. 360° evaluations and portfolios may provide unique insights and should be further developed and tested. |
| Reisdorff et al | Modified a 61-item global assessment device. | N=150 emergency medicine residents | Valid evaluation items for assessing GCs were developed using a structured process. It is challenging to measure subjective dynamics in addition to more objective, behavior-driven competencies. |
| Silber et al | Modified global rating form assessment of 23 items based on the ACGME competencies. | N=1,295 residents | Global rating forms were the tool most often used to assess residents’ GCs, but may not be adequate to distinguish among the six ACGME GCs. |
| Weigelt et al | Developed a 23-item 360° evaluation based on the ACGME competencies. | N=10 residents | Compared with traditional ratings, 360° evaluation adds limited new information. |
| Roark et al | Online assessment system for surgical postgraduate ACGME-based education to support implementation of various instruments: mini-clinical evaluation exercise, 360° assessment, oral presentation of research projects, written examination, in-service examination, and documentation of surgical experience. | N=1,336 evaluations of otolaryngology residents | This study revealed a significant difference among evaluators in the total (all GCs) average score. Further conclusions are hard to draw, and further research is needed. |
| Jefferies et al | Developed a 10-station OSCE based on the CanMEDS competencies. | N=24 candidates, 13 faculty members | The OSCE may, despite the resource intensity and high costs, be useful as a reliable, valid assessment method to simultaneously assess multiple GCs. |
| Chou et al | Two-question survey measuring the use of and satisfaction with ITER, MCQ, SAQ, essay, oral examination, OSCE, simulation, and logbook. | N=280 program directors | ITER was the tool most often used to assess management competency. Program directors were concerned about assessment of the management role in their programs. |
| Lee et al | Reported experience using an implementation matrix of the ACGME competencies. | Not reported | Provided specific practical solutions for overcoming the identified barriers. |
| Lurie et al | A systematic review of assessment tools for ACGME competencies. | N=56 articles | No evidence found that current assessment tools can assess the ACGME GCs independently of one another. |
| Varney et al | Designed a developmental criterion-referenced assessment (CoBRA). | N=2,941 CoBRA | Labor- and time-intensive project, but with positive returns: ongoing faculty development, a focus on evaluation as part of teaching, and the explanatory effect on goals and objectives within the department. |
| Garstang et al | Nine-station OSCE based on the ACGME competencies, developed by an OSCE committee (the authors and a department member). Compared with ABPMR scores. | N=9 residents | OSCE management scores showed a nonsignificant positive correlation with ABPMR part 1 scores and a nonsignificant negative correlation with ABPMR part 2 scores. |
| Dyne et al | SBP assessment tools: direct observation, global rating, 360° evaluation, portfolios, standardized oral examinations, written MCQs, chart-stimulated recall oral examination, OSCE, and patient survey. | Not reported | Primary assessment methodologies: direct observation, global ratings, 360° evaluation, and portfolio assessment. Testing: oral and written. |
| Kolva et al | A systematic review of the ACGME competencies in PM curricula. | N=33 articles | Few studies on long-term outcomes of PM-related curricula. Newly developed PM curricular material should be evaluated and the results published. |
Abbreviations: ABIM, American Board of Internal Medicine; ABPMR, American Board of Physical Medicine and Rehabilitation; ACGME, Accreditation Council for Graduate Medical Education; CanMEDS, Canadian Medical Education Directives for Specialists; CoBRA, competency-based resident assessment; GC, general competencies; ITER, in-training evaluation report; MCQ, multiple-choice question; OSCE, objective structured clinical examination; PM, practice management; SAQ, short-answer question; SBP, systems-based practice; SD, standard deviation; SP, standardized patient encounters.