
Rethinking programme evaluation in health professions education: beyond 'did it work?'.

Faizal Haji, Marie-Paule Morin, Kathryn Parker.

Abstract

CONTEXT: For nearly 40 years, outcome-based models have dominated programme evaluation in health professions education. However, there is increasing recognition that these models cannot address the complexities of the health professions context, and studies employing alternative evaluation approaches are appearing in the literature. A similar paradigm shift occurred over 50 years ago in the broader discipline of programme evaluation. Understanding the development of contemporary paradigms within this field provides important insights to support the evolution of programme evaluation in the health professions.
METHODS: In this discussion paper, we review the historical roots of programme evaluation as a discipline, demonstrating parallels with the dominant approach to evaluation in the health professions. In tracing the evolution of contemporary paradigms within this field, we demonstrate how their aim is not only to judge a programme's merit or worth, but also to generate information for curriculum designers seeking to adapt programmes to evolving contexts, and researchers seeking to generate knowledge to inform the work of others.
DISCUSSION: From this evolution, we distil seven essential elements of educational programmes that should be evaluated to achieve the stated goals. Our formulation is not a prescriptive method for conducting programme evaluation; rather, we use these elements as a guide for the development of a holistic 'programme of evaluation' that involves multiple stakeholders, uses a combination of available models and methods, and occurs throughout the life of a programme. Thus, these elements provide a roadmap for the programme evaluation process, which allows evaluators to move beyond asking whether a programme worked, to establishing how it worked, why it worked and what else happened. By engaging in this process, evaluators will generate a sound understanding of the relationships among programmes, the contexts in which they operate, and the outcomes that result from them. © Blackwell Publishing Ltd 2013.

Mesh:

Year:  2013        PMID: 23488754     DOI: 10.1111/medu.12091

Source DB:  PubMed          Journal:  Med Educ        ISSN: 0308-0110            Impact factor:   6.251


Related articles: 32 in total

1.  Using Data From Program Evaluations for Qualitative Research.

Authors:  Dorene F Balmer; Jennifer A Rama; Maria Athina Tina Martimianakis; Terese Stenfors-Hayes
Journal:  J Grad Med Educ       Date:  2016-12

2.  Identifying High-Impact and Managing Low-Impact Assessment Practices.

Authors:  Kristin K Janke; Katherine A Kelley; Beth A Martin; Mary E Ray; Burgunda V Sweet
Journal:  Am J Pharm Educ       Date:  2019-09       Impact factor: 2.047

3. (Review) A Proposed Framework to Develop, Describe and Evaluate Peer-Assisted Learning Programs.

Authors:  Mohammad Balilah; Mohammad Babgi; Walaa Alnemari; Ahmad Binjabi; Rania Zaini; Altaf Abdulkhaliq; Alaa Monjed; Salwa Aldahlawi; Hani Almoallim
Journal:  Adv Med Educ Pract       Date:  2020-12-22

4.  Use of simulation in teaching haematological aspects to undergraduate medical students improves student's knowledge related to the taught theoretical underpinnings.

Authors:  Laila Alsuwaidi; Jorgen Kristensen; Amar Hk; Saba Al Heialy
Journal:  BMC Med Educ       Date:  2021-05-12       Impact factor: 2.463

5.  A theory-informed, process-oriented Resident Scholarship Program.

Authors:  Satid Thammasitboon; John B Darby; Amy B Hair; Karen M Rose; Mark A Ward; Teri L Turner; Dorene F Balmer
Journal:  Med Educ Online       Date:  2016-06-14

6.  Evaluating the accessibility and utility of HIV-related point-of-care diagnostics for maternal health in rural South Africa: a study protocol.

Authors:  T P Mashamba-Thompson; P K Drain; B Sartorius
Journal:  BMJ Open       Date:  2016-06-27       Impact factor: 2.692

7.  Process-oriented evaluation of an international faculty development program for Asian developing countries: a qualitative study.

Authors:  Do-Hwan Kim; Jong-Hyuk Lee; Jean Park; Jwa-Seop Shin
Journal:  BMC Med Educ       Date:  2017-12-21       Impact factor: 2.463

8.  Parent training programmes for managing infantile colic.

Authors:  Morris Gordon; Jesal Gohil; Shel Sc Banks
Journal:  Cochrane Database Syst Rev       Date:  2019-12-03

9.  Practising what we preach: using cognitive load theory for workshop design and evaluation.

Authors:  Laura M Naismith; Faizal A Haji; Matthew Sibbald; Jeffrey J H Cheung; Walter Tavares; Rodrigo B Cavalcanti
Journal:  Perspect Med Educ       Date:  2015-12

10.  Writing Technical Reports for Simulation in Education for Health Professionals: Suggested Guidelines.

Authors:  Adam Dubrowski; Sabrina Alani; Tina Bankovic; Andrea Crowe; Megan Pollard
Journal:  Cureus       Date:  2015-11-02

Beijing Coyote Bioscience Co., Ltd. © 2022-2023.