
Wrestling with the bottom line in medical education.

Anél Wiese, Deirdre Bennett.

Abstract

Year:  2022        PMID: 35877960      PMCID: PMC9545516          DOI: 10.1111/medu.14884

Source DB:  PubMed          Journal:  Med Educ        ISSN: 0308-0110            Impact factor:   7.647


In this issue of Medical Education, Orlik et al. report the extent, nature and range of the literature on economic evaluations of continuing professional development (CPD). They uncovered a paucity of literature in this area, and variable quality where such studies did take place, leaving a substantial gap in our understanding of the cost and value of CPD. With these findings, and against the backdrop of rising health care and education expenditure, Orlik and colleagues urge us to rapidly expand the quality and quantity of economic evaluations to maximise educational gains cost-effectively. We agree that cost-consciousness and the judicious use of resources are critical health care and education priorities. Money matters, and it is a professional responsibility of all stakeholders in health professions education to be stewards of resources. As academics who engage in medical education research on how and why (rather than at what cost) educational interventions work, this call to broaden the scope of economic evaluations has made us wonder how it can be achieved, what it would mean and what it would look like to have a broader evidence base surrounding the economics of education. We are neither economists nor experts in economic evaluation methodologies. Therefore, rather than assuming to have definitive answers, we hope to use this commentary to continue the conversation others have started about the many unanswered questions that remain about the future of economic evaluation in health professions education and research. To evaluate whether education interventions have value, we must understand their context, that is, the conditions or environments that shape the success or failure of an educational programme to generate outcomes. Economic evaluations have historically been blind to context.
This presents a problem for effectively using economic evaluation to understand cost and outcome variation. Resources for the implementation of educational interventions vary depending on existing infrastructure. There can be substantial heterogeneity across sites, populations and perspectives, and the combined costs and benefits may occur at different periods during implementation. The best way to ensure robust economic evaluation is to compare alternative interventions that address the same goals, serve similar populations and use similar methods to measure the desirable outcomes. To us, these seem like unrealistic conditions, considering the complexity of learning (an ongoing, multifaceted and continuous change process) and modern understandings that oppose the idea of linear cause and effect. Orlik and colleagues, along with others, suggest standardising the methods used for educational interventions such as CPD programmes, as well as standardised frameworks for reporting the economic impact of these programmes and activities. Uniformity across studies would aid comparability, but would it bring us closer to achieving our educational goals, or would it amount to changing education for the sake of research rather than conducting research to improve education? Despite its intuitive appeal, there are reasons to be cautious about a role for standardisation that could impose boundaries on the scope of economic evaluation and limit the usefulness of its findings. Operationally, decision-makers may not be able to respond to the implied priorities of economic evaluation because of a lack of flexibility and relevance for local systems.
Furthermore, creating optimal conditions to generate comparative data for economic analyses might have the unintended consequence of impeding our ability to adapt educational design to generate desired learning outcomes in the long run. Economic evaluations flourish best in areas where outcomes are easily quantifiable, such as the direct costs or economic utility of diagnosis, treatment or health outcomes (e.g. the quality-adjusted life-year). Knowledge and skills can be quantified to some extent, depending on the assessment setting, but a wide range of educational outcomes cannot be calculated or expressed in monetary terms. Professional identity formation, for example, is a critical outcome that cannot be quantified because it emerges over time through a trajectory of meaningful experiences. As constructivists, we adopt a holistic view of learning and accept that learning is not restricted to what is consciously apparent and empirically identifiable. Constructivist views, which underpin a large proportion of education activities, are seemingly incompatible with the positivist assumptions of economic evaluation. This points to an important epistemological clash between orthodox economic evaluation methods and fit-for-purpose methods tailored to the complexity of our field. If the trade-off between standardisation and complexity highlights several difficulties, how can stakeholders across settings and views generate, interpret and apply implementation costs and cost-effectiveness findings? To address this dilemma, Rees et al. have done preliminary work to introduce a novel methodology combining economic evaluation and realist methods.
Building on the proposition that integrating explanations of resource use and cost-effectiveness can be done within realist evaluation, this approach lays the foundation for explaining why educational programmes are economically optimal in specific contexts. That is, Rees and colleagues' work offers an exciting opportunity to address the context dilemma of economic evaluations. However, this approach is still in the early stages of development, and combining economic analysis with the principles of realist evaluation is not without its challenges. All that said, it is important to recognise that economic evaluation is still in its infancy in health education. Although we have yet to see evidence that it has been substantially influential in driving curricular change, we suspect that greater energy directed at providing robust economic research will lead decision-makers to use these analyses to make decisions about educational investment. That makes it all the more important that the issues outlined above are carefully and wisely considered early in this area's development. Beyond its instrumental purpose, embedding complexity thinking in economic evaluation offers potential gains in shaping key stakeholders' thinking on educational issues. There could be rich learning involved in this type of research that may not have an immediate and direct impact on curricular change but rather influences the way we conceptualise learning and the values that shape our field's practices.
To achieve that, we will need to move from the low base that has been set for economic evaluation towards improving the scope and quality of research on offer in this area. Bringing these types of analyses more in line with the values and goals of our field would strengthen their utility. Others have suggested that specialised research such as economic evaluation should be left to those better equipped for it, in a way that implies it is best left to researchers outside our field. This idea has merit, but, on the other hand, we have a long tradition of adopting methodologies from outside our field and applying them successfully to our needs. There is no reason to believe we cannot achieve this again with economic evaluation methods. Before committing resources to upskilling researchers and broadening research approaches, however, we should continue to reflexively discuss how best to approach such endeavours to serve our field's needs and priorities.
References: 11 in total

1.  Cost analyses approaches in medical education: there are no simple solutions.

Authors:  Kieran Walsh; Henry Levin; Peter Jaye; James Gazzard
Journal:  Med Educ       Date:  2013-10       Impact factor: 6.251

2.  Cost evaluations in health professions education: a systematic review of methods and reporting quality.

Authors:  Jonathan Foo; David A Cook; Kieran Walsh; Robert Golub; Mohamed Elhassan Abdalla; Dragan Ilic; Stephen Maloney
Journal:  Med Educ       Date:  2019-08-11       Impact factor: 6.251

3.  The applicability of generalisability and bias to health professions education's research.

Authors:  Lara Varpio; Bridget O'Brien; Charlotte E Rees; Lynn Monrouxe; Rola Ajjawi; Elise Paradis
Journal:  Med Educ       Date:  2020-09-27       Impact factor: 6.251

4.  You can't put a value on that… Or can you? Economic evaluation in simulation-based medical education.

Authors:  Debra Nestel; Victoria Brazil; Margaret Hay
Journal:  Med Educ       Date:  2018-02       Impact factor: 6.251

5.  Costs and Economic Impacts of Physician Continuous Professional Development: A Systematic Scoping Review.

Authors:  David A Cook; Christopher R Stephenson; John M Wilkinson; Stephen Maloney; Barbara L Baasch Thomas; Larry J Prokop; Jonathan Foo
Journal:  Acad Med       Date:  2022-01-01       Impact factor: 6.893

6.  Unpacking economic programme theory for supervision training: Preliminary steps towards realist economic evaluation.

Authors:  Charlotte E Rees; Jonathan Foo; Van N B Nguyen; Vicki Edouard; Stephen Maloney; Ella Ottrey; Claire Palermo
Journal:  Med Educ       Date:  2021-12-19       Impact factor: 6.251

7.  Balancing the effectiveness and cost of online education: A preliminary realist economic evaluation.

Authors:  Charlotte E Rees; Van N B Nguyen; Jonathan Foo; Vicki Edouard; Stephen Maloney; Claire Palermo
Journal:  Med Teach       Date:  2022-04-06       Impact factor: 4.277

8.  Realism and resources: Towards more explanatory economic evaluation.

Authors:  Rob Anderson; Rebecca Hardwick
Journal:  Evaluation (Lond)       Date:  2016-06-11

9.  Wrestling with the bottom line in medical education.

Authors:  Anél Wiese; Deirdre Bennett
Journal:  Med Educ       Date:  2022-08-02       Impact factor: 7.647

10.  Economic evaluation of CPD activities for healthcare professionals: A scoping review.

Authors:  Witold Orlik; Giuseppe Aleo; Thomas Kearns; Jonathan Briody; Jane Wray; Paul Mahon; Mario Gazić; Normela Radoš; Cristina García Vivar; Manuel Lillo Crespo; Catherine Fitzgerald
Journal:  Med Educ       Date:  2022-04-29       Impact factor: 7.647
