Guillaume Lamé, Sonya Crowe, Matthew Barclay.
Abstract
Despite an increasing number of papers reporting applications of operational research (OR) to problems in healthcare, there remains little empirical evidence of OR improving healthcare delivery in practice. Without such evidence it is harder both to justify the usefulness of OR to a healthcare audience and to learn and continuously improve our approaches. To progress, we need to build the evidence-base on whether and how OR improves healthcare delivery through careful empirical evaluation. This position paper reviews evaluation standards in healthcare improvement research and dispels some common myths about evaluation. It highlights the current lack of robust evaluation of healthcare OR and makes the case for addressing this. It then proposes possible ways for building better empirical evaluations of OR interventions in healthcare.
Keywords: health quality and evaluation; healthcare improvement research; impact
Year: 2020 PMID: 35127059 PMCID: PMC8812794 DOI: 10.1080/20476965.2020.1857663
Source DB: PubMed Journal: Health Syst (Basingstoke) ISSN: 2047-6965
Figure 1. Positioning current evaluations of OR approaches in a landscape of evaluation approaches.
Figure 2. Generic model of the outcomes of OR interventions (similarities can be noted with the four stages of success in simulation projects suggested by Robinson and Pidd, 1998).
Recommendations
- Think about evaluation from the beginning. Develop a monitoring and evaluation plan when designing OR interventions: What would success look like for this intervention? How could it be assessed? What could go wrong? What type of data is needed for the evaluation?
- Include follow-up periods in projects to assess the impact of OR interventions.
- Collaborate with specialists in qualitative and quantitative evaluation to frame and conduct evaluations. Effective models exist for such collaborations between intervention designers, implementers and evaluators (Brewster et al.).
- Use a programme theory to model interventions and design their evaluation. Theory-driven evaluation is in line with past recommendations in OR (Midgley).
- Adapt evaluation designs to interventions, contexts of implementation and resources (Eccles et al.).
- Think in research programmes rather than projects, with replication of the same intervention across multiple sites. This allows comparisons of the effect of the same intervention (e.g., a group model-building approach, an optimisation strategy or a problem structuring intervention) across settings.
- Incorporate evaluation in OR training curricula, including creating professional development modules on evaluating OR.
- Support the publication of empirical evaluations in OR journals, and prompt authors to report the practical outcomes of OR interventions in their articles or to write follow-up articles on the implementation (or lack thereof).