Todd H Wagner1,2. 1. VA Health Economics Resource Center, 795 Willow Rd., 152-MPD, Menlo Park, CA, 94025, USA. twagner@stanford.edu. 2. Stanford-Surgery Policy Improvement Research and Education Center, Department of Surgery, Stanford School of Medicine, Stanford, CA, USA.
Abstract
BACKGROUND: Hospitals and other health care delivery organizations are sometimes resistant to implementing evidence-based programs, citing unknown budgetary implications. OBJECTIVE: In this paper, I discuss challenges in estimating health care costs in implementation research. DESIGN: A case study with intensive care units highlights how including fixed costs can cloud a short-term analysis. PARTICIPANTS: None. INTERVENTIONS: None. MAIN MEASURES: Health care costs, charges, and payments. KEY RESULTS: Cost data should accurately reflect the opportunity costs for the organization(s) providing care. Opportunity costs are defined as the benefits foregone because the resources were not used in the next best alternative. Because there is no database of opportunity costs, cost studies rely on accounting data, charges, or payments as proxies. Unfortunately, these proxies may not reflect the organization's opportunity costs, especially if the goal is to understand the budgetary impact in the next few years. CONCLUSIONS: Implementation researchers should exclude costs that are fixed in the time period of observation, because these assets (e.g., space) cannot be used in the next best alternative. In addition, it is common to use costs from accounting databases, which implicitly assume health care providers are uniformly efficient. If providers are not operating efficiently, and especially if there is variation in their efficiency, this creates further problems. Implementation scientists should be judicious in their use of cost estimates from accounting data; otherwise, research results can misguide decision makers.
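The abstract's central recommendation — exclude costs that are fixed over the observation window — can be made concrete with a small numeric sketch. The figures and cost items below are invented for illustration only; they do not come from the paper's ICU case study. The sketch contrasts a fully allocated accounting cost with a short-run budget impact that counts only costs avoidable within the analysis horizon.

```python
# Hypothetical illustration of the paper's point: costs fixed over the
# observation window (e.g., ICU space) carry no short-run opportunity
# cost, because the asset cannot be redeployed to a next best alternative.
# All numbers are made up for the sketch.

def short_run_budget_impact(cost_items, horizon_years):
    """Sum only the cost items that are avoidable within the horizon.

    cost_items: list of dicts with 'name', 'annual_cost', and
    'fixed_for_years' (how long the cost cannot be avoided or redeployed).
    """
    impact = 0.0
    for item in cost_items:
        # An item is relevant to the short-run decision only if its
        # commitment period ends within the analysis horizon.
        if item["fixed_for_years"] < horizon_years:
            impact += item["annual_cost"] * horizon_years
    return impact

items = [
    {"name": "ICU space (depreciation)", "annual_cost": 500_000, "fixed_for_years": 20},
    {"name": "nursing staff", "annual_cost": 300_000, "fixed_for_years": 0},
    {"name": "supplies", "annual_cost": 50_000, "fixed_for_years": 0},
]

# Fully allocated (accounting) cost over a 2-year horizon vs. the
# short-run opportunity cost that excludes the fixed space cost.
fully_allocated = sum(i["annual_cost"] for i in items) * 2  # 1,700,000
short_run = short_run_budget_impact(items, 2)               # 700,000.0
print(fully_allocated, short_run)
```

The gap between the two totals is what the abstract warns about: an analysis built on fully allocated accounting costs would overstate the two-year budgetary consequences of a decision that cannot change the space cost in that window.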
Keywords: budgets; cost-benefit analysis; costs and cost analysis; economic models