| Literature DB >> 18279503 |
Cheryl B Stetler, Brian S Mittman, Joseph Francis.
Abstract
BACKGROUND: Continuing challenges to timely adoption of evidence-based clinical practices in healthcare have generated intense interest in the development and application of new implementation methods and frameworks. These challenges led the United States (U.S.) Department of Veterans Affairs (VA) to create the Quality Enhancement Research Initiative (QUERI) in the late 1990s. QUERI's purpose was to harness VA's health services research expertise and resources in an ongoing system-wide effort to improve the performance of the VA healthcare system and, thus, quality of care for veterans. QUERI in turn created a systematic means of involving VA researchers both in enhancing VA healthcare quality, by implementing evidence-based practices, and in contributing to the continuing development of implementation science. The efforts of VA researchers to improve healthcare delivery practices through QUERI and related initiatives are documented in a growing body of literature. The scientific frameworks and methodological approaches developed and employed by QUERI are less well described. A QUERI Series of articles in Implementation Science will illustrate many of these QUERI tools. This Overview article introduces both QUERI and the Series.
Year: 2008 PMID: 18279503 PMCID: PMC2289837 DOI: 10.1186/1748-5908-3-8
Source DB: PubMed Journal: Implement Sci ISSN: 1748-5908 Impact factor: 7.327
Summary and description of expanded six-step QUERI process model
| 1A. Identify and prioritize high-priority conditions (via a formal ranking procedure) |
| 1B. Identify high-priority clinical practices and outcomes within a selected condition |
| 2A. Identify evidence-based clinical practice guidelines |
| 2B. Identify evidence-based clinical recommendations |
| 2C. Identify evidence-based clinical practices |
| 3A. Measure existing practice patterns and outcomes across VA and identify variations from evidence-based practices ("quality/performance gaps") |
| 3B. Identify determinants of current practices |
| 3C. Diagnose quality/performance gaps |
| 3D. Identify barriers and facilitators to improvement |
| 4A. Identify improvement/implementation strategies, programs and program components or tools |
| 4B. Develop or adapt improvement/implementation strategies, programs and program components or tools |
| 4C. Implement improvement/implementation strategies/programs to address quality gaps |
| 5. Assess improvement program feasibility, implementation and impacts on patient, family and healthcare system processes and outcomes |
| 6. Assess improvement program impacts on health related quality of life (HRQOL) |
| M1. Develop, refine and validate patient registries and databases documenting healthcare organizational features, clinical practices and utilization, and outcomes. |
| M2. Develop and/or evaluate case-finding and screening tools. |
| M3. Develop and/or evaluate measures of healthcare structures, processes and outcomes. |
| C1. Develop and evaluate evidence-based clinical practices and recommendations (clinical research). |
| C2. Develop and evaluate evidence-based health services interventions (health services research). |
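The six steps above can be sketched as an ordered data structure; this is a purely illustrative model (the mapping, function name, and paraphrased sub-step labels are my own, not an official QUERI artifact):

```python
from collections import OrderedDict
from typing import Optional

# Illustrative sketch: the expanded six-step QUERI process model as an
# ordered mapping from step number to its sub-steps. Labels are
# paraphrased from the table above; this is not an official QUERI schema.
QUERI_STEPS = OrderedDict([
    (1, ["1A Identify and prioritize high-priority conditions",
         "1B Identify high-priority clinical practices and outcomes"]),
    (2, ["2A Identify evidence-based clinical practice guidelines",
         "2B Identify evidence-based clinical recommendations",
         "2C Identify evidence-based clinical practices"]),
    (3, ["3A Measure practice patterns and identify quality/performance gaps",
         "3B Identify determinants of current practices",
         "3C Diagnose quality/performance gaps",
         "3D Identify barriers and facilitators to improvement"]),
    (4, ["4A Identify implementation strategies, programs and tools",
         "4B Develop or adapt strategies, programs and tools",
         "4C Implement strategies/programs to address quality gaps"]),
    (5, ["5 Assess program feasibility, implementation and impacts"]),
    (6, ["6 Assess program impacts on HRQOL"]),
])

def next_step(current: int) -> Optional[int]:
    """Return the step number that follows `current`, or None after step 6."""
    steps = list(QUERI_STEPS)
    i = steps.index(current)
    return steps[i + 1] if i + 1 < len(steps) else None
```

The ordering matters: each step consumes the output of the previous one (e.g., the gaps diagnosed in step 3 drive the strategy selection in step 4), which is why a sequence rather than an unordered set is the natural representation.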
QUERI phases of implementation projects/QUERI pipeline
| Phase 1: Pilot project |
| ◆ Small scale study within a single clinic or facility |
| ◆ Used with a substantiated clinical or delivery best practice |
| ◆ Identifies potential issues relative to routine integration of best practice such as acceptability of the recommendation, process barriers, and needed toolkit elements |
| Phase 2: Small-scale, multi-site implementation trial |
| ◆ Relatively modest but multi-site evaluation (e.g., 4-6 facilities within one or two VA regions) |
| ◆ Conducted within a formal research and evaluation framework, e.g., an experimental design. Usually is a hybrid design, i.e., a traditional intervention design plus a descriptive formative evaluation [9] |
| ◆ Requires active research team support and involvement, plus modest real-time refinements to maximize the likelihood of success and to study the process for replication requirements |
| ◆ Enables refinement before larger-scale implementation |
| Phase 3: Large-scale, multi-region implementation trial |
| ◆ Test of large-scale adoption program prior to full VA implementation with 10-20 facilities in 3-5 VA regions |
| ◆ Decreased research team support at local sites and greater involvement of stakeholders, both nationally and locally |
| ◆ Should require fewer real-time refinements of the implementation strategy |
| ◆ Preparation for hand-off at national level |
| Phase 4: System-wide rollout |
| ◆ Implementation of a tested, refined strategy throughout the VA |
| ◆ Existing operations or designated leadership entity deliver the program |
| ◆ Research team support as determined per Phase 3 evaluation |
| ◆ Concurrent and ongoing evaluation, per methodology determined/refined in Phase 3 |
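The pipeline's escalating scope can be summarized in a small data structure; this is an illustrative sketch only (the class, field names, and phase labels are my own shorthand, with facility ranges taken from the table above and `None` standing in for system-wide scope):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Phase:
    """One phase of the QUERI pipeline (illustrative model, not an official schema)."""
    number: int
    label: str
    sites: Optional[range]  # approximate facility count; None = all VA facilities

PIPELINE = [
    Phase(1, "pilot project (single clinic or facility)", range(1, 2)),
    Phase(2, "small-scale, multi-site evaluation", range(4, 7)),
    Phase(3, "large-scale, multi-region trial", range(10, 21)),
    Phase(4, "system-wide rollout", None),
]

def widest_researched_phase() -> Phase:
    """Return the last phase that still has a bounded site range,
    i.e. the largest scale at which formal research evaluation occurs."""
    return [p for p in PIPELINE if p.sites is not None][-1]
```

The bounded ranges make the pipeline's core trade-off visible: as the site count grows from 1 to 4-6 to 10-20 facilities, direct research-team support shrinks and operational stakeholders take over, until Phase 4 hands the program off entirely.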