The association between implementation strategy use and the uptake of hepatitis C treatment in a national sample.

Shari S Rogal1,2,3, Vera Yakovchenko4, Thomas J Waltz5,6, Byron J Powell7, JoAnn E Kirchner8, Enola K Proctor9, Rachel Gonzalez10, Angela Park11, David Ross12, Timothy R Morgan10, Maggie Chartier12, Matthew J Chinman13,14.   

Abstract

BACKGROUND: Hepatitis C virus (HCV) is a common and highly morbid illness. New medications that have much higher cure rates have become the new evidence-based practice in the field. Understanding the implementation of these new medications nationally provides an opportunity to advance the understanding of the role of implementation strategies in clinical outcomes on a large scale. The Expert Recommendations for Implementing Change (ERIC) study defined discrete implementation strategies and clustered these strategies into groups. The present evaluation assessed the use of these strategies and clusters in the context of HCV treatment across the US Department of Veterans Affairs (VA), Veterans Health Administration, the largest provider of HCV care nationally.
METHODS: A 73-item electronic survey was developed and sent to all VA sites treating HCV to assess whether or not a site used each ERIC-defined implementation strategy related to employing the new HCV medications in 2014. VA national data regarding the number of Veterans starting on the new HCV medications at each site were collected. The associations between treatment starts and the number and type of implementation strategies were assessed.
RESULTS: A total of 80 (62%) sites responded. Respondents endorsed an average of 25 ± 14 strategies. The number of treatment starts was positively correlated with the total number of strategies endorsed (r = 0.43, p < 0.001). Quartile of treatment starts was significantly associated with the number of strategies endorsed (p < 0.01), with the top quartile endorsing a median of 33 strategies, compared to 15 strategies in the lowest quartile. There were significant differences in the types of strategies endorsed by sites in the highest and lowest quartiles of treatment starts. Four of the 10 top strategies for sites in the top quartile had significant correlations with treatment starts compared to only 1 of the 10 top strategies in the bottom quartile sites. Overall, only 3 of the top 15 most frequently used strategies were associated with treatment.
CONCLUSIONS: These results suggest that sites that used a greater number of implementation strategies were able to deliver more evidence-based treatment in HCV. The current assessment also demonstrates the feasibility of electronic self-reporting to evaluate ERIC strategies on a large scale. These results provide initial evidence for the clinical relevance of the ERIC strategies in a real-world implementation setting on a large scale. This is an initial step in identifying which strategies are associated with the uptake of evidence-based practices in nationwide healthcare systems.

Keywords:  Feasibility; Importance; Interferon-free medications

Year:  2017        PMID: 28494811      PMCID: PMC5425997          DOI: 10.1186/s13012-017-0588-6

Source DB:  PubMed          Journal:  Implement Sci        ISSN: 1748-5908            Impact factor:   7.327


Background

A great deal of research now clearly shows that moving effective programs and practices into routine care settings requires the skillful use of implementation strategies, defined as “methods or techniques used to enhance the adoption, implementation, and sustainability of a clinical program or practice” [1]. Implementation strategies vary widely, as do their labels. In order to generate a common nomenclature for implementation strategies and facilitate standardization of research methods in implementation science, the Expert Recommendations for Implementing Change (ERIC) study [2] engaged experts in modified-Delphi and concept mapping exercises to (1) refine a compilation of implementation strategies and (2) develop conceptually distinct categories of implementation strategies. This led to a compilation of 73 discrete implementation strategies (e.g., access new funding, audit and provide feedback, facilitation) [3], which were further organized into nine clusters [3]. These clusters include changing infrastructure, utilizing financial strategies, supporting clinicians, providing interactive assistance, training and educating stakeholders, adapting and tailoring to the context, developing stakeholder interrelationships, using evaluative and iterative strategies, and engaging consumers. This study is the first attempt to empirically determine whether these strategies and clusters of strategies are associated with the uptake of evidence-based practices (EBP) within the US Department of Veterans Affairs (VA), Veterans Health Administration (VHA).

Hepatitis C virus (HCV) infection is a leading cause of cirrhosis and liver cancer [4] in the USA and in VA. VA is the single largest provider of HCV care in the USA, with approximately 174,000 Veterans who were potentially eligible for treatment in 2015 [5].
In the past, HCV treatment required interferon-based therapies, which involved long courses of injections and had numerous side effects and contraindications. These barriers resulted in only 23% of Veterans with HCV ever receiving treatment with these regimens [6]. Starting in December 2013, the first interferon-free drug combinations, Direct Acting Antivirals (DAAs), with substantially fewer side effects, shorter treatment courses, and a higher cure rate, were FDA approved for specific genotypes of HCV. By October 2014, the start of fiscal year (FY) 2015, interferon-free combinations were available for all genotypes of HCV, making them the new evidence-based practice for treating HCV [7-13]. However, these innovative treatments and their high costs posed a significant challenge to VA, requiring the healthcare system to adjust policies, resource availability, and staffing [6]. Reaching and treating Veterans infected with HCV required significant restructuring to more rapidly deliver medications, expand the reach of treatment to Veterans who were previously ineligible, and bring Veterans into care who were not previously engaged in care. In order to accomplish these goals and facilitate the uptake of the EBP, VA developed a novel program, the Hepatitis C Innovation Team Collaborative (HIT). In FY 2015, all 21 regional administrative centers, or Veterans Integrated Service Networks (VISNs), were directed to form multidisciplinary teams. Team members represented multiple sites (e.g., medical centers and outpatient clinics) within their region. Teams were both financially and organizationally supported to develop strategies, utilizing Lean principles of quality improvement [14], to increase treatment rates and improve access to and quality of HCV care. Engagement in the Collaborative varied by site, and despite centralized support, local providers were free to select implementation strategies.
The focus of this assessment was on understanding which strategies were chosen and the impact of these strategies on treatment outcomes. The availability of interferon-free HCV treatments on the national VA pharmacy formulary, sufficient funding to provide broad access to these medications, and local flexibility to choose implementation strategies across the VHA represented a unique laboratory in which to understand how a variety of implementation strategies affected the uptake of a highly evidence-based innovation. We hypothesized that the number of implementation strategies endorsed would be associated with increased uptake of the innovation (i.e., increased starts of interferon-free medications).

Methods

Overview

We assessed the uptake of strategies as defined by the ERIC project [2]. To develop the implementation assessment, the original 73 ERIC strategy descriptions were tailored to interferon-free treatment in VA and the strategies were organized by cluster. The survey was iteratively vetted by HIT leader stakeholders, five HCV treatment providers, and a psychometrician for readability and understandability. Care was taken to ensure fidelity to the original, previously defined strategies. Table 1 shows the questions in order of presentation organized by cluster. The survey asked about strategies used in fiscal year 2015 (FY15) to increase interferon-free HCV treatment at their VA medical center (VAMC). For each strategy, participants were asked, “Did you use X strategy to promote HCV care in your center?” The questions were anchored to FY15 so they could be linked to treatment data over the same time period. The Pittsburgh VA IRB determined that the initiative was exempt under a provision applying to quality improvement. This assessment was approved as a quality improvement project by the VA’s HIV, Hepatitis and Related Conditions Programs in the Office of Specialty Care Services as a part of the HIT Collaborative evaluation. All participation was completely voluntary.
Table 1

Strategies by cluster and correlation with treatment starts

No. | Strategy | Sites N (%) | Correlation (r) | P value

In FY15 did your center use any of these infrastructure changes to promote HCV care in your center?
1* | Change physical structure and equipment (e.g., purchase a FibroScan, expand clinic space, open new clinics) | 42 (53) | 0.36 | <0.01
2 | Change the record systems (e.g., locally create new or update to national clinical reminder in CPRS, develop standardized note templates) | 57 (71) | −0.02 | 0.89
3* | Change the location of clinical service sites (e.g., extend HCV care to the CBOCs) | 21 (26) | 0.36 | <0.01
4 | Develop a separate organization or group responsible for disseminating HCV care (outside of the HIT Collaborative) | 18 (23) | 0.21 | 0.07
5 | Mandate changes to HCV care (e.g., when you changed to the new HCV medications was this based on a leadership mandate?) | 44 (55) | 0.05 | 0.69
6 | Create or change credentialing and/or licensure standards (e.g., change scopes of practice or service agreements) | 23 (29) | 0.01 | 0.92
7* | Participate in liability reform efforts that make clinicians more willing to deliver the clinical innovation | 3 (4) | 0.23 | 0.04
8* | Change accreditation or membership requirements | 3 (4) | 0.23 | 0.04
In FY15 did your center use any of these financial strategies to promote HCV care in your center?
9 | Access new funding (this DOES NOT include funding from national VA for the medications, but should include receiving funds from the HIT Collaborative to your center) | 24 (30) | 0.20 | 0.08
10 | Alter incentive/allowance structures | 4 (5) | 0.04 | 0.76
11 | Provide financial disincentives for failure to implement or use the clinical innovations | 0 (0) | n/a | n/a
12 | Respond to proposals to deliver HCV care (e.g., submit a HIT proposal to obtain money for your center specifically) | 35 (44) | 0.19 | 0.11
13 | Change billing (e.g., create new clinic codes for billing for HCV treatment or HCV education) | 9 (11) | 0.17 | 0.15
14 | Place HCV medications on the formulary | 56 (70) | −0.05 | 0.67
15 | Alter patient fees | 0 (0) | n/a | n/a
16 | Use capitated payments | 0 (0) | n/a | n/a
17 | Use other payment schemes | 4 (5) | 0.22 | 0.06
In FY15 did your center use any of these strategies to support clinicians to promote HCV care in your center?
18* | Create new clinical teams (e.g., interdisciplinary clinical working groups) | 37 (46) | 0.25 | 0.04
19 | Facilitate the relay of clinical data to providers (e.g., provide outcome data to providers) | 45 (56) | 0.20 | 0.09
20* | Revise professional roles (e.g., allow the pharmacist to see and treat patients in the clinic) | 57 (71) | 0.24 | 0.04
21 | Develop reminder systems for clinicians (e.g., use CPRS reminders) | 27 (34) | −0.16 | 0.19
22* | Develop resource sharing agreements (e.g., partner with the VERC, the HITs, or other organizations with the resources to help implement changes) | 21 (26) | 0.24 | 0.04
In FY15 did your center employ any of these activities to provide interactive assistance to promote HCV care in your center?
23 | Use outside assistance often called “facilitation” (e.g., coaching, education, and/or feedback from the facilitator) | 6 (8) | 0.16 | 0.17
24* | Have someone from inside the clinic or center (often called “local technical assistance”) tasked with assisting the clinic | 12 (15) | 0.38 | <0.01
25* | Provide clinical supervision (e.g., train providers) | 35 (44) | 0.29 | 0.01
26* | Use a centralized system (i.e., from the VISN) to deliver facilitation | 22 (28) | 0.38 | <0.01
In FY15 did your center employ any of these activities to tailor HCV care in your center?
27 | Use data experts to manage HCV data (e.g., use the VERC, pharmacy benefits management, VISN, or CCR data experts to track patients or promote care) | 46 (58) | 0.18 | 0.12
28 | Use data warehousing techniques (e.g., dashboard, clinical case registry, CDW) | 68 (85) | 0.15 | 0.19
29 | Tailor strategies to deliver HCV care (i.e., alter HCV care to address barriers to care that you identified in your population using data you collected) | 50 (63) | 0.21 | 0.08
30 | Promote adaptability (i.e., identify the ways HCV care can be tailored to meet local needs and clarify which elements of care must be maintained to preserve fidelity) | 44 (55) | 0.16 | 0.17
In FY15 did your center employ any of these activities to train or educate providers to promote HCV care in your center?
31* | Conduct educational meetings | 41 (51) | 0.24 | 0.05
32* | Have an expert in HCV care meet with providers to educate them | 33 (41) | 0.34 | <0.01
33* | Provide ongoing HCV training | 39 (49) | 0.26 | 0.03
34* | Facilitate the formation of groups of providers and foster a collaborative learning environment | 35 (44) | 0.38 | <0.01
35 | Develop formal educational materials | 31 (39) | 0.00 | 0.97
36 | Distribute educational materials (e.g., guidelines, manuals, or toolkits) | 44 (55) | 0.11 | 0.35
37 | Provide ongoing consultation with one or more HCV treatment experts | 46 (58) | 0.11 | 0.37
38 | Train designated clinicians to train others (e.g., primary care providers, SCAN-ECHO) | 16 (20) | −0.07 | 0.56
39* | Vary the information delivery methods to cater to different learning styles when presenting new information | 29 (36) | 0.29 | 0.02
40 | Give providers opportunities to shadow other experts in HCV | 26 (33) | 0.12 | 0.32
41 | Use educational institutions to train clinicians | 9 (11) | 0.21 | 0.07
In FY15 did your center employ any of these activities to develop stakeholder interrelationships to promote HCV care in your center?
42* | Build a local coalition/team to address challenges | 42 (53) | 0.27 | 0.03
43* | Conduct local consensus discussions (i.e., determine how to change things by having meetings with local leaders and providers) | 38 (48) | 0.42 | <0.01
44 | Obtain formal written commitments from key partners that state what they will do to implement HCV care (e.g., written agreements with CBOCs) | 3 (4) | 0.20 | 0.09
45* | Recruit, designate, and/or train leaders | 21 (26) | 0.29 | 0.01
46* | Inform local opinion leaders about advances in HCV care | 39 (49) | 0.33 | <0.01
47* | Share the knowledge gained from quality improvement efforts with other sites outside your medical center | 30 (38) | 0.32 | <0.01
48* | Identify and prepare champions (i.e., select key individuals who will dedicate themselves to promoting HCV care) | 40 (50) | 0.29 | 0.01
49 | Organize support teams of clinicians who are caring for patients with HCV and give them time to share the lessons learned and support one another’s learning | 21 (26) | 0.16 | 0.18
50 | Use advisory boards and interdisciplinary workgroups to provide input into HCV policies and elicit recommendations | 21 (26) | 0.09 | 0.46
51 | Seek the guidance of experts in implementation | 35 (44) | −0.01 | 0.92
52* | Build on existing high-quality working relationships and networks to promote information sharing and problem solving related to implementing HCV care | 49 (61) | 0.24 | 0.04
53* | Use modeling or simulated change | 10 (13) | 0.25 | 0.04
54* | Partner with a university to share ideas | 11 (14) | 0.27 | 0.02
55* | Make efforts to identify early adopters to learn from their experiences | 13 (16) | 0.32 | <0.01
56* | Visit other sites outside your medical center to try to learn from their experiences | 12 (15) | 0.30 | 0.01
57 | Develop an implementation glossary | 2 (3) | 0.17 | 0.15
58 | Involve executive boards | 18 (23) | 0.15 | 0.21
In FY15 did your center employ any of these evaluative and iterative strategies to promote HCV care in your center?
59 | Assess for readiness and identify barriers and facilitators to change (e.g., administer the organizational readiness to change survey) | 21 (26) | 0.16 | 0.20
60 | Conduct a local needs assessment (i.e., collect data to determine how best to change things) | 36 (45) | 0.12 | 0.31
61 | Develop a formal implementation blueprint (i.e., make a written plan of goals and strategies) | 27 (34) | 0.11 | 0.37
62 | Start with small pilot studies and then scale them up | 18 (23) | 0.08 | 0.50
63* | Collect and summarize clinical performance data and give it to clinicians and administrators to implement changes in a cyclical fashion using small tests of change before making system-wide changes | 17 (21) | 0.25 | 0.04
64 | Conduct small tests of change, measure outcomes, and then refine these tests | 15 (19) | 0.11 | 0.36
65 | Develop and use tools for quality monitoring (this includes standards, protocols, and measures to monitor quality) | 33 (41) | 0.07 | 0.56
66 | Develop and organize systems that monitor clinical processes and/or outcomes for the purpose of quality assurance and improvement (i.e., create an overall system for monitoring quality, not just tools to use in quality monitoring, which is addressed in the last item) | 24 (30) | 0.18 | 0.14
67 | Intentionally examine the efforts to promote HCV care | 49 (61) | 0.08 | 0.49
68 | Develop strategies to obtain and use patient and family feedback | 16 (20) | −0.11 | 0.35
In FY15 did your center employ any of these strategies to engage patient consumers to promote HCV care in your center?
69 | Involve patients/consumers and family members | 40 (50) | 0.01 | 0.91
70* | Engage in efforts to prepare patients to be active participants in HCV care (e.g., conduct education sessions to teach patients about what questions to ask about HCV treatment) | 50 (63) | 0.39 | <0.01
71 | Intervene with patients/consumers to promote uptake and adherence to HCV treatment | 57 (71) | 0.08 | 0.51
72 | Use mass media (e.g., local public service announcements; magazines like VANGUARD, newsletters, online/social media outlets) to reach large numbers of people | 14 (18) | 0.00 | 0.98
73 | Promote demand for HCV care among patients through any other means | 32 (40) | 0.19 | 0.12

*Statistically significant strategies (p ≤ 0.05) are marked with an asterisk

Participation sites and recruitment

The HIT Collaborative provided the contact information for VA HCV providers and HIT members representing all 130 individual VA sites. The sites, for the purpose of this assessment, were defined as all VA medical “stations” as classified by Population Health Services of the VA. These stations include a larger medical center, and some of these stations have smaller satellite sites. All sites within a station would be included in the measures of treatment starts for the station. However, most of the treatment starts are coordinated by and occur in the larger medical center within these stations. This assessment included all stations, herein deemed “sites,” regardless of participation in the HIT Collaborative. Respondents were surveyed via an email link to an online survey portal. A modified Dillman approach [15] was used to promote high response, with two mass emails and one individual email to reach potential participants. In order to maximize survey completion rates, the recruitment email was cosigned by the HIT Collaborative Leadership team. Additionally, the Leadership Team provided coordinated communication with HIT members on regularly scheduled calls so that providers were aware of the assessment and its purpose. At sites with multiple respondents, we assessed interrater reliability but ultimately retained one respondent per site. The retention of one respondent was based on the “key informant” technique, [16] in part to reduce bias of increased reporting from the sites with duplicate responses. If an HCV lead clinician, designated by the VAMC, was available and responded, their answer was retained; otherwise, if there were multiple respondents, they were prioritized by who would know the most about HCV treatment in the following order (established a priori): physician, pharmacist, advanced practice provider (nurse practitioner or physician assistant), other provider, and system redesign staff.
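The Results report an interrater reliability of 0.66 for sites with duplicate responses, without naming the statistic. As a minimal sketch, assuming a chance-corrected agreement measure such as Cohen's kappa over the binary yes/no strategy endorsements (an assumption; the paper does not specify), the computation for one site's two raters would look like:

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' binary (0/1) endorsements.

    Illustrative only: the paper reports interrater reliability (0.66)
    without specifying the statistic used.
    """
    n = len(rater_a)
    # Observed agreement: fraction of strategies both raters scored identically
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal "yes" rate
    pa, pb = sum(rater_a) / n, sum(rater_b) / n
    p_e = pa * pb + (1 - pa) * (1 - pb)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical endorsements of 8 strategies by two respondents at one site
site_lead = [1, 1, 1, 0, 0, 0, 1, 0]
second_rater = [1, 1, 0, 0, 0, 1, 1, 0]
print(round(cohens_kappa(site_lead, second_rater), 2))  # → 0.5
```

Under the key-informant rule described above, only one of these two responses (the HCV lead clinician's, if available) would be retained for analysis.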

Measures and data collection

The primary outcome of interest was the number of Veterans started on the new interferon-free medications for HCV during FY15 from each VAMC (deemed “treatment starts”), which was obtained from VA’s population health intranet database [5]. A secondary outcome was the proportion of viremic patients so treated, assessed by dividing the number of patients started on the medications by the number of Veterans with known active HCV infection in need of treatment at each site. VA uses a Clinical Case Registry (CCR) to validate HCV cases at each site, and the numbers are reported nationally. A local coordinator is sent the results of patients who are found to be HCV positive, and the coordinator at that site determines whether the patient truly has HCV. Veterans are considered to be viremic if they have a positive viral load on their most recent testing, are alive at the end of the year of interest, and have been validated as HCV positive through the CCR. In order to be part of a site’s patient load, Veterans had to be considered “in care,” meaning that they needed to have had an encounter or medication fill within the prior 365 days at that medical facility. Descriptive statistics were used to describe the frequency of strategy and cluster endorsement and the association between strategy use and medication starts. Strategy data were analyzed in two ways: (1) as the total number of strategies endorsed and (2) as the number of strategies endorsed within each cluster. As described in Waltz et al. [17], strategies were grouped into quadrants via combinations of importance (i.e., how vital a strategy was rated to be in improving implementation, grouped into low and high categories) and feasibility (i.e., how possible a certain strategy is to do, also grouped into low and high categories).
Using the definitions per Waltz et al., these quadrants included high feasibility/high importance strategies (quadrant 1), low importance/high feasibility (quadrant 2), low importance/low feasibility (quadrant 3), and high importance/low feasibility (quadrant 4) [3]. A key covariate used in the analyses is “site complexity.” Site complexity in VA was assessed using ratings from the VHA Facility Complexity Model. First created in 1989 and regularly updated [18], the ratings combine site level of care acuity, services available, research dollars, and patients served. The ratings include standardized classifications into levels 1a, 1b, 1c, 2, and 3, in decreasing levels of complexity. Thus, level 1a facilities have high volume, high-risk patients, the most complex clinical programs, and largest teaching and research programs, and level 3 programs have low volume, low-risk patients, few or no complex clinical programs, and small or no research or teaching programs [18]. We had conducted a prior survey of the programs treating HCV, asking respondents to self-report the number of providers treating HCV at each VA site. These data were added to the dataset in order to determine whether staffing related to treatment starts and the proportion of viremic patients treated.
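As a concrete check of the secondary outcome's arithmetic, the site-level proportion treated is simply treatment starts divided by viremic, in-care, CCR-validated patients. Using the site medians reported later in the Results (197 starts, 1149 potentially eligible Veterans; a ratio of medians, so illustrative only, not a reported statistic):

```python
# Proportion of viremic patients treated = treatment starts / viremic in-care
# patients. Values are the reported site MEDIANS, so this ratio is only an
# illustration of the calculation, not a statistic from the paper.
treatment_starts = 197   # median FY15 starts per responding site
viremic_in_care = 1149   # median potentially eligible Veterans per site
proportion_treated = treatment_starts / viremic_in_care
print(f"{proportion_treated:.0%}")  # → 17%
```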

Data analysis

All analyses were conducted using the R statistical package. Non-parametric statistical tests were used to assess the associations between dependent and independent variables, including Wilcoxon rank-sum testing for treatment by strategy use and Spearman’s correlation testing for continuous variables. Interrater reliability was calculated for sites with duplicate responses. We analyzed whether treatment was associated with using high-feasibility or high-importance strategies and also strategies in the “Go-zone” of high feasibility and importance (zone 1). A multivariable linear regression model was constructed to assess whether treatment starts were associated with implementation strategies, controlling for facility characteristics.
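The analyses were run in R; as a rough language-agnostic sketch of the core test, Spearman's rho is the Pearson correlation computed on average ranks. The sites and counts below are made up for illustration:

```python
def average_ranks(xs):
    # 1-based ranks; tied values share the mean of their rank positions
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        mean_rank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = mean_rank
        i = j + 1
    return ranks

def spearman(x, y):
    # Spearman's rho = Pearson correlation of the two rank vectors
    rx, ry = average_ranks(x), average_ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical sites: strategies endorsed vs. FY15 treatment starts
strategies = [5, 12, 20, 33, 41, 15, 28, 9]
starts = [60, 150, 210, 330, 400, 120, 310, 90]
print(round(spearman(strategies, starts), 2))  # → 0.98
```

In R this would be a one-liner (`cor.test(..., method = "spearman")`); the sketch just makes the rank-then-correlate logic explicit.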

Results

Of the 130 unique stations engaged in HCV treatment in VA that were contacted for the assessment, 80 (62%) provided responses. These 80 sites were responsible for 68% of national HCV treatment starts in FY15 (n = 20,503). A total of 133 responses were obtained; 53 were omitted from further analysis (15 opened the survey and did not respond to any questions, 29 were duplicate responses from 19 sites, and 9 did not list an associated VAMC and could not be linked to outcomes). Among the 19 sites with duplicate responses, interrater reliability was 0.66. All 21 VA-defined regions of the country (VISNs, as defined in FY15) were represented. Table 2 illustrates respondent characteristics and shows that respondents were predominantly from the specialty services of gastroenterology/hepatology or infectious disease. Pharmacists made up the largest group of respondents. There were no significant associations between participant characteristics and treatment starts or number of strategies used. The majority of sites represented were complexity level 1 (Table 2). In addition to standard complexity scoring, we analyzed the number of providers treating HCV during the FY of interest from a prior assessment. These data were available for 49 of the sites (61%). The median number of providers treating HCV in these sites was 5 (IQR = 3, 7), including physicians, advanced practitioners, pharmacists, and nurses.
Table 2

Respondent characteristics

Characteristic | N (sites) | Percentage
Years in VA
 <3 | 13 | 16
 4 to 9 | 25 | 31
 10 to 19 | 25 | 31
 >20 | 17 | 21
Specialty
 Gastroenterology/hepatology | 33 | 41
 Infectious disease | 17 | 21
 Pharmacy | 13 | 16
 Primary care | 8 | 10
 Other (VERC, transplant) | 9 | 11
Degree
 PharmD | 35 | 44
 NP | 13 | 16
 MD | 11 | 14
 PA | 5 | 6
 RN | 2 | 3
 Other | 14 | 18
Site complexity
 1a | 27 | 33
 1b | 14 | 18
 1c | 12 | 15
 2 | 14 | 18
 3 | 12 | 15
Among responding sites, the median number of treatment starts was 197 (IQR = 124, 312). The median number of Veterans potentially eligible for treatment at the start of FY15 at these sites was 1149 (IQR = 636, 1744). The proportion of viremic patients treated ranged from 6 to 47%, with a mean (SD) of 20 ± 8%. The sites that did not respond had a median number of treatment starts of 142 (IQR = 88, 296), which was not significantly different from responding sites (p = 0.07). The median number of Veterans potentially eligible for treatment at the start of FY15 at these non-responding sites was 874 (IQR = 494, 1716), which was not significantly different from responding sites (p = 0.43). The proportion of viremic patients treated at non-responding sites ranged from <1 to 46%, with a mean of 17 ± 10%, which was not statistically different from responding sites (p = 0.15).

Table 1 shows the strategies in order of presentation on the survey with the questions as they were asked. Sites endorsed between 1 and 59 strategies, with an average of 25 ± 14. Quartile of treatment starts was significantly associated with the number of strategies endorsed (p < 0.01), with the top quartile endorsing a median of 33 strategies, compared to 15 strategies in the lowest quartile. Table 1 shows the frequency of strategy endorsement and the association between strategies and treatment starts. The most frequently used strategies included data warehousing techniques (e.g., using a dashboard; 85%) and intervening with patients to promote uptake and adherence to HCV treatment (71%). A total of 28 of the 73 specific strategies were associated with treatment starts (Table 1). Associations between treatment starts and strategies were assessed with both correlation coefficients and Wilcoxon rank-sum tests in order to assess the stability of results. Notably, the strategies that were significant were consistent regardless of the type of statistical test applied.
Sites using at least 15 of these significant strategies (n = 20) had increased median treatment starts (320 vs. 177, p < 0.001). There was a moderate, positive correlation between overall total number of strategies and treatment starts, which was statistically significant (r = 0.43, p < 0.001). The strategies most correlated with medication initiations were “conduct local consensus discussions” (0.42, p < 0.001), “engage in efforts to prepare patients to be active participants in HCV care” (0.39, p < 0.001), “facilitate the formation of groups of providers and [foster] a collaborative learning environment” (0.38, p = 0.001), “use a centralized system to deliver facilitation” (0.38, p = 0.001), “have someone from inside the clinic or center (often called local technical assistance) tasked with assisting the clinic” (0.38, p = 0.001), “revise professional roles” (0.36, p = 0.053), “conduct local consensus discussions” (0.36, p = 0.002), “change physical structure and equipment” (0.36, p = 0.002), and “change location of clinical service sites” (0.36, p = 0.001). The strategies were from different clusters, though there were two strategies each from the interactive assistance and infrastructure change clusters. In order to assess the impact of strategies on treatment starts while accounting for facility characteristics, a multivariable model was fit to assess whether the number of strategies used remained associated with treatment while controlling for facility complexity. In this regression model, facility complexity and number of strategies were both significantly associated with treatment starts. The adjusted R² for the model was 30%. The individual strategies were assessed by cluster to determine the relationships between clusters of strategies and treatment starts. Figure 1 shows the frequency with which strategies were endorsed by cluster.
Fig. 1

Endorsement of strategies by cluster

Table 3 shows that the number of strategies used within a cluster was also significantly associated with treatment starts for each individual cluster. The clusters with the highest proportion of significant strategies were “provide interactive assistance” (75% of strategies significantly associated with treatment) and “develop stakeholder interrelationships” (64% of strategies significantly associated with treatment) and those with the least were “tailor to the context” and “financial strategies.” None of the individual financial strategies or tailoring strategies were associated with treatment starts. Otherwise, there was at least one strategy significantly correlated with number of treatment starts in every cluster (Table 1).
Table 3

Correlation between items endorsed in the cluster and treatment starts

Implementation strategy cluster | No. of strategies | No. of endorsements (per strategy in cluster) | Correlation with treatment starts (r) | R² | P value | N (%) of strategies in cluster associated with treatment starts
Provide interactive assistance | 4 | 75 (19) | 0.46 | 21% | <0.001 | 3 (75%)
Develop stakeholder interrelationships | 17 | 405 (24) | 0.44 | 20% | <0.001 | 11 (64%)
Train and educate stakeholders | 11 | 349 (32) | 0.33 | 11% | 0.003 | 5 (45%)
Adapt and tailor to context | 4 | 208 (52) | 0.31 | 10% | 0.004 | 0 (0%)
Change infrastructure | 8 | 211 (26) | 0.29 | 9% | 0.008 | 4 (50%)
Support clinicians | 5 | 187 (37) | 0.29 | 8% | 0.009 | 3 (60%)
Engage consumers | 5 | 193 (39) | 0.27 | 7% | 0.016 | 1 (20%)
Financial strategies | 9 | 141 (16) | 0.26 | 7% | 0.020 | 0 (0%)
Use evaluative and iterative strategies | 10 | 191 (19) | 0.23 | 5% | 0.043 | 1 (10%)
Figure 1 graphically depicts the density of endorsement across clusters. Notably, the most commonly used strategies were not those most strongly associated with treatment starts. While “providing interactive assistance” was the cluster most strongly associated with treatment, it was among the least-endorsed clusters. Conversely, “adapting and tailoring to the context” was among the most highly and densely endorsed clusters, but no strategies within this cluster were associated with treatment. Waltz et al. grouped strategies into quadrants by perceived feasibility (low/high) and importance (low/high), and a composite score [17]. Table 4 demonstrates how strategies in these quadrants were used by respondents. Multiple strategies were applied from each quadrant, indicating that perceived low feasibility, as defined in the ERIC project, was not a barrier to reported uptake of strategies. Treatment starts were associated with the number of strategies in all quadrants. The numbers of both “high feasibility” and “high importance” strategies endorsed by sites were associated with the number of treatment starts (r = 0.37, p < 0.001 and r = 0.39, p < 0.001, respectively). However, having a higher proportion of strategies from the “high importance” or “high feasibility” groups was not associated with higher treatment rates. In fact, the raw correlation coefficients between strategy use and treatment starts were higher for the low-feasibility quadrants than for the high-feasibility quadrants.
Table 4

Quadrant assessment

Quadrant | Description | Strategies in quadrant, n | Endorsements of strategies in quadrant by respondents, n | Endorsements per strategy | Strategies associated with treatment starts, n (% of quadrant) | Correlation of strategies used with treatment starts, r (p) | Correlation of strategies used with number viremic, r (p)
1 | High importance, high feasibility | 31 | 966 | 31 | 10 (32%) | 0.35 (0.002) | 0.38 (<0.001)
2 | Low importance, high feasibility | 11 | 215 | 20 | 5 (45%) | 0.37 (<0.001) | 0.38 (<0.001)
3 | Low importance, low feasibility | 22 | 542 | 25 | 9 (41%) | 0.44 (<0.001) | 0.37 (<0.001)
4 | High importance, low feasibility | 9 | 293 | 33 | 4 (44%) | 0.44 (<0.001) | 0.41 (<0.001)
Table 5 illustrates the strategies that were most commonly endorsed among sites in the high and low quartiles of treatment starts. Five of the strategies were shared between the groups (appearing in both columns of the table), and five were distinct between the groups of sites. There were differences in the strategies used by high- and low-quartile sites: low-quartile sites were more likely to endorse “mandating change” and “changing the record system,” while high-quartile sites were more likely to “change equipment and physical structures” as well as to “facilitate relay of clinical data to providers.”
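The quartile comparison described above can be sketched as follows: assign sites to quartiles of treatment starts, then summarize strategy endorsements within each quartile. The sites and counts here are hypothetical, and Python's `statistics.quantiles` stands in for whatever statistical package the authors used:

```python
from statistics import median, quantiles

# Hypothetical (treatment_starts, strategies_endorsed) pairs, one per site
sites = [(5, 12), (8, 15), (12, 18), (20, 22),
         (35, 24), (50, 28), (70, 33), (95, 38)]

starts = [s for s, _ in sites]
q1, q2, q3 = quantiles(starts, n=4)  # quartile cut points of treatment starts

def quartile(x):
    """Return 1-4 for the quartile of treatment starts that x falls in."""
    if x <= q1: return 1
    if x <= q2: return 2
    if x <= q3: return 3
    return 4

by_quartile = {}
for s, n_strategies in sites:
    by_quartile.setdefault(quartile(s), []).append(n_strategies)

# Median number of strategies endorsed per quartile of treatment starts
median_strategies = {q: median(v) for q, v in sorted(by_quartile.items())}
```

With the study's real data, this summary yielded a median of 33 strategies in the top quartile versus 15 in the bottom.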
Table 5

Most commonly used strategies in the top and bottom quartile of treatment starts

Top treating quartile | Cluster | N | Quadrant | Bottom treating quartile | Cluster | N | Quadrant
Revise professional roles^a | Support clinicians | 14 | 3 | Intentionally examine the efforts to promote HCV care | Evaluative | 9 | 1
Identify and prepare champions^a | Interrelationships | 14 | 1 | Place HCV medications on the formulary | Financial | 13 | 4
Tailor strategies to deliver HCV care | Tailor | 15 | 1 | Provide ongoing consultation with one or more HCV treatment experts | Train/educate | 9 | 1
Engage in efforts to prepare patients to be active participants in HCV care^a | Consumers | 16 | 4 | Mandate changes to HCV care | Infrastructure | 13 | 3
Change the record systems | Infrastructure | 14 | 3 | Develop reminder systems for clinicians | Support | 9 | 2
Intervene with patients/consumers to promote uptake and adherence to HCV treatment | Consumers | 17 | 4 | Intervene with patients/consumers to promote uptake and adherence to HCV treatment | Consumers | 14 | 4
Use data warehousing techniques | Tailor | 19 | 3 | Use data warehousing techniques | Tailor | 16 | 3
Distribute educational materials | Train/educate | 14 | 1 | Distribute educational materials | Train/educate | 9 | 1
Facilitate the relay of clinical data to providers | Support | 15 | 1 | Facilitate the relay of clinical data to providers | Support | 11 | 1
Build on existing high-quality working relationships and networks to promote information sharing and problem solving related to implementing HCV care^a | Interrelationships | 15 | 3 | Build on existing high-quality working relationships and networks to promote information sharing and problem solving related to implementing HCV care^a | Interrelationships | 9 | 3

^a Strategies significantly correlated with treatment starts (see Table 2)

In addition to assessing treatment starts, the proportion of viremic patients treated was assessed. This measure was not significantly associated with use of any particular strategy or cluster, with the number of strategies used overall, or with facility complexity or number of providers. Figure 2 shows that there was no drop-off in participation as the assessment progressed, plotting responses by number of strategies endorsed and by order of presentation in the assessment. Participants endorsed as many items at the end of the survey as at the start, including participants who endorsed fewer than 5 items, suggesting that there was no bias towards selecting items presented earlier in the assessment. The figure also graphically illustrates the density of endorsement by strategy and cluster.
Fig. 2

Strategies in order of presentation on the survey by participant. This density plot represents strategy endorsements made by each participant in the order in which they were presented in the survey from left to right. Respondents, represented by rows, were sorted by the number of strategies they endorsed with those endorsing the least at the top of the plot and those endorsing the most at the bottom

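A density plot of the kind described in the Figure 2 caption can be sketched by sorting a binary respondent-by-strategy matrix by row totals and rendering each endorsement. The matrix below is hypothetical, not the study's data:

```python
# Hypothetical binary endorsement matrix: rows = respondents,
# columns = strategies in the order presented on the survey
responses = [
    [1, 0, 1, 1, 0, 1, 1, 1],
    [0, 0, 1, 0, 0, 0, 1, 0],
    [1, 1, 1, 1, 1, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 0, 0],
]

# Sort respondents by total endorsements, fewest first (as in Fig. 2)
rows = sorted(responses, key=sum)

# Text rendering: '#' = endorsed, '.' = not endorsed
for row in rows:
    print("".join("#" if v else "." for v in row))
```

A uniform scatter of `#` marks from left to right across rows is what supports the paper's claim that endorsement did not drop off later in the survey.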

Discussion

Implementation strategies that VA medical centers used to promote HCV treatment with interferon-free medications were assessed based on the nomenclature developed in the ERIC study. Specifically, we explored whether the ERIC strategies and clusters of strategies were associated with implementation of an innovative evidence-based practice, in this case the use of interferon-free medications for HCV. This practice is highly evidence-based, uniformly and simply applied, widely implemented, and easily and reliably documented and extracted from the medical record, making it an ideal case for understanding how implementation strategies function in a real-world context on a large scale. The presented data suggest that the ERIC strategies are empirically related to the use of interferon-free medications in a large, nationwide scale-up effort to increase the uptake of this evidence-based practice. These data support the hypothesis that the use of more strategies is associated with increased use of interferon-free medications for HCV. However, the most commonly used strategies and clusters of strategies were not those associated with the highest number of treatment starts. Only 3 of the 15 most-endorsed strategies were correlated with treatment starts: “revise professional roles,” “build on existing high-quality working relationships and networks to promote information sharing and problem solving,” and “engage in efforts to prepare patients to be active participants.” Whether the 12 non-correlated strategies serve meaningful supportive roles for the strategies that do correlate with treatment starts (i.e., whether they are necessary but not sufficient) is unknown.
The clusters with the highest percentage of strategies associated with treatment rates were “providing interactive assistance,” “supporting clinicians,” and “developing stakeholder interrelationships.” The importance of these kinds of interactive and supportive strategies is consistent with the literature on implementation “facilitation,” in which outside aid is provided to help sites with the wide range of activities needed to start new initiatives [19]. Overall, sites used multiple strategies, including those previously identified as less feasible by implementation researchers and clinical managers [3]. In fact, the two “low feasibility” quadrants (both high and low importance) had numerically higher correlations with treatment starts than the “high feasibility” quadrants. It is possible that the sites able to carry out more difficult and complex (i.e., less feasible) implementation strategies, including interactive assistance, were able to treat more patients. Additionally, these findings may indicate that feasibility and importance vary across innovations, contexts, and recipients. An alternative explanation is that the feasibility of strategies as identified in a theoretical process (as ERIC was) does not accurately reflect what is applied in routine clinical implementation. Future research may need to assess how perceptions of feasibility and importance operate across different clinical contexts. The data regarding successful strategies are consistent with prior implementation research on the efficacy of specific strategies. For example, “mandating change” was often endorsed by the sites in the lowest quartile of treatment and not by the higher-treating sites, which is consistent with prior evidence for the minimal effectiveness of top-down mandates in implementing change [20, 21]. Overall, five of the top ten strategies for the top and bottom quartiles were shared by both groups.
The sites in the highest quartile of treatment tended to use more interpersonally focused strategies (3 of their 5 distinct strategies: “revising professional roles,” “preparing champions,” and “preparing patients”), while only 1 of the 5 distinct strategies in the lowest quartile of treatment was interpersonally focused (“consultation”). However, no specific clusters or quadrants predicted being in the highest vs. lowest quartile of treatment. None of the strategies within the financial cluster was significantly associated with treatment starts. There are a variety of financial strategies, and several are not applicable within VA. For example, sites do not have the flexibility to change billing, provide financial incentives, use capitated payments, or alter patient fees. Sites do have the ability to create new clinic codes (for example, when pharmacists open new clinics), and a few sites endorsed this item. “Accessing new funding” via grants or the HIT program and “responding to proposals” were items frequently endorsed by sites. These were not associated with treatment starts, possibly because the financial component provided by the HIT was modest. One financial strategy of note was adding medications to the formulary. VA fiscal and formulary changes are applied nationally; thus, the HCV medications were placed on the national formulary at the same time for all sites. It is therefore notable that the low-treating sites were more likely to endorse “placing HCV medications on the formulary” as among their most frequently used strategies. This likely reflects the lack of other active local strategies in these sites: they were more likely to attribute the national formulary change to the actions of their own site than were the sites that were more active in their implementation. It is possible that an association between the strategies used and treatment starts could be due simply to facility characteristics.
For example, larger sites may have more capacity, in the form of staffing and resources, to engage in treatment. However, this was not found in this sample: the number of staff was not significantly associated with the number of treatment starts, and the number of providers was not significantly correlated with the number of strategies or the number of significant strategies. More complex medical centers were, however, more likely to choose the strategies that were significantly associated with treatment starts. Future work should focus on the interactions between the types of strategies used and facility complexity. This assessment was an example of adapting the ERIC strategies to a specific clinical issue. While the survey items maintained fidelity to each of the 73 implementation strategies in ERIC, they were also tailored to HCV with specific examples. The process required vetting the survey with stakeholders to ensure that the examples were relevant and understandable. Presenting complex implementation terms in long lists to stakeholders proved feasible, with a reasonable response rate in this national sample. The high interrater reliability among participants from sites with multiple respondents suggests that participants interpreted the strategies consistently within sites. However, more qualitative work will be required to determine how community stakeholders do or do not distinguish between particular strategies. This general approach of using a structured survey could be used to track strategy use over time in implementation research and practice. Failing to accurately track strategy use limits the ability to explain how and why specific efforts succeed or fail. This type of approach could move implementation science towards a more comprehensive understanding of how various strategies operate across settings and disease states.
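The paper does not state which statistic underlies its interrater reliability claim; as one hypothetical illustration, Cohen's kappa could quantify chance-corrected agreement between two respondents' binary strategy endorsements from the same site:

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two raters' binary endorsements of the same items."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n     # observed agreement
    p_yes = (sum(a) / n) * (sum(b) / n)            # chance both endorse
    p_no = (1 - sum(a) / n) * (1 - sum(b) / n)     # chance both decline
    pe = p_yes + p_no                              # chance-expected agreement
    return (po - pe) / (1 - pe)

# Hypothetical: two respondents from one site rating 10 strategies
r1 = [1, 1, 0, 1, 0, 1, 1, 0, 0, 1]
r2 = [1, 1, 0, 1, 0, 1, 0, 0, 1, 1]
kappa = cohens_kappa(r1, r2)
```

Values near 1 indicate near-perfect agreement; values near 0 indicate agreement no better than chance.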
There is currently a paucity of measures to assess implementation strategy use [22], and this study contributes to the literature by providing one approach: assessing strategy use through an online survey. There were several limitations of this evaluation. Although this was a national assessment with a high response rate, the total number of respondents was small compared to the number of strategies investigated, which limited the power to assess multiple variables in the same models. This was a cross-sectional assessment; future assessments will allow us to better understand longitudinal associations of the strategies with treatment rates over time. Over FY15, there were multiple national policy shifts that dramatically and transiently impacted funding for HCV treatment. These changes likely impacted treatment rates independent of implementation strategy use; however, given that they occurred at a national level, such secular changes would theoretically affect sites uniformly. In this investigation, we were unable to assess interactions between policy changes and implementation strategies, though this would be of interest. While these assessments within VA allowed us to examine implementation strategies in a national system, the external validity in non-VA systems will need to be investigated in future studies. The effects of patient factors, particularly gender, will also need to be assessed, given the predominance of men in the VA system. While we chose a key informant technique, using one respondent per center, and had high interrater reliability where there was more than one respondent, future investigations should assess whom to sample and how. Additionally, the strategies can be interpreted differently by different stakeholders. For example, “placing HCV medications on the formulary” was interpreted as a site-level strategy by some sites and not others, despite the fact that VA has a national formulary.
While treatment starts were significantly and moderately correlated with strategies, other factors are likely involved in determining the number of treatment starts, including patient factors and unmeasured site factors. It is also possible that sites employed additional implementation strategies that were not captured by the ERIC taxonomy, or that they were not able to recall. Sites' endorsements of strategies also do not explain how the strategies were used or the extent of their reach. While our methods did not allow us to understand the sequencing, intensity, or fidelity of each implementation strategy, the results demonstrate the feasibility of assessing a wide range of strategies nationally and the importance of strategy choice even in the context of a simple and highly evidence-based practice. While this study is a first step towards understanding the role of strategy selection in clinical outcomes, there are several avenues for future research. Qualitative work could help determine the rationale for strategy choices and the perceptions of effectiveness. Future research should also assess the effectiveness of directing sites to specific strategies that have been thoughtfully selected based upon theory, evidence, and stakeholder input, as implementation experts have advocated [23]. Additionally, an understanding of the cost-effectiveness of specific strategies would add to the field.

Conclusions

This study is a first step towards better understanding how specific implementation strategies and clusters of strategies influence the uptake of a highly evidence-based, uniform innovation. The results demonstrate that the number of implementation strategies used was associated with a meaningful clinical outcome. These results provide initial evidence for the clinical relevance of the ERIC strategies in a real-world setting on a large scale.
References (18 in total; first 10 listed)

1. Thorpe C, Ryan B, McLean SL, Burt A, Stewart M, Brown JB, Reid GJ, Harris S. How to obtain excellent response rates when surveying physicians. Fam Pract. 2008.
2. Hull MW, Yoshida EM, Montaner JSG. Update on current evidence for hepatitis C therapeutic options in HCV mono-infected patients. Curr Infect Dis Rep. 2016.
3. Li R, Zhang P, Barker L, Hartsfield D. Impact of state mandatory insurance coverage on the use of diabetes preventive care. BMC Health Serv Res. 2010.
4. Afdhal N, Zeuzem S, Kwo P, et al. Ledipasvir and sofosbuvir for untreated HCV genotype 1 infection. N Engl J Med. 2014.
5. Feld JJ, Kowdley KV, Coakley E, et al. Treatment of HCV with ABT-450/r-ombitasvir and dasabuvir with ribavirin. N Engl J Med. 2014.
6. Global, regional, and national age-sex specific all-cause and cause-specific mortality for 240 causes of death, 1990-2013: a systematic analysis for the Global Burden of Disease Study 2013. Lancet. 2014.
7. Cokkinides V, Bandi P, Shah M, Virgo K, Ward E. The association between state mandates of colorectal cancer screening coverage and colorectal cancer screening utilization among US adults aged 50 to 64 years with health insurance. BMC Health Serv Res. 2011.
8. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, Proctor EK, Kirchner JE. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015.
9. Afdhal N, Reddy KR, Nelson DR, et al. Ledipasvir and sofosbuvir for previously treated HCV genotype 1 infection. N Engl J Med. 2014.
10. Waltz TJ, Powell BJ, Matthieu MM, Damschroder LJ, Chinman MJ, Smith JL, Proctor EK, Kirchner JE. Use of concept mapping to characterize relationships among implementation strategies and assess their feasibility and importance: results from the Expert Recommendations for Implementing Change (ERIC) study. Implement Sci. 2015.
