Angela M Stover, Lotte Haverman, Hedy A van Oers, Joanne Greenhalgh, Caroline M Potter.
Abstract
PURPOSE: Patient-reported outcome and experience measures (PROMs/PREMs) are well established in research for many health conditions, but barriers persist for implementing them in routine care. Implementation science (IS) offers a potential way forward, but its application has been limited for PROMs/PREMs.
Keywords: Clinical practice; Implementation science; Patient-reported outcome measures; Quality of life; Routine care
Year: 2020 PMID: 32651805 PMCID: PMC8528754 DOI: 10.1007/s11136-020-02564-9
Source DB: PubMed Journal: Qual Life Res ISSN: 0962-9343 Impact factor: 4.147
Fig. 1Relationships between PROM/PREM implementation strategies, implementation science outcomes, and patient outcomes
Fig. 2Four case studies
Key features of widely used implementation science frameworks or theories
| Implementation framework or theory | Type per Nilsen [ ] | Constructs influencing implementation | Case stud(ies) |
|---|---|---|---|
| Consolidated Framework for Implementation Research (CFIR) [ ] | Determinant framework: categorizes implementation barriers/enablers | | Ahmed et al. [ ], van Oers et al. [ ], Manalili and Santana [ ] |
| Theoretical Domains Framework (TDF) [ ] | Determinant framework: categorizes implementation barriers/enablers | Knowledge, skills; Professional role/identity; Beliefs about capabilities; Beliefs about consequences; Reinforcement; Intentions/goals; Environmental context and resources; Social influence; Memory, attention, decision influences; Behavioral regulation | Ahmed et al. [ ] |
| Integrated framework for Promoting Action on Research Implementation in Health Services (i-PARIHS) [ ] | Determinant framework: categorizes implementation barriers/enablers | Person or organization assigned to do the work of facilitation (implementation support); Characteristics of innovation; Degree of fit with existing practice and values; Usability; Relative advantage; Trialability/observable results; Clinical experiences/perceptions; Patient experiences, needs, preferences; Leadership support; Culture, receptivity to change; Evaluation capabilities | Roberts et al. [ ] |
| Knowledge to Action (KTA) [ ] | Process model: describes practical steps in translating research to practice | Knowledge inquiry; Knowledge synthesis; Create knowledge tools; Determine the know/do gap; Adapt knowledge to local context; Assess barriers/facilitators to use; Select, tailor, implement; Monitor knowledge use; Evaluate outcomes; Sustain knowledge use | Manalili and Santana [ ] |
| Normalization Process Theory (NPT) [ ] | Implementation theory: specifies causal mechanisms | Coherence; Cognitive participation; Collective action; Reflexive monitoring | Manalili and Santana [ ] |
PROM paper or electronic patient-reported outcome measure, ePROM electronic patient-reported outcome measure, ePREM electronic patient-reported experience measure
Barriers, enablers, and implementation strategies used in case studies
| Country | Clinical setting | Implemented PROMs or PREMs | IS framework or theory | Implementation barriers identified | Implementation enablers identified | Implementation strategies employed |
|---|---|---|---|---|---|---|
| Eastern Canada [ ] | Chronic pain network including primary care, rehabilitation care, and hospital-based care | ePROMs | CFIR [ ], TDF [ ] | Primary care: • Well-defined clinical process: barriers at clinician level • Lack of knowledge on how to interpret pain PROMs Tertiary care: • Variability in care process: multilevel barriers • Confidentiality concerns • Technology comfort • Perceived increase in workload and time to review PROMs • Perception PROMs may decrease patients’ satisfaction with care • PROMs not integrated in electronic health record • Cost and time to implement | • Existing PROM system easy for clinicians to use and accessible on all forms of devices • Rapid access to PROM results • Selected PROMs that are easy to complete and interpret • Top-down decision from clinic leadership to implement • Created business plan with health system and moved money to clinic budgets • Opinion leader support | • Identify barriers with clinic • Map observed barriers to evidence-based strategies • Training workshop with clinic team (half day) • Local opinion leader with PROM knowledge provided coaching • Educational materials • Onsite tech support • Workflow redesign support • Support to help patients complete PROMs • Examine potential cost savings by triaging patients more efficiently |
| Australia [ | Medical oncology outpatient department | Paper and electronic PROMs | Integrated Framework for Promoting Action on Research Implementation in Health Services (i-PARIHS) [ | • Gaps in infrastructure • Varying workflows • Clinics needed more time than anticipated to implement • Staff felt pressured with competing priorities • Past negative experiences with innovations | • Dedicated facilitator (implementation support role) • Rapid access to PROM results • Research funding • Peer champions for PROMs emerged naturally | • Stakeholder engagement about barriers and context assessments • Workflow assessment and redesign assistance • Training/information resources • Technical support • Rapid cycle testing • Audit and feedback to clinics |
| Netherlands [ ] | Multiple pediatric and adult health conditions | ePROMs | Consolidated Framework for Implementation Research (CFIR) [ ] | • Some clinics undergoing too many change initiatives • PROMs not integrated in EHR • Stakeholders did not see relative advantage of PROMs • Compatibility • No organizational incentives | • Clinicians perceived value • Strong evidence PROMs improve clinical outcomes • Existing online portal is user friendly for patients and clinicians • Existing automated PROM reminders • Existing automatic and direct access to PROM results and visualization for clinicians • Existing ability for multidisciplinary clinic team members to customize PROMs based on patient age, health conditions, etc • Existing clinician self-efficacy | • Stakeholder engagement • PROM integration in EHR • Provided PROM recommendations based on patients’ age and condition • Training • Implementation support team available to all clinics • Annual evaluation meeting with clinics • Reflecting and evaluating on what worked and did not work |
| Western Canada [ | Primary care: implementing ePREMs for quality improvement | ePREMs | Knowledge to Action (KTA) [ CFIR [ Normalization Process Theory (NPT) [ | • Unclear stakeholder preferences and barriers • Unclear what optimal implementation strategies will be for PREMs and whether they differ from PROM strategies | • Research grant support • Collaboration with quality improvement specialists • National policy change: Primary care patient’s medical home encourages patient-centered communication and patient surveys to evaluate effectiveness of practice’s services | • Stakeholder engagement to identify barriers (interviews with clinic teams) • Categorize barriers with theory and map to evidence-based implementation strategies • Training clinic teams • Stakeholder engagement • Onsite coaching • Plan-Do-Study-Act rapid testing cycles • Audit and feedback to clinics • Process evaluation |
Fig. 3PROM/PREM barriers and enablers in case studies
Fig. 4Implementation strategies used in case studies, shown by implementation stage
Comparison of implementation science frameworks used for evaluation
| IS evaluation framework | Construct to evaluate | Construct definition | Similar construct | Case studies |
|---|---|---|---|---|
| Proctor’s outcomes [ | Acceptability | Extent to which implementation stakeholders perceive innovation to be agreeable or palatable | Satisfaction | Ahmed et al. [ Roberts et al. [ van Oers et al. [ |
| Appropriateness | Perceived fit, relevance, or compatibility of innovation for given practice setting | Compatibility, usefulness | ||
| Adoption | Intention, initial decision, or action to employ innovation by service settings (proportion and representativeness) | Uptake | ||
| Feasibility | Extent to which innovation can be successfully used or carried out within given setting | Practicability | ||
| Reach/penetration | Extent to which target population is reached | Service penetration | ||
| Fidelity | Degree to which innovation or implementation strategy delivered as intended | Adherence | ||
| Costs | Financial impact of innovation, including costs, personnel, and clinic and patient time necessary for treatment delivery, or cost of implementation strategy | Cost–benefit, cost-effectiveness | ||
| Sustainability | Extent to which innovation is maintained as intended and/or institutionalized within service setting’s ongoing operations | Maintenance, institutionalized | ||
| Reach, effectiveness, adoption, implementation, and maintenance (RE-AIM) [ ] | Reach | Extent to which target population is reached | Penetration | Manalili and Santana [ ] |
| Effectiveness | Impact of innovation on important outcomes, including potential negative effects, quality of life, and economic | |||
| Adoption | Absolute number, proportion, and representativeness of settings and intervention agents (people who deliver the program) who are willing to initiate a program | Uptake | ||
| Implementation | • At setting level: intervention agents’ fidelity to various elements of innovation’s protocol, including consistency of delivery as intended and time and cost of intervention • At individual level: use of intervention strategies | |||
| Maintenance | • At setting level: extent to which an innovation becomes institutionalized/part of routine practices and policies • At individual level: Long-term effects of innovation on outcomes 6+ months after most recent contact | Sustainability, institutionalized |
Implementation science metrics for evaluating PROM implementation initiatives in routine care settings
| Implementation science construct | Evaluating the innovation (PROMs/PREMs) | Evaluating the implementation strategies |
|---|---|---|
| Acceptability | • % willing to recommend PROMs to other patients • % reporting PROMs helpful in discussing symptoms/symptom management • % reporting ease of use and comprehensibility for PROMs and technology systems | • Stakeholder perceptions of acceptability of implementation strategies (e.g., PROM training session is appropriate length) • Barriers and enablers for implementing PROMs • Related contextual factor: organizational readiness for change |
| Appropriateness | • PROM fit with patient population (e.g., literacy level, technology comfort, language(s), font size, culturally appropriate, meaningful for clinical condition) • PROM fit for clinic team (e.g., PROM easy to interpret, meaningful for clinical care, integrated in electronic health record system, linked clinical decision support) • PROM fit with clinic culture and values • Perceived relative advantage of PROMs vs. usual care • Leadership support for PROMs | • Stakeholder perceptions of clinic needs and resources for implementing PROMs • Fit of potential implementation strategies for specific clinics, their needs and resources, clinic team members, and patient population • Leadership support for implementation strategies (e.g., providing space and time for clinic team to receive training) |
| Feasibility | • Extent to which technology or electronic health record can be developed or modified to administer PROMs and visualize results in a meaningful way for clinicians • If collecting PROMs from home, feasibility testing considers underserved patient groups’ needs and access to internet and habits (or alternative data collection methods like interactive voice response offered) • Consent rate > 70% (if applicable) • How many and which items are missed or skipped (and identifiable patterns) • Length of time for patients to complete the PROM, comprehensibility • Rates of technical issues • Dropout rate for patients • PROM characteristics (e.g., literacy demand, number of items, preliminary psychometric properties if used in new population, validity and reliability evidence for population) | • “Action, actor, context, target, time (AACTT)” framework [ • % clinics completing at least one implementation activity or phase (and/or all activities and implementation phases) • Rates of technical issues for clinics • Stakeholder perceptions of which implementation strategies are possible • Stakeholder perceptions of what to include in PROM training session • Pilot study or rapid cycle testing to determine if implementation strategy is possible (e.g. whether specific workflow change possible in a clinic) • Which implementation activities were completed vs. skipped |
| Adoption | • % of clinics advancing to administering PROMs routinely • Representativeness of clinics willing to initiate PROMs • Underserved patient groups (e.g., older patients) complete PROMs at similar rates to clinic average | • Dropout rate for clinics • Representativeness of clinics completing implementation activities • Stakeholder perceptions and observations on which implementation support strategies were/were not effective in a clinic, and why • How and why clinics operationalized implementation strategies • Minor changes made to implementation strategies to fit local conditions or context (if major changes, see fidelity below) • StaRI reporting guidelines for implementation strategies [ |
| Reach/penetration | • % of patient panel completing ≥ 1 PROM during defined time interval (denominator chosen appropriately: all patients with an in-person visit during time interval, etc.) • % of missing data during defined time interval (with appropriate denominator) • Informed missingness (correlated with patient demographics) • Average # PROMs completed per patient during interval | • % of clinic team participating in implementation strategies • % of clinic team attending training • % of clinic team reporting training helped them understand new role and how to implement in their workflow • Clinicians: % reporting self-efficacy for using PROMs after training |
| Fidelity | • Consistency of PROMs completed by patients (e.g., 80% PROM completion rate for clinic) • % of clinicians who review PROMs with patients during visits • How and why clinics adapted the innovation (e.g., changed PROM timeframe for items) • FRAME framework for reporting adaptions to interventions [ | • FIDELITY framework [ • How and why clinics or support personnel adapted implementation strategies (e.g., changed the PROM training format or content) • % of clinics completing all implementation activities |
| Cost | • Financial, personnel, and time costs to administer and review PROMs on routine basis • Technology costs | • Financial, personnel, technology, and time costs to implement PROMs • Cost of Implementing New Strategies (COINS) [ |
| Sustainability | • Extent to which PROMs become normalized and routinized in a clinic’s workflow • Stakeholder perceptions • Periodically assess whether updates to PROMs are needed | • Routine data-informed feedback to clinic on PROM completion rates, missing data, and informed missingness • Provide additional implementation support to identify and overcome new or ongoing barriers (if needed) • Retraining or “booster” training or train new staff (if needed) |
The two right-hand columns show the important distinction between evaluating perceptions of the innovation (PROMs/PREMs) vs. evaluating implementation strategies
ePROM electronic patient-reported outcome measure, AACTT action, actor, context, target, time framework, StaRI standards for reporting implementation studies guidelines, FRAME framework for reporting adaptations and modifications-enhanced, COINS cost of implementing new strategies scale
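The reach and fidelity rows above reduce to simple proportions once a denominator is fixed, and the table stresses that the denominator must be chosen deliberately (e.g., all patients with an in-person visit during the time interval). As an illustration only, a minimal Python sketch of these two calculations over a hypothetical visit log; the data layout and the `reach`/`completion_rate` helpers are assumptions for the example, not metrics code from the article:

```python
from datetime import date

# Hypothetical visit log: (patient_id, visit_date, completed_prom)
visits = [
    ("p1", date(2020, 3, 2), True),
    ("p1", date(2020, 5, 9), False),
    ("p2", date(2020, 4, 1), True),
    ("p3", date(2020, 4, 20), False),
    ("p4", date(2019, 12, 5), True),  # outside the interval; excluded from both denominators
]

def reach(visits, start, end):
    """% of patients with >= 1 visit in [start, end] who completed >= 1 PROM."""
    in_window = [v for v in visits if start <= v[1] <= end]
    denominator = {pid for pid, _, _ in in_window}          # patients seen in the interval
    numerator = {pid for pid, _, done in in_window if done}  # of those, who completed a PROM
    return 100.0 * len(numerator) / len(denominator) if denominator else 0.0

def completion_rate(visits, start, end):
    """Per-visit PROM completion rate in [start, end] (cf. the fidelity row's
    '80% PROM completion rate for clinic')."""
    in_window = [v for v in visits if start <= v[1] <= end]
    completed = sum(1 for _, _, done in in_window if done)
    return 100.0 * completed / len(in_window) if in_window else 0.0

interval = (date(2020, 1, 1), date(2020, 6, 30))
print(round(reach(visits, *interval), 1))            # → 66.7 (2 of 3 patients in window)
print(round(completion_rate(visits, *interval), 1))  # → 50.0 (2 of 4 visits in window)
```

Note how the same log yields different numbers for the two constructs: reach counts patients once each, while the completion rate counts every visit, which is why the table insists on stating the denominator alongside each percentage.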
General strategies for implementing PROMs/PREMs in routine care (derived from Normalization Process Theory [NPT] [45, 46])
| Core constructs from NPT [ ] | 1. Coherence: Make sense of routine PROMs/PREMs use | 2. Cognitive participation: Engage stakeholders in communities of practice | 3. Collective action: Enact routine PROMs/PREMs use | 4. Reflexive monitoring: Evaluate understanding of routine PROMs/PREMs use |
|---|---|---|---|---|
| Overlap with relevant domains from widely used implementation science frameworks | Feasibility (stakeholder perceptions) | (e.g., engaging opinion leaders, internal implementation leaders, champions, external change agents) | Cost | |
| Implementation strategies identified in case studies | • Stakeholder engagement • Provide evidence about clinical validity of PROMs/PREMs | • Training workshops • Workflow redesign • Implementation support team | • Context assessments • Technology support • Practice facilitator | • Annual evaluation meetings with clinics • Audit and feedback |
PROM patient-reported outcome measure, PREM patient-reported experience measure, CFIR consolidated framework for implementation research, i-PARIHS integrated framework for promoting action on research implementation in health services, KTA knowledge to action, TDF theoretical domains framework, NPT normalization process theory, RE-AIM reach, effectiveness, adoption, implementation, maintenance framework