| Literature DB >> 32870158 |
Rocio de la Vega, Lee Ritterband, Tonya M Palermo.
Abstract
BACKGROUND: Digital health interventions have demonstrated efficacy for several conditions, including pediatric chronic pain. However, the process of making interventions available to end users in an efficient and sustained way is challenging and remains a new area of research. To advance this field, comprehensive frameworks have been created.
Entities:
Keywords: adolescents; chronic pain; eHealth; implementation science; mHealth; mobile health; mobile phone
Mesh:
Year: 2020 PMID: 32870158 PMCID: PMC7492980 DOI: 10.2196/19898
Source DB: PubMed Journal: J Med Internet Res ISSN: 1438-8871 Impact factor: 5.428
Summary of implementation outcomes using the Reach, Effectiveness, Adoption, Implementation, and Maintenance framework.
| Domain | Level | Metric | Results | Source |
| --- | --- | --- | --- | --- |
| Reach | User level | Final sample out of planned sample | n=143/120 (119%) | Administrative tracking data |
| Reach | User level | Consents out of eligible referred children | Total N=143/181 (79%); Clinic 1: 4/4 (100%); Clinic 2: 10/15 (67%); Clinic 3: 4/5 (80%); Clinic 4: 6/7 (86%); Clinic 5: 15/17 (88%); Clinic 6: 45/55 (82%); Clinic 7: 15/20 (75%); Clinic 8: 44/68 (76%) | Administrative tracking data |
| Reach | User level | TEI^a mean score and percentage above 27, the moderate acceptability cutoff | Mean 30.7; 86% moderate-to-high acceptability | Patient survey |
| Effectiveness | User level | Change in treatment outcomes | Similar change in pain-related disability in both groups; greater engagement was associated with greater improvement in pain-related disability (Cohen … | Patient survey |
| Adoption | Organization level | Percentage of invited clinics agreeing to participate | All clinics agreed (100%) | Administrative tracking data |
| Adoption | Organization level | Percentage of participating clinics referring patients | All clinics referred patients (100%) | Administrative tracking data |
| Implementation | User level | Completed ≥1 module (engagement) | n=54 (74%) | Back-end data |
| Implementation | User level | Completed 4+ modules (adherence) | n=29 (40%) | Back-end data |
| Implementation | User level | Number of days self-monitoring, pretreatment to posttreatment | Mean 30.5 (SD 29.4); median 19; range 2-56 | Back-end data |
| Implementation | Organization level | Provider attitudes toward the app (1 “Strongly disagree” to 5 “Strongly agree”) | Helpful to provide CBT^b: mean 4.6; Patients would benefit: mean 4.6; Improves quality of care: mean 4.5; Better use of resources: mean 4.4; Fills an important need: mean 4.3 | Provider survey |
| Implementation | Organization level | Actual costs compared with projected costs | The original budget was exceeded by 7% | Budget data |
| Maintenance | User level | New patients using the app during the maintenance period, by referring clinic | Total N=56; Clinic 1: 7; Clinic 2: 0; Clinic 3: 3; Clinic 4: 6; Clinic 5: 26; Clinic 6: 3; Clinic 7: 6; Clinic 8: 5 | Administrative tracking data; app back-end data |
| Maintenance | Organization level | Percentage of clinics agreeing to continue making referrals | All clinics agreed (100%) | Administrative tracking data |
| Maintenance | Organization level | Percentage of clinics making referrals | 7/8 clinics (88%) made referrals | Administrative tracking data |
| Maintenance | Organization level | Providers: “I will encourage my patients to use the app after the study is over” | 92% agreed or strongly agreed | Provider survey |
^a TEI: Treatment Evaluation Inventory.
^b CBT: cognitive behavioral therapy.
Summary of implementation outcomes using the Behavior Interventions using Technology framework.
| Domain | Level | Metric | Results | Source |
| --- | --- | --- | --- | --- |
| Acceptability | User level | TEI^a mean score and percentage above the acceptability cutoff (>27) | Mean 30.7; 86% moderate-to-high acceptability | Patient survey |
| Acceptability | Organization level | Provider attitudes toward the app (1 “Strongly disagree” to 5 “Strongly agree”) | Helpful to provide CBT^b: mean 4.6; My patients would benefit: mean 4.6; Improves the quality of care: mean 4.5; Better use of resources: mean 4.4; Fills an important need: mean 4.3 | Provider survey |
| Adoption | User level | Percentage of participants who downloaded the app | n=68/73 (93%) | Back-end data |
| Adoption | User level | Percentage of participants who used WMM^c after first log-in | n=68 (100%) | Back-end data |
| Adoption | User level | Percentage of participants who completed ≥1 module | n=54 (74%) | Back-end data |
| Appropriateness | User level | Score on app perceptions (0 “Did not like it” to 5 “Liked it very much”) | Appearance: mean 3.6; Navigation: mean 3.9; Theme: mean 3.7; Content: mean 3.3 | Patient survey |
| Feasibility | Organization level | Percentage of clinics agreeing to continue making referrals | All clinics agreed (100%) | Administrative tracking data |
| Feasibility | Organization level | Percentage of clinics making referrals | 7/8 (88%) | Administrative tracking data |
| Feasibility | Organization level | Final sample out of planned sample | n=143/120 (119%) | Administrative tracking data |
| Feasibility | User level | Number of technical issues or complaints reported | n=0 (0%) | Administrative tracking data |
| Feasibility | User level | Participant comments | Not enough space to download the app: n=2 (3%) | Patient survey |
| Feasibility | Organization level | Provider comments | It was easy to refer patients; WMM is something useful that can be integrated into the practice | Provider survey |
| Fidelity | User level | Number of days tracking symptoms | Mean 30.5 (SD 29.4); median 19; range 2-56 | Back-end data |
| Fidelity | User level | Number of participants completing the treatment | n=29 (40%) | Back-end data |
| Implementation cost | Organization level | App development costs | As planned | Budgets |
| Implementation cost | Organization level | Making WMM publicly available | Exceeded budget by 7% | Budgets |
| Penetration | Organization level | New patients using the app during the referral period, by referring clinic | Total N=56; Clinic 1: 7; Clinic 2: 0; Clinic 3: 3; Clinic 4: 6; Clinic 5: 26; Clinic 6: 3; Clinic 7: 6; Clinic 8: 5 | Administrative tracking data; back-end data |
| Sustainability | Organization level | Referrals made | 100% of the clinics agreed; 88% kept referring | Administrative tracking data |
| Sustainability | Organization level | “I will encourage my patients to use the app after the study is over” | 92% agreed or strongly agreed | Provider survey |
^a TEI: Treatment Evaluation Inventory.
^b CBT: cognitive behavioral therapy.
^c WMM: WebMAP Mobile.
Lessons learned: barriers and facilitators in assessing the domains of each framework, and recommendations for future studies.
| Framework and domain | Barriers and facilitators | Recommendations and considerations |
| --- | --- | --- |
| RE-AIM^a: Reach | Organization-level metrics (number of consents obtained out of eligible referrals received) and the overall N were easy to collect because our trial involved user-level referral and tracking. | A CONSORT (Consolidated Standards of Reporting Trials) flow diagram will provide these data. In nonresearch contexts, tracking the users approached, interested, and participating would be needed. |
| RE-AIM: Effectiveness | Collecting effectiveness data from primary and secondary outcome measures was facilitated by web-based survey administration. | This domain is almost always assessed in research studies but may be challenging to assess in nonresearch contexts. Low-demand approaches, such as voluntary web-based surveys, could help gather this information. |
| RE-AIM: Adoption | The number of centers willing to participate and participating was easily assessed with administrative tracking. We originally planned to assess the number of referrals out of the eligible participants; however, clinics were unable to provide information on the age range of their patients or how many had chronic pain, so the number of eligible patients is unknown. | When defining adoption, it is key to understand whether the information required to assess this domain is available. If it is not, alternative metrics should be planned and collected from the beginning of the intervention. |
| RE-AIM: Implementation | Accessing and interpreting the app's back-end data required working with the developers and having a data analyst transform the databases. | It is important to plan the human resources needed and budget for these costs in advance. |
| RE-AIM: Maintenance | We originally planned to track the number of referral flyers given per clinic, but providers did not use the flyers consistently. We therefore used alternative metrics: app data tracking to understand the number of downloads and times the app was used. | Access to the chosen metrics should be ensured from the beginning. Ideally, both subjective and objective measures (eg, asking participants if they are still using the intervention and tracking usage with back-end data) should be collected. |
| BIT^b: Acceptability | Collecting acceptability feedback through web-based surveys was efficient and low burden; however, we were limited to quantitative data to understand perceptions. | Web-based surveys are recommended to assess acceptability, with the caveat that the rich detail of user perceptions may not be possible to gather in this manner. |
| BIT: Adoption | Information retrieved from the app needed several steps of cleaning and database restructuring (and the involvement of personnel with 3 different profiles and skill sets: engineers, a data manager, and a research scientist) before being interpretable. The costs of this process may be a barrier if unplanned for. | Working with engineers and developers from the creation of the intervention and maintaining a dialog about the information (metrics) needed is key to ensuring that adoption can be properly assessed. Budget can be a barrier because it is often expensive to obtain some metrics in a “user-friendly” way (eg, the systems may provide information in a way that is difficult for the lay user to understand). |
| BIT: Appropriateness | A closed-ended patient survey allowed us to collect perceptions about WMM^c appearance, navigation, theme, and content. | Ideally, web-based surveys would be complemented with additional qualitative assessments if costs permit their inclusion. |
| BIT: Feasibility | Technical issues and complaints were carefully tracked, but it is possible that additional problems went unreported. | Participants should be able to easily report technical problems to maximize the chances of reporting. A phone number or contact email (that is attended) should be provided to participants and included in the app or website. |
| BIT: Fidelity | The resources needed to use back-end data also apply to this metric. | Defining “intended” and “actual” use beforehand allows decisions on which metrics to use and planning for the resources needed if back-end data must be retrieved. |
| BIT: Implementation cost | We compared the planned budget with the real expenses as a way to determine the efficiency of resource use. | Deciding whether implementation costs were adequate can be difficult without a reference and is study specific. |
| BIT: Penetration | At the … | Deciding how to assess the extent to which the practice is integrated within the system can be challenging. If unknown, a pilot study could help inform what information is feasible to obtain from the clinics or organizations where the intervention is being implemented. |
| BIT: Sustainability | Using back-end data and checking the activation codes used, we were able to determine the percentage of clinics making referrals during the maintenance period. | Ongoing use of the intervention after the study ends can be challenging to assess because contact with participating centers should be minimal. Collecting information passively (eg, tracking use with back-end data) or with brief web-based surveys is preferred. |
^a RE-AIM: Reach, Effectiveness, Adoption, Implementation, and Maintenance.
^b BIT: Behavior Interventions using Technology.
^c WMM: WebMAP Mobile.
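The percentage metrics reported in the tables are simple rounded ratios of tracking counts. A minimal sketch of how they can be recomputed, assuming the counts given in the tables (the `pct` helper is illustrative, not the study's actual analysis code; the 73-participant denominator for the app-usage metrics is taken from the downloads row):

```python
def pct(numerator: int, denominator: int) -> int:
    """Rounded percentage, matching how results are reported in the tables."""
    return round(100 * numerator / denominator)

# Reach: final sample vs planned sample, and consents vs eligible referrals
print(pct(143, 120))  # 119 -> n=143/120 (119%)
print(pct(143, 181))  # 79  -> Total N=143/181 (79%)

# App-usage metrics from back-end data (denominator: 73 app-group users)
print(pct(68, 73))    # 93 -> downloaded the app
print(pct(54, 73))    # 74 -> completed >=1 module (engagement)
print(pct(29, 73))    # 40 -> completed 4+ modules (adherence)
```

Each printed value reproduces the corresponding percentage in the tables above.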