Eric DA Hermes, Aaron R Lyon, Stephen M Schueller, Joseph E Glass.
Abstract
Behavioral intervention technologies (BITs) are websites, software, mobile apps, and sensors designed to help users address or change behaviors, cognitions, and emotional states. BITs have the potential to transform health care delivery, and early research has produced promising findings of efficacy. BITs also favor new models of health care delivery and provide novel data sources for measurement. However, there are few examples of successful BIT implementation, little consensus on how BIT implementation should be measured, and inadequate descriptions of such measurement. The aim of this viewpoint paper is to provide an overview and characterization of implementation outcomes for the study of BIT use in routine practice settings. Eight outcomes for the evaluation of implementation have been previously described: acceptability, adoption, appropriateness, feasibility, fidelity, implementation cost, penetration, and sustainability. In a proposed recharacterization of these outcomes with respect to BIT implementation, definitions are clarified, expansions to the level of analysis are identified, and unique measurement characteristics are discussed. Differences between BIT development and implementation, an increased focus on consumer-level outcomes, the expansion of providers who support BIT use, and the blending of BITs with traditional health care services are specifically discussed. BITs have the potential to transform health care delivery. Realizing this potential, however, will hinge on high-quality research that consistently and accurately measures how well such technologies have been integrated into health services. This overview and characterization of implementation outcomes support BIT research by identifying and proposing solutions for key theoretical and practical measurement challenges. ©Eric DA Hermes, Aaron R Lyon, Stephen M Schueller, Joseph E Glass. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 25.01.2019.
Keywords: behavior therapy; behavioral intervention technology; diffusion of innovation; implementation; internet; mobile applications; outcome assessment (health care); review; technology; telemedicine; translational medical research
Year: 2019 PMID: 30681966 PMCID: PMC6367669 DOI: 10.2196/11752
Source DB: PubMed Journal: J Med Internet Res ISSN: 1438-8871 Impact factor: 5.428
Examples of behavioral intervention technology programs.
| Behavioral intervention technologies program | Program objective | Platform | Evidence |
| BlueStar or WellDoc | Diabetes management | Mobile | |
| MoodGym | Depression | Web | |
| Sleep Healthy Using the Internet (SHUTi) | Insomnia | Web | |
| PTSD Coach and PE Coach | Posttraumatic stress disorder symptom tracking and treatment support | Mobile | |
| reSET or Therapeutic Education System | Substance use disorders | Mobile | |
Figure 1. Continuum of support for delivering behavioral intervention technologies (BITs).
Characterization of behavioral intervention technology implementation outcomes.
| Outcome and definition | Level of analysis in BIT^a studies | Measurement objective and process | Example of BIT outcome measurement |
| Acceptability: perception among stakeholders that a given evidence-based practice is useful or satisfactory | Individual provider, consumer, or administrator | Objective: assessment of the extent to which the BIT aligns with expectations of an agreeable user experience. Process: survey, interview, focus group, and direct-observation usability testing | Mares et al (2016): qualitative methods used to assess initial consumer and provider expectations. Milward et al (2017): focus groups assessed the extent to which the app was acceptable in terms of content, features, and design |
| Adoption: intention, decision, or initiation to use an evidence-based practice | Individual provider, consumer, or administrator | Objective: assessment of the intention, decision, or action to begin using a BIT. Process: passive data collection of BIT use | Gilbody et al (2016): to measure consumer-level adoption, log-in records were used to identify the number of participants who accessed programs |
| Appropriateness: perceived fit, relevance, or compatibility of the evidence-based practice to a given context | Individual provider, consumer, or administrator; organization | Objective: assessment of perceived BIT fit with the context. Process: survey, interview, focus group, direct-observation usability testing, and workflow studies | Lyon et al (2016): evaluated school-based practitioner workflows and current technology use practices to determine the appropriateness of a digital measurement feedback system and identified areas for BIT redesign |
| Feasibility: extent to which an evidence-based practice can be successfully used or conducted within a given context | Individual provider, consumer, or administrator; organization | Objective: in vivo assessment of the extent to which a BIT can be used by consumers or providers in a specific setting. Process: passive data collection of BIT use, survey, and structured observation studies | Kumar et al (2018): using program use data collected via the BIT, the feasibility of implementing a mobile app for consumers and a provider-facing dashboard was tested in 4 outpatient clinics |
| Fidelity: extent to which implementation results in an evidence-based practice being delivered as intended | Individual consumer or provider; organization | Objective: measuring adherence, dose, or quality of BIT use with respect to the developer's intentions for use. Process: passive data collection of BIT use | Calear et al (2013): reported high adherence associated with improved clinical outcomes in a hybrid implementation-effectiveness study. Sineath et al (2017): developed and tested a fidelity protocol for a diet and lifestyle monitoring BIT that involved coaching |
| Implementation cost: costs associated with implementing an evidence-based practice | Organization | Objective: measuring intervention development costs, maintenance and versioning costs, implementation strategy costs, and operational costs. Process: cost analysis, interviews, and budgetary or administrative databases | Quanbeck et al (2018): measured implementation strategy costs for implementation coaching time and site visits needed to help 3 organizations integrate a BIT into practice |
| Penetration: the integration of an evidence-based practice within a service setting (organization) and its subsystems | Organization | Objective: measuring the number of consumers or providers using BITs among those eligible or trained to engage with the BIT. Process: passive data collection of BIT use and electronic health record data | Titov et al (2015): measured the proportion of individuals in a defined consumer population completing lessons in 4 different BITs |
| Sustainability: the extent to which a newly implemented evidence-based practice is maintained or institutionalized within a service setting's ongoing, stable operations | Administrator; organization | Objective: measuring ongoing BIT use, change in funding streams, saturation within the organization, and inclusion in routine reports. Process: passive data collection of BIT use, administrative or budgetary databases, oversight committee reports, and policy and training documents | Carlfjord et al (2013): measured continued BIT delivery after active implementation. Quanbeck et al (2018): measured whether or not the health system continued to offer the BIT after research funding for an implementation trial ended |
^a BIT: behavioral intervention technology.
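Several of the outcomes above (adoption, fidelity, and penetration in particular) are operationalized through passive collection of BIT use data such as log-in and module-completion records. A minimal sketch of how such metrics might be computed from an event log follows; the log format, user IDs, eligible population, and prescribed dose are illustrative assumptions, not details from the article.

```python
from collections import defaultdict

# Hypothetical usage log exported from a BIT: (user_id, module_completed) events.
usage_log = [
    ("u1", "lesson1"), ("u1", "lesson2"), ("u1", "lesson3"),
    ("u2", "lesson1"),
    ("u3", "lesson1"), ("u3", "lesson2"),
]
eligible_users = {"u1", "u2", "u3", "u4", "u5"}  # e.g., all enrolled patients
prescribed_modules = 3                           # dose intended by the developer

# Adoption: users who initiated use (at least one logged event).
adopters = {user for user, _ in usage_log}

# Penetration: share of the eligible population that used the BIT.
penetration = len(adopters & eligible_users) / len(eligible_users)

# Fidelity (adherence/dose): distinct modules completed relative to the
# developer's intended dose, per user.
modules_per_user = defaultdict(set)
for user, module in usage_log:
    modules_per_user[user].add(module)
adherence = {u: len(m) / prescribed_modules for u, m in modules_per_user.items()}

print(f"adopters: {len(adopters)}")       # 3
print(f"penetration: {penetration:.0%}")  # 60%
print(f"adherence: {adherence}")
```

In practice, such records would come from the BIT's analytics back end or an electronic health record, and the denominators (eligible population, intended dose) must be defined per study before these ratios are meaningful.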