Susan Michie, Lucy Yardley, Robert West, Kevin Patrick, Felix Greaves.
Abstract
Devices and programs using digital technology to foster or support behavior change (digital interventions) are increasingly ubiquitous, being adopted for use in patient diagnosis and treatment, self-management of chronic diseases, and primary prevention. They have been heralded as potentially revolutionizing the ways in which individuals can monitor and improve their health behaviors and health care by improving outcomes, reducing costs, and improving the patient experience. However, we are still mainly in the age of promise rather than delivery. Developing and evaluating these digital interventions presents new challenges and new versions of old challenges that require improved, and perhaps entirely new, methods for research and evaluation. This article discusses these challenges and provides recommendations aimed at accelerating the rate of progress in digital behavior intervention research and practice. Areas addressed include intervention development in a rapidly changing technological landscape, promoting user engagement, advancing the underpinning science and theory, evaluating effectiveness and cost-effectiveness, and addressing issues of regulatory, ethical, and information governance. This article is the result of a two-day international workshop on how to create, evaluate, and implement effective digital interventions in relation to health behaviors, held in London in September 2015 and supported by the United Kingdom's Medical Research Council (MRC)/National Institute for Health Research (NIHR) Methodology Research Programme (PI Susan Michie) and the Robert Wood Johnson Foundation of the United States (PI Kevin Patrick). Important recommendations for managing the rapid pace of change include drawing on emerging techniques from data science, machine learning, and Bayesian approaches, and learning from other disciplines, including computer science and engineering.
With regard to assessing and promoting engagement, a key conclusion was that sustained engagement is not always required and that for each intervention it is useful to establish what constitutes "effective engagement," that is, sufficient engagement to achieve the intended outcomes. The potential of digital interventions for testing and advancing theories of behavior change by generating ecologically valid, real-time objective data was recognized. Evaluations should cover all phases of the development cycle, be designed for generalizability, and consider new experimental designs that make the best use of rich data streams. Future health economics analyses need to recognize and model the complex and potentially far-reaching costs and benefits of digital interventions. In terms of governance, developers of digital behavior interventions should comply with existing regulatory frameworks while attending to emerging standards around information governance, ethics, and interoperability. ©Susan Michie, Lucy Yardley, Robert West, Kevin Patrick, Felix Greaves. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 29.06.2017.
Keywords: behavioral medicine; eHealth; health behavior; mHealth; mobile applications; psychological theory
Year: 2017 PMID: 28663162 PMCID: PMC5509948 DOI: 10.2196/jmir.7126
Source DB: PubMed Journal: J Med Internet Res ISSN: 1438-8871 Impact factor: 5.428
Challenges for developing and evaluating digital interventions targeting behavior change.
| Topics | Challenges |
| --- | --- |
| Pace and efficiency | Rapid technological change and iterative development cycles make it necessary to continually update and adapt interventions. |
| | Existing development and evaluation cycles are slow and unsuited to dynamic systems and rapidly changing contexts. |
| | Efficient, continuing relationships between academics and intervention developers are needed for implementation, continued development, and evaluation. |
| Engagement | Engagement with digital interventions is often too limited to support behavior change. |
| | Engagement is multidimensional and cannot be evaluated simply by DBCIᵃ usage. |
| | Engagement with DBCIs may be unequal across groups, risking the reinforcement of disparities or inequalities. |
| Theory | The mechanisms through which DBCIs have their effects are often unclear. |
| | Methods for characterizing the essential features of intervention components, modes of delivery, and contexts are needed but remain limited. |
| Evaluation of effectiveness | The ready availability of alternative interventions makes it difficult to control the testing environment. |
| | It is difficult to specify comparator interventions or control conditions that allow meaningful evaluation of the intervention of interest. |
| | Better methods for structuring and analyzing very large, dynamic, and heterogeneous data sets are needed. |
| | Reach and engagement can be low. |
| | The complex, multi-component nature of interventions requires an iterative design and testing cycle. |
| Evaluation of cost-effectiveness | Techniques for economic and cost-effectiveness evaluation across the digital development, deployment, and delivery cycle are lacking. |
| | Funding mechanisms are not aligned with the digital model of development, implementation, iterative improvement, and evaluation. |
| Regulation, ethics, and information governance | There are competing commercial and ethical demands on data ownership and intellectual property. |
| | Standards for ethical or institutional review differ and are still emerging across the biomedical, psychological, and digital development communities. |
| | Quality standards and regulatory processes for digital interventions are uncertain, with standards either in development or inappropriately adapted from other contexts. |
ᵃDBCI: digital behavior change intervention.
Summary of recommendations according to topic.
| Achieving rapid and efficient development | Understanding and promoting engagement | Advancing models and theories | Evaluating effectiveness | Evaluating cost-effectiveness | Ensuring regulatory, ethical, and information governance |
| --- | --- | --- | --- | --- | --- |
| Consider adopting methods from engineering and other data-intensive domains in the development cycle. | Specify and establish empirically what constitutes "effective engagement" for each DBCIᵃ, that is, sufficient engagement to achieve the intended outcomes. | Use the large amounts of real-time, ecologically valid data generated by DBCIs to test and advance models and theories of behavior change. | Evaluate at all phases in the development cycle. | At every stage, including concept development, identify all the relevant future costs and benefits. | Ensure compliance with appropriate ethics or institutional review board processes. |
| Use Bayesian and related approaches to improve the predictive modeling capabilities of DBCIs. | Identify and develop valid and efficient combinations of objective and subjective measures to build and test multidimensional models of engagement. | Develop methods able to efficiently analyze large, complex data sets to test dynamic theoretical propositions and allow personalization of DBCIs. | Design evaluations for generalizability. | Take account of projected uptake as well as reach. | Identify and adhere to regulatory processes that may be required for digital medical devices. |
| Leverage advances in data science such as machine learning, but ensure that human input is retained as needed. | Develop DBCIs with a person-centered and iterative approach, using mixed methods to progressively refine the DBCI to meet user requirements. | Specify the circumstances in which a proposed mechanism of action of a DBCI will produce a targeted effect and build an ontology to organize knowledge resulting from this. | Use methods of DBCI evaluation that capitalize on their unique characteristics. | Select a modeling framework appropriate for the complexity of the projections. | Ensure compliance with national standards for data handling, sharing, and interoperability, where appropriate. |
| Develop DBCIs using a modular approach. | | | Use features of DBCIs to optimize control and access rich data streams. | Separately evaluate societal, personal, and health care cost-effectiveness. | Provide clear and transparent information on how data from the intervention will be used and shared. |
| Support interdisciplinary research collaborations and transdisciplinary thinking. | | | Choose comparators that minimize contamination. | | |
ᵃDBCI: digital behavior change intervention.