
Designing theoretically-informed implementation interventions: fine in theory, but evidence of effectiveness in practice is needed.

Onil Bhattacharyya, Scott Reeves, Susan Garfinkel, Merrick Zwarenstein.

Abstract

The Improved Clinical Effectiveness through Behavioural Research Group (ICEBeRG) authors assert that a key weakness in implementation research is the unknown applicability of a given intervention outside its original site and problem, and suggest that use of explicit theory offers an effective solution. This assertion is problematic for three primary reasons. First, the presence of an underlying theory does not necessarily ease the task of judging the applicability of a piece of empirical evidence. Second, it is not clear how to translate theory reliably into intervention design, a process that undoubtedly involves the diluting effect of "common sense." Third, there are many theories, formal and informal, and it is not clear why any one should be given primacy. To determine whether explicitly theory-based interventions are, on average, more effective than those based on implicit theories, pragmatic trials are needed. Until empirical evidence shows the superiority of theory-based interventions, the use of theory should not be a criterion by which research funders, ethics committees, editors, or policy decision makers assess the value of implementation studies.

Year: 2006        PMID: 16722583        PMCID: PMC1436014        DOI: 10.1186/1748-5908-1-5

Source DB: PubMed        Journal: Implement Sci        ISSN: 1748-5908        Impact factor: 7.327


Introduction

The Improved Clinical Effectiveness through Behavioural Research Group (ICEBeRG) authors assert [1,2] that a key weakness in implementation research is the unknown applicability of a given intervention outside its original site and problem. They argue that more widely applicable (and, by implication, more effective) interventions should be created by: (1) using explicit behavioral theories to quantitatively characterize the determinants of professionals' behavior choices, (2) identifying predictors that are common across many settings and problems, and (3) designing interventions based on the most powerful predictors. Though this view is logical, it is problematic, and not based on empirical evidence.

First, the presence of an underlying theory does not necessarily ease the task of judging the applicability of a piece of empirical evidence

Judgment on the wider applicability of a piece of evidence proceeds by induction, and is not mechanistically related to the underlying theory from which the empirical study grew. Behavioral theory is possibly less predictive of behavior than physiological theory is of physiology. Its predictive power is further diluted by contextual differences, such as health service design and differences in medical culture, whose effects on choice cannot be directly translated into the internal psychological forces that are the subject of behavioral theory. We should also bear in mind that the physiological theory predicting a cardio-protective effect for hormone replacement therapy was so convincing that millions of women were prescribed it, yet in empirical studies it failed to achieve the predicted benefits and indeed resulted in substantial harm [3]. Formal theory may be an unreliable predictor of outcome even within the theorized group, and thus a poor framework for extrapolating outcomes to other settings and subjects.

Second, it is not clear how to translate theory reliably into intervention design

There is no reproducible, algorithmically operationalised process for taking predictor variables from a quantitative, theory-based descriptive study and turning them into elements of an intervention. Since this process is diluted by human judgment, which is influenced by many factors other than the theory (e.g., knowledge of context and personal prior beliefs), we believe that theory contributes less to this part of the process than it appears to. Theory could be merely a cover for common sense, or for a grounded approach to designing an intervention.

Third, there are many theories, formal and informal, and it is not clear why any one should be given primacy

Theories overlap and contradict each other. Even theoreticians are forced to distill, from the multitude of testable formal theories relevant to professional behavior change, a common core of domains: itself a new meta-theory which, because it is reverse-engineered, rests on little more than common sense [4]. Many formal theories and concepts in the field of psychology had already been described recognizably in lay terms and ideas, suggesting that these ideas are accessible without theories. We live in our own psyche, observe ourselves, reflect on our situation, and ask our colleagues why they make choices. Others observe our choices, directly, through inquiry, or by analysis of routine data, and speculate on their determinants. Though not particularly rigorous, all of these approaches are plausible sources of informal 'theories.' As such, they can explain professional behavior and inspire ideas for the design of interventions to change behavior, which can then be tested.

How could we decide whether formal theory offers the best approach for designing interventions to change behavior?

Abstract arguments on this question will continue inconclusively [5]. On the one hand, theory development may lead to a greater meta-understanding and move the field forward. On the other hand, the phenomena being studied may be so complex that all this work will not yield theories with greater predictive power than implicit theory or "common sense." The exercise may also be so time-consuming (Eccles et al. cite the 20 to 80 years spent conceptualizing cognitive behavioural theory [2]) that it is not a particularly efficient way to proceed. We need an empirical answer to Eccles et al.'s assertion that "better evaluations of what does and does not work in implementation research will only be possible with the explicit use of theoretically informed interventions." We need to know, in practice, whether interventions to change professional behavior, designed using formal theory applied in a predefined and reproducible manner, are more effective at changing the targeted behavior than alternative, less theory-bound approaches. Given a sufficient set of replicates, across a reasonable range of settings and professional behavior choices, we can reach an empirical answer. One such randomized trial is underway (TRYME protocol, Francis et al, in submission). Until there is empirical evidence that interventions designed using theories are generally superior in their impact on behavior choice to interventions not so designed, the choice to use or not use formal theory in implementation research should remain a personal judgment. Research funders, ethics committees, systematic reviewers, editors, and policy decision makers should not in any way restrict this choice.

Competing interests

The author(s) declare that they have no competing interests.

Authors' contributions

OB wrote the first draft, MZ suggested the idea for the paper and commented on all of the drafts, SR wrote the second draft, and SG modified subsequent drafts. All authors read and approved the final manuscript.

References

1.  Designing theoretically-informed implementation interventions.

Authors:  The Improved Clinical Effectiveness through Behavioural Research Group (ICEBeRG)
Journal:  Implement Sci       Date:  2006

2.  Changing the behavior of healthcare professionals: the use of theory in promoting the uptake of research findings.

Authors:  Martin Eccles; Jeremy Grimshaw; Anne Walker; Marie Johnston; Nigel Pitts
Journal:  J Clin Epidemiol       Date:  2005-02

3.  Risks and benefits of estrogen plus progestin in healthy postmenopausal women: principal results from the Women's Health Initiative randomized controlled trial.

Authors:  Jacques E Rossouw; Garnet L Anderson; Ross L Prentice; Andrea Z LaCroix; Charles Kooperberg; Marcia L Stefanick; Rebecca D Jackson; Shirley A A Beresford; Barbara V Howard; Karen C Johnson; Jane Morley Kotchen; Judith Ockene
Journal:  JAMA       Date:  2002-07-17

4.  Making psychological theory useful for implementing evidence based practice: a consensus approach.

Authors:  S Michie; M Johnston; C Abraham; R Lawton; D Parker; A Walker
Journal:  Qual Saf Health Care       Date:  2005-02

5.  The OFF theory of research utilization.

Authors:  Andrew D Oxman; Atle Fretheim; Signe Flottorp
Journal:  J Clin Epidemiol       Date:  2005-02


