James F Phillips, Bruce B MacLeod, S Patrick Kachur.
Abstract
Implementation research often fails to have its intended impact on what programs actually do. Embedding research within target organizational systems represents an effective response to this problem. However, contradictions associated with the approach often prevent its application. We present case studies of the application of embedded implementation research in Bangladesh, Ghana, and Tanzania where initiatives to strengthen community-based health systems were conducted using the embedded science model. In 2 of the cases, implementation research standards that are typically embraced without question were abandoned to ensure pursuit of embedded science. In the third example, statistical rigor was sustained, but this feature of the design was inconsistent with embedded science. In general, rigorous statistical designs employ units of observation that are inconsistent with organizational units that managers can control. Structural contradictions impede host institution ownership of research processes and utilization of results. Moreover, principles of scientific protocol leadership are inconsistent with managerial leadership. These and other embedded implementation science attributes are reviewed together with contradictions that challenged their pursuit in each case. Based on strategies that were effectively applied to offsetting challenges, a process of merging research with management is proposed that is derived from computer science. Known as "agile science," this paradigm combines scientific rigor with management decision making. This agile embedded research approach is designed to sustain scientific rigor while optimizing the integration of learning into managerial decision making. © Phillips et al.Entities:
Year: 2021 PMID: 33795362 PMCID: PMC8087429 DOI: 10.9745/GHSP-D-20-00169
Source DB: PubMed Journal: Glob Health Sci Pract ISSN: 2169-575X
FIGURE 1. Research Phases Associated With Developing Community-Based Primary Health Care in Bangladesh and Ghana
Adapted from Nyonator et al. and Awoonor-Williams et al.
Contradictions Associated With the Planning of Embedded Implementation Science and Case Study Examples of Strategies for Resolving Contradictions
| Attribute | Core Strategies of Implementation Science | Strategic Adjustments of Embedded Science | Contradictions Associated With Strategic Adjustments | Case Study Resolution of Contradictions |
|---|---|---|---|---|
| Goal | Problems are identified. | Organizational change and development require joint researcher and host agency goal setting. | Goals are defined in terms of endpoint hypotheses to be tested rather than host agency goals for testing means of achieving system change. | Retain, but subordinate, primary health and demographic impact research to implementation research as an integrated and continuous process. |
| Outcome evaluation | Statistical inference is based on observation of treatment and counterfactual endpoints, with units of observation conforming to power requirements. | Improved host agency functionality and impact. | Protocols define project start and end dates, endpoints, and hypotheses, whereas organizational change is a continuous, open-ended, and multi-faceted process. | Phase in research as a process that fosters continuous utilization and action. |
| Leadership | Researchers in directive, independent, and autonomous roles with outreach to decision makers and managers at the end of investigation. | Collaboration of host agency and research partner leadership | Researchers assume directive, independent, and autonomous roles and episodically communicate health and demographic outcomes to host agency counterparts. | Host agency managers representing each level of the investigative process are appropriately teamed with research counterparts at each system level. |
| Ownership | Host agency audience through “steering committees” and end-of-project dissemination. | Subordination of research leadership to host agency governance. | Leadership malaise in the host agency can permeate an embedded research system, diluting rigor and compromising research implementation. | Develop a partnership of research leadership with host agency institutional structures, but maintain an autonomous research operation. |
| Scientific rigor | Study designs conform to conventional criteria for statistical inference. | Studies embrace process research, mixed methods research designs, and multilevel analyses in concert with the norms of statistical inference. | Constructing the counterfactual is essential but inconsistent with management operations that span all organizational levels. | Intervene with treatment and counterfactual conditions that conform to the host organizational structure. |
| System relevance | Systems thinking provides frameworks for data capture and analysis. | Systems thinking includes partnership arrangements and research activities that reflect units of the host agency organization. | Contexts where implementation science is needed most are settings where systems research is most challenging to conduct. | Utilize replication studies to disperse research in all relevant cultural and ecological contexts. |
Contradictions Associated With Utilizing Embedded Implementation Science for Policy and Action
| Attribute | Core Strategies of Implementation Science | Strategic Adjustments of Embedded Science | Contradictions Encountered by Embedded Implementation Science | Implications for Resolving Contradictions |
|---|---|---|---|---|
| Curation of knowledge | Publish results and disseminate findings to host agency and research audiences. | Develop knowledge-sharing mechanisms. | Science is disseminated by modes of communication that have limited currency among donors, decision makers, implementers, and managers. | Develop a multimethod knowledge management system for research advocacy. |
| Sustainability | Recommend utilization of research findings in the course of end-of-project dissemination activities. | Collaboration of researchers and host agency counterparts on research utilization strategic planning. | Planning research utilization is challenged by the institutionalization of dysfunction. Failure is therefore more sustainable than improvement. | Utilize Phase 3 replication research to investigate the determinants of sustainability. |
Contradictions Associated With the Process of Conducting Embedded Implementation Science and Case Study Examples of Strategies for Resolving Contradictions
| Attribute | Core Strategies of Implementation Science | Strategic Adjustments of Embedded Science | Contradictions Encountered by Embedded Implementation Science | Implications for Resolving Contradictions |
|---|---|---|---|---|
| Teamwork | Constitute teams according to technical functions. | Delineate implementation and research teams. | Research teams and implementation teams have contrasting skills, orientations, and roles. | Configure at each level of the system “learning localities” where the pursuit of excellence is a collaborative endeavor that integrates implementation with investigation. |
| Simplicity | Develop measurable indicators of endpoints and possible confounders. | Focus on indicators that are commensurate with host organizational data capture, analysis, and communication capabilities. | Research and implementation integration is complex to undertake, but simplicity is often essential for fostering organizational change. | Employ mixed methods research and knowledge management to promote understanding of essential processes and outcomes. |
| Replicability | End of project terminates further research on replication or scale-up. | Design projects to facilitate subsequent replication and scale-up. | Developing learning systems requires focused inquiry in localities where interventions can be tractably managed. Managers often seek investigation that is immediately relevant to large-scale operations. | Plan phases in advance. |
| Fidelity | Fidelity of interventions to themes appearing in the scientific literature. | For longitudinal research on scaling up, develop communication mechanisms that ensure widespread host agency understanding of the evidence justifying change. | Primary science generates knowledge about impact without providing knowledge about change processes. | Develop “learning localities” for catalyzing the geographic spread of implementation. |
Implications of Lessons From the Principles of Agile Science and Case Example for an Agile Paradigm for Embedded Implementation Research
| Attribute | The Agile Working Group's 12 Principles of Agile Science | Agile Embedded Science Implications |
|---|---|---|
| Goal | “Our highest priority is to satisfy the customer through early and continuous delivery of valuable software [program improvements].” | Problem identification is a continuous process. Owing to contextual complexity and uncertainty, problem details and solutions cannot always be identified in advance. |
| Outcome evaluation | “Continuous attention to technical excellence and good design enhances agility.” | Monitor compliance with implementation goals continuously with evaluation criteria that continuously shift, as needed. Subordinate demographic and health hypothesis testing to implementation process evaluation. |
| Leadership | “The best architectures, requirements, and designs [research strategies] emerge from self-organizing teams.” | Problem identification and candidate solutions can be defined by anyone in the research or host agency teams. Peer leadership is encouraged. Project leadership is systemic and multileveled, and it is the outcome of collaborative investigation of appropriate system development needs. |
| Ownership | “Business people [Host agency participants] and developers must work together daily throughout the project.” | Establish host agency and research joint ownership. Participatory decision making throughout the process of organizational development. |
| Scientific rigor | [Not relevant] | Develop credible results that focus on implementation processes and outcomes. |
| System relevance | “Working software is the primary measure of progress.” | Achieve concordance of research operations with host agency structure and functions. Open-ended, iterative, and continuous sharing of information and review of progress. Timing of phases governed by host agency planning and decision processes. |
| Teamwork | “At regular intervals, the team reflects on how to become more effective, then tunes and adjusts its behavior [or strategies] accordingly.” | At regular intervals, program managers review feedback to implementers and researchers to detect departures from quality or the need to adjust research or implementation strategy. Roles are integrated for research and host agency counterparts by implementation function. |
| | “Build projects around motivated individuals. Give them the environment and support they need, and trust them to get the job done.” | Build teams around champions who are successful communicators of innovation. Foster peer leadership through exchanges. |
| Simplicity | “Simplicity—the art of maximizing the amount of work not done—is essential.” | Simple solutions are preferred over more complex interventions. Complexity determined by host agency targeted changes to be investigated. |
| Replicability | “Welcome changing requirements, even late in development.” | Intervention targets, processes for monitoring, and evaluation procedures can be changed by evolving host agency priorities. |
| Fidelity | [Not relevant] | Intervention targets, processes for monitoring, and evaluation procedures can be changed by evolving host agency priorities. |
| Curation of knowledge | “The most efficient and effective method of conveying information to and within a [software] development team is face-to-face conversation.” | Direct communication between host agency and research team is essential. Integrate the process of generating evidence and outcomes with the process of utilizing evidence for decision making. |
| Sustainability | “Agile processes promote sustainable [software] development. The sponsors, developers, and users should be able to maintain a constant pace indefinitely.” | Research activities and processes are pursued at a pace that can be maintained indefinitely. Outcomes are delivered continuously as a regular part of research operations. Investigation is embedded in change processes that are continuous and never ending. |
Adapted from similar tables by Nerur et al. and by Flood et al.
FIGURE 2. The Agile Science Process of Health Systems Strengthening