Shoba Ramanadhan, Anna C Revette, Rebekka M Lee, Emma L Aveling.
Abstract
Qualitative methods are critical for implementation science as they generate opportunities to examine complexity and include a diversity of perspectives. However, it can be a challenge to identify the approach that will provide the best fit for achieving a given set of practice-driven research needs. After all, implementation scientists must find a balance between speed and rigor, reliance on existing frameworks and new discoveries, and inclusion of insider and outsider perspectives. This paper offers guidance on taking a pragmatic approach to analysis, which entails strategically combining and borrowing from established qualitative approaches to meet a study's needs, typically with guidance from an existing framework and with explicit research and practice change goals.

Section 1 offers a series of practical questions to guide the development of a pragmatic analytic approach. These include examining the balance of inductive and deductive procedures, the extent to which insider or outsider perspectives are privileged, study requirements related to data and products that support scientific advancement and practice change, and strategic resource allocation. This is followed by an introduction to three approaches commonly considered for implementation science projects: grounded theory, framework analysis, and interpretive phenomenological analysis, highlighting core analytic procedures that may be borrowed for a pragmatic approach. Section 2 addresses opportunities to ensure and communicate rigor of pragmatic analytic approaches. Section 3 provides an illustrative example from the team's work, highlighting how a pragmatic analytic approach was designed and executed and the diversity of research and practice products generated.

As qualitative inquiry gains prominence in implementation science, it is critical to take advantage of qualitative methods' diversity and flexibility. This paper furthers the conversation regarding how to strategically mix and match components of established qualitative approaches to meet the analytic needs of implementation science projects, thereby supporting high-impact research and improved opportunities to create practice change.
Keywords: Analysis; Implementation science; Methods; Practice-based; Pragmatic; Qualitative
Year: 2021 PMID: 34187595 PMCID: PMC8243847 DOI: 10.1186/s43058-021-00174-1
Source DB: PubMed Journal: Implement Sci Commun ISSN: 2662-2211
Fig. 1: Developing a pragmatic qualitative data analysis approach for IS: key considerations for selection of analytic procedures
Suggestions to ensure and communicate rigor in pragmatic qualitative analysis for IS
| Consideration | Description |
|---|---|
| Demonstrate the link between research goals, analytic approach, findings, and broader literature | Researchers should explain how and why they are incorporating procedures from different approaches. By explicitly justifying their decisions and connecting these pieces of the overall research design, the team can ensure internal coherence as they combine procedures from approaches that may have distinct underlying principles and assumptions. |
| Ensure transparency around data analysis | Researchers should provide sufficient detail about which procedures from which analytic approaches were used and how they were combined or adapted, so that readers and users of the research can understand and evaluate the utility of the work. Details may include, for example, the initial coding structures and how conceptual frameworks influenced analysis. Additionally, for data collected among diverse participant groups (e.g., EBI recipients vs. implementers) or sites, details about whether and how data were analyzed separately and then holistically are critical. Ongoing documentation of the analytic process, including description of decision-making and mediation of disagreements, also supports transparent reporting. |
| Triangulate data | The analysis can be strengthened by comparing results from different methods of inquiry (e.g., participant observation and focus group discussions) or different sources (e.g., implementers and leaders) to gain a more comprehensive and nuanced view of the IS concerns at hand. |
| Integrate reflexivity | The researchers should describe how their background, experience, and positions (particularly in terms of being grounded in research or practice) may influence their analysis of the data. Relevant details may include experience with the implementation effort, setting, implementers, and EBI of interest. |
| Use member reflections | Sharing early findings with members of participant groups to get feedback offers an opportunity to strengthen the analysis and help meet practice goals. This could include sharing early interpretations with an advisory group or key implementation stakeholders to gather suggestions to further refine/develop analyses. |
| Consider divergent cases | It is important to identify and investigate not only the broadly consistent themes but also the deviant cases. This ensures that a wide range of explanations has been considered and that the bulk of cases are reflected in the summaries offered. For example, this might prompt attention to an implementation site whose experience implementing a new innovation differed vastly from that of others in its network. |