Monika Kastner, Julie Makarski, Leigh Hayden, Lisa Durocher, Ananda Chatterjee, Melissa Brouwers, Onil Bhattacharyya.
Abstract
BACKGROUND: Realist reviews offer a rigorous method to analyze heterogeneous data emerging from multiple disciplines as a means to develop new concepts, understand the relationships between them, and identify the evidentiary base underpinning them. However, emerging synthesis methods such as the realist review are not well operationalized and may be difficult for the novice researcher to grasp. The objective of this paper is to describe the development of an analytic process to organize and synthesize data from a realist review.
Year: 2013 PMID: 24028286 PMCID: PMC3848005 DOI: 10.1186/1471-2288-13-112
Source DB: PubMed Journal: BMC Med Res Methodol ISSN: 1471-2288 Impact factor: 4.615
Figure 1. Flow of data analysis process.
Operationalization of the categorization process using the “LANGUAGE” domain as an example

| Step | Example (LANGUAGE domain) |
| --- | --- |
| 1. Group attributes that are antonyms | Complex/Simple |
| 2. Group attributes that are synonyms | Unclear/Confusing |
| 3. Group attributes with the same root | Specific/Specificity; Validity/Valid |
| 4. Sort database by attribute | |
| 5. Are there commonalities among attributes? | The following attributes can be grouped into a category called “Clarity”: Unambiguous, Precise, Specific |
| 6. Is there a central theme or focus among groups of attributes? | |
| 7. Do the attributes belong within the same cluster? | The following categories can be collapsed: “Complexity” with “Information overload”; “Actionability” (e.g., using active voice) with “Wording” |
| 8. Can they be collapsed? | |
| 9. Use attribute definitions to make these decisions | |
| 10. Based on their included attributes and definitions, define and label the cluster | The LANGUAGE domain can be defined as: … |
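The grouping logic in steps 1–5 above can be sketched in code. The snippet below is an illustrative sketch only, not the authors’ tooling: the attribute pairings and the “Clarity” category come from the paper’s LANGUAGE-domain example, while the dictionary structures and function names are assumptions introduced for illustration.

```python
# Illustrative sketch of the categorization steps (hypothetical data structures).
from collections import defaultdict

# Steps 1-3: normalize attribute variants -- antonym pairs, synonyms,
# and terms sharing a root are mapped to a single grouped form.
VARIANT_MAP = {
    "simple": "complex/simple",        # antonyms grouped (step 1)
    "complex": "complex/simple",
    "confusing": "unclear/confusing",  # synonyms grouped (step 2)
    "unclear": "unclear/confusing",
    "specificity": "specific",         # same root (step 3)
    "valid": "validity",
}

def normalize(attribute: str) -> str:
    """Map a raw attribute term to its grouped form."""
    return VARIANT_MAP.get(attribute.lower(), attribute.lower())

# Step 5: commonalities among attributes define a named category.
CATEGORY_MAP = {
    "unambiguous": "Clarity",
    "precise": "Clarity",
    "specific": "Clarity",
}

def categorize(attributes):
    """Sort grouped attributes into categories; unmatched terms are kept aside."""
    categories = defaultdict(list)
    for raw in sorted(attributes):  # step 4: sort the database by attribute
        term = normalize(raw)
        categories[CATEGORY_MAP.get(term, "(uncategorized)")].append(term)
    return dict(categories)

print(categorize(["Precise", "Specificity", "Unambiguous", "Complex"]))
```

Steps 7–10 (collapsing categories into clusters using attribute definitions) are judgment calls made by reviewers, so they are not mechanized here.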
Final list of attribute categories across 5 domains of guideline implementability

| Attribute categories | Domain |
| --- | --- |
| Ambiguity, Specificity, Vagueness; Congruity, Fluency, Schema; Complexity, Options, Difficult to understand; Concision, Embedded propositions | Language |
| Relative advantage, Gain-loss frame; Algorithm, Graphs, Tables; Elements; Accessibility, Computability; Visual imagery, Presentation; Arrangement | Format |
| Balance of benefits/harms, Dual viewpoint; Credible, Authoritative; Reliable, Reproducible, Explicitness; Evidence-based, Evidence-linked; Quality of evidence, Strength of evidence, Evidence grading; Validity, Up-to-date | Rigor of development |
| Acceptability, Fit with decision-making, Perceived usefulness, Visibility; Actionable, Executable, Operationalizable; Adaptability, Context, Tailoring; Feasibility, Compatibility, Costs, Resources; Implementability factors affecting feasibility, Trialability; Ease of use, Usefulness | Feasibility |
| Clinical relevance, Applicability; Appropriateness, Value judgments; Flexibility, Clinical freedom; Patient involvement/communication/values; Beliefs, Compatibility, Values/Norms | Decision-making |
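The five-domain structure above is essentially a mapping from attribute category to domain. As a sketch (the dictionary below is an assumed representation using a subset of the table’s categories, not code from the paper), it can be inverted so any category can be looked up to find its domain:

```python
# Hypothetical representation of the five-domain table (subset of categories).
DOMAINS = {
    "Language": ["Ambiguity", "Specificity", "Vagueness", "Complexity", "Concision"],
    "Format": ["Relative advantage", "Algorithm", "Accessibility", "Visual imagery"],
    "Rigor of development": ["Credible", "Reliable", "Evidence-based", "Validity"],
    "Feasibility": ["Acceptability", "Actionable", "Adaptability", "Ease of use"],
    "Decision-making": ["Clinical relevance", "Appropriateness", "Flexibility"],
}

# Invert the mapping: attribute category -> domain.
DOMAIN_OF = {cat: dom for dom, cats in DOMAINS.items() for cat in cats}

print(DOMAIN_OF["Validity"])      # Rigor of development
print(DOMAIN_OF["Adaptability"])  # Feasibility
```

Note that a few categories in the full table (e.g., Compatibility) appear under more than one domain, so a real index would need to map each category to a list of domains rather than a single one.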
Suggested approach to organize, synthesize, validate and make sense of complex findings

| Questions to consider | What we did | Advantages | Challenges | Recommendations |
| --- | --- | --- | --- | --- |
| Which method is the most appropriate to answer research questions? | We searched the literature for various synthesis methods of complex evidence | Potentially more valid if the method matches the question | There was no single synthesis method that best fit our questions | Need to adopt a flexible approach to match appropriate methods to answer research questions; consider selecting a primary analysis method supplemented by other or modified methods to address all questions |
| How will the data be organized? | We sorted and organized our data; the analysis process was done in duplicate | Sorting of concepts and themes on multiple levels; duplicate analysis minimizes bias | Difficult to keep track of changes from multiple reviewers; duplicate review is time consuming and resource intensive | We used a modified duplicate review process that involved a group of second reviewers “auditing” the analysis of primary reviewers; ensure that document tracking is transparent and efficient; will also depend on selected analysis method |
| How are you going to verify findings and minimize bias? | Sought expert consensus on findings using survey methodology | Survey methodology is quick and efficient; transparency | Survey methodology has inherent biases | Depending on resources, other consensus methods may increase validity, such as the Delphi method |
| How will the results and data be used? Who are the target knowledge end users? | We developed a conceptual map of guideline implementability for guideline developers and end-users | The conceptual map contributes to the understanding of guideline implementability; the process advances knowledge about analysis methods for complex evidence | There may be other factors not captured in the map that may influence guideline implementability | The conceptual framework needs to be refined according to the codebook of definitions, and rigorously evaluated to determine the feasibility of its use by guideline developers and its potential to influence guideline uptake by family physicians |
| To what extent should the data be disseminated? Will the work inform practice, system, policy? | The map will inform a guideline implementability framework for guideline developers, users and policy makers | The framework will inform end-users about attributes that facilitate guideline uptake, and may also inform policy around guideline development | There may be other factors influencing guideline implementability | Prior to dissemination, the framework will need to undergo rigorous evaluation |