Erin Lebow-Skelley, Sarah Yelton, Brandi Janssen, Esther Erdei, Melanie A Pearson.
Abstract
Experts recommend reporting environmental exposure results back to research participants and communities, yet environmental health researchers need further guidance to improve the practice of reporting back. We present the results of a workshop developed to identify pertinent issues and areas for action in reporting back environmental health research results. Thirty-five attendees participated, brainstorming responses to the prompt: "What are some specific issues that are relevant to reporting back research results to individuals or the larger community?", and then grouping responses by similarity and rating their importance. Based on a combined theoretical foundation of grounded theory and qualitative content analysis, we used concept mapping to develop a collective understanding of the issues. Visual maps of the participants' responses were created using nonmetric multidimensional scaling and hierarchical cluster analysis. The resulting concept map provided a spatial depiction of five issue areas: Effective Communication Strategies, Community Knowledge and Concerns, Uncertainty, Empowering Action, and Institutional Review and Oversight (listed from highest to lowest rating). Through these efforts, we disentangled the complex issues affecting how and whether environmental health research results are reported back to participants and communities, by identifying five distinct themes to guide recommendations and action. Engaging community partners in the process of reporting back emerged as a unifying global theme, which could improve how researchers report back research results by understanding community context to develop effective communication methods and address uncertainty, the ability to act, and institutional concerns about beneficence and justice.
Keywords: community engagement; concept mapping; environmental health; research report-back
Year: 2020 PMID: 32947900 PMCID: PMC7557638 DOI: 10.3390/ijerph17186742
Source DB: PubMed Journal: Int J Environ Res Public Health ISSN: 1660-4601 Impact factor: 3.390
Concept Mapping Steps and their Application with the Partnerships for Environmental Public Health (PEPH) Community.
| Steps | Step Components | 2018 PEPH Workshop |
|---|---|---|
| Step 1 | Selecting Participants | All 2018 PEPH Annual Meeting attendees invited to participate. |
| Step 2 | Brainstorming | Workshop participants collectively brainstormed. |
| Step 3 | Sorting Statements | Workshop participants independently sorted and rated online during workshop. |
| Step 4 | Creation of Maps | HERCULES staff used Group Concept Mapping software to create maps during workshop. |
| Step 5 | Interpretation of Maps | Workshop participants selected ideal cluster solution during workshop, gave input on cluster labels. |
| Step 6 | For Planning (e.g., action plans, needs assessment) | Subset of workshop participants developed summaries of results and recommendations to improve report-back among the environmental health community, reported here. |
Adapted from “An introduction to concept mapping for planning and evaluation” [23].
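The analysis behind Step 4 (turning participants' independent sorts into a point map and clusters via nonmetric multidimensional scaling and hierarchical cluster analysis) can be illustrated with a minimal sketch. The pile assignments, statement count, and parameter choices below are hypothetical and are not the study's data or the Group Concept Mapping software's exact algorithm; this is only a conceptual outline under those assumptions.

```python
# Sketch of the concept-mapping analysis: each participant sorts statements
# into piles; statements that are co-sorted often are treated as similar,
# then nonmetric MDS and hierarchical clustering yield the map and clusters.
# All data below are illustrative, not the study's actual sorts.
import numpy as np
from sklearn.manifold import MDS
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

# Toy sorts: 3 participants each assign 6 statements to pile labels.
sorts = np.array([
    [0, 0, 0, 1, 1, 1],
    [0, 0, 1, 1, 1, 1],
    [0, 0, 0, 0, 1, 1],
])
n = sorts.shape[1]

# Similarity = fraction of participants who put a pair in the same pile.
sim = np.zeros((n, n))
for row in sorts:
    sim += (row[:, None] == row[None, :])
sim /= len(sorts)
dist = 1.0 - sim          # convert similarity to dissimilarity
np.fill_diagonal(dist, 0.0)

# Nonmetric MDS places statements in 2-D so co-sorted items sit close.
coords = MDS(n_components=2, metric=False, dissimilarity="precomputed",
             random_state=0).fit_transform(dist)

# Hierarchical clustering on the same dissimilarities gives the clusters.
labels = fcluster(linkage(squareform(dist, checks=False), method="average"),
                  t=2, criterion="maxclust")
print(coords.shape, labels)
```

In practice the number of clusters is not fixed in advance; as in Step 5 above, participants inspect several cluster solutions and choose the one that best fits their understanding.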
Figure 1. Report-back Cluster Rating Map. The Cluster Rating Map depicts the five report-back themes identified and selected by workshop participants, including points and statement numbers for each statement that makes up the cluster (listed in Table 1). Statements that were frequently sorted together (seen by participants as conceptually related) are placed closer to each other. The average importance rating of each cluster is illustrated by the layers of each cluster, with more layers indicating a higher average rating.
Concept Mapping Clusters, Statements, Rating, and Bridging.
| Statement Number | Statement | Average Rating a | Bridging |
|---|---|---|---|
| | **Effective Communication Strategies** | | |
| 5 | Making sure the information is understandable | 4.95 | 0.1 |
| 28 | What language to deliver it in | 4.55 | 0.08 |
| 27 | Communicating the appropriate level of concern | 4.35 | 0.27 |
| 19 | What medium to use/how to deliver it | 4.2 | 0.08 |
| 11 | Defining scientific measurement/terms | 4 | 0.05 |
| 7 | How to represent it visually | 3.95 | 0 |
| 8 | Medical and environmental health literacy | 3.65 | 0.33 |
| | **Community Knowledge and Concerns** | | |
| 3 | Including community input on report-back process | 4.86 | 0.45 |
| 4 | Ensuring community concerns are reflected in the report-back | 4.76 | 0.42 |
| 26 | Using cultural competence | 4.6 | 0.41 |
| 16 | Being able to reach people for report-back | 4.4 | 0.64 |
| 20 | Undervaluing community knowledge | 3.6 | 0.04 |
| 14 | Bias against community members from academics | 3.55 | 0.11 |
| 18 | Cognitive dissonance between researchers and community | 3.4 | 0.1 |
| 33 | Results may not be satisfactory to the community | 3.4 | 0.39 |
| 9 | Assumption that community doesn’t understand | 3.2 | 0.05 |
| | **Uncertainty** | | |
| 12 | Deciding what to report | 4.3 | 0.28 |
| 15 | Being able to talk about uncertainty | 3.9 | 0.25 |
| 31 | Outlining what factors/sources are contributing to the results | 3.75 | 0.33 |
| 24 | Differentiating between research results and diagnosis (sub-clinical results) | 3.74 | 0.55 |
| 13 | Not having a standard for comparison | 3.55 | 0.46 |
| | **Empowering Action** | | |
| 30 | The ability to act given socio-economic disparities | 4.25 | 0.3 |
| 29 | What kind of recommendations can we make | 4.15 | 0.26 |
| 10 | What do they do with it | 4 | 0.3 |
| 25 | Can the information be used to solve the problem | 3.9 | 0.21 |
| 1 | How to include clinical recommendations when appropriate | 3.29 | 0.45 |
| 6 | Engaging medical care providers | 3.05 | 0.89 |
| | **Institutional Review and Oversight** | | |
| 32 | Getting IRB approval to do report-back | 4.2 | 1 |
| 22 | Concerns about telling them what to do/what not to do | 3.7 | 0.33 |
| 17 | Composition of the IRB (community representative) | 3.4 | 0.79 |
| 21 | Unanticipated negative consequences beyond consented individual | 3.2 | 0.42 |
| 2 | Tension in scientific community around right to know vs. not doing harm | 2.9 | 0.51 |
| 23 | Managing media | 2.85 | 0.67 |
a Rating on a scale of 1–5, with 5 being the highest.
Figure 2. Cluster Rating Map with Highly Bridged Statements. Arrows come from statements with bridging values above the cluster mean and point towards the cluster that they were frequently sorted with. The weight of the arrow indicates the bridging value, with heavier lines indicating a statement that was more frequently sorted with other clusters, i.e., one that spanned other areas of the map because participants conceptualized it as interrelated with other themes.
Figure 3. Recommendations to Improve the Practice of Reporting Back Research Results. To improve the practice of reporting back, researchers and academic institutions may consider taking action in these five areas.