Dmitry Khodyakov, Terrance D Savitsky, Siddhartha Dalal.
Abstract
BACKGROUND: Public and stakeholder engagement can improve the quality of both research and policy decision making. However, such engagement poses significant methodological challenges in terms of collecting and analysing input from large, diverse groups.
Keywords: CBPR; collaborative learning; modified delphi; online panel; research priorities; stakeholder engagement
Year: 2015 PMID: 26295924 PMCID: PMC5049448 DOI: 10.1111/hex.12383
Source DB: PubMed Journal: Health Expect ISSN: 1369-6513 Impact factor: 3.377
Figure 1. The distribution of participants by stakeholder group and cluster. This figure describes the stakeholder composition of the six clusters.
Figure 2. Average articulation by cluster. This figure presents 95% confidence intervals for articulation, v, averaged over participants in each cluster, estimated from our Bayesian (nonparametric) model, and listed separately for Rounds 1 and 3 to visually depict change in the levels of articulation. The bolded horizontal line in each boxplot represents the posterior mean articulation for each cluster.
Figure 3. Average group agreement by cluster. This figure presents the posterior mean of the Round 3 variance in participant beliefs within each cluster minus the Round 1 variance among the same participants, for each combination of suicide prevention research goal and rating criterion. A smaller variance among participants indicates relatively greater agreement than a larger variance. Each bar in a plot panel represents the difference in variance among raters for a particular goal criterion, and each panel includes all goals for raters in one cluster. A negative value for a bar within a cluster panel indicates a by‐round shift among stakeholders towards agreement; the longer the bar, the larger that shift.
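The agreement measure described in Figure 3 can be sketched in a few lines: for a given goal–criterion pair, compute the variance of ratings within a cluster at Round 3, subtract the Round 1 variance, and read a negative difference as a shift towards agreement. Note this is a hypothetical, sample-variance illustration with made-up ratings; in the paper these variances are estimated within the Bayesian nonparametric model, not computed directly from raw responses.

```python
import statistics

def agreement_shift(round1_ratings, round3_ratings):
    """By-round change in rater variance for one goal-criterion pair.

    Returns Round 3 variance minus Round 1 variance. A negative value
    means ratings became less dispersed between rounds, i.e. the cluster
    shifted towards agreement. (Illustrative helper only; the paper
    estimates these quantities within a Bayesian model.)
    """
    return statistics.variance(round3_ratings) - statistics.variance(round1_ratings)

# Hypothetical ratings for one goal-criterion pair from one cluster:
r1 = [3, 9, 5, 7, 2, 8]   # Round 1: widely dispersed views
r3 = [6, 7, 6, 7, 5, 6]   # Round 3: views have converged
print(agreement_shift(r1, r3))  # negative value => shift towards agreement
```

In a chart like Figure 3, each bar would plot this difference for one goal criterion, so bars extending below zero mark the goal criteria on which a cluster converged between rounds.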
Measures of engagement with the online stakeholder engagement process by clusters
| Statement | Cluster 1 | Cluster 2 | Cluster 3 | Cluster 4 | Cluster 5 | Cluster 6 |
|---|---|---|---|---|---|---|
| The discussion brought out divergent views | 5.2 | 4.6 | 4.6 | 4.7 | 4.5 | 4.3 |
| Participants debated each others’ viewpoints during the discussions | 4.4 | 3.4 | 4.2 | 3.9 | 3.9 | 3.7 |
| Participation in this exercise was interesting | 5.8 | 5.3 | 5.3 | 5.3 | 5.5 | 4.9 |
| The survey instrument was easy to use | 4.7 | 4.8 | 4.8 | 4.7 | 4.6 | 3.8 |
Data presented in this table represent the profiles of the statistically significant estimated latent clusters. Mean values of the satisfaction survey responses of stakeholders who belong to the same cluster are reported. All statements were rated on a 7‐point agreement scale, where 1 = Strongly Disagree, 4 = Neutral and 7 = Strongly Agree. For example, compared with all other clusters, Cluster 1 members have the most positive attitude towards the first three statements describing their participation in the ExpertLens process, because they have the highest average ratings for these statements.
Typology of collaborative learning
| Group agreement | Individual articulation: increased | Individual articulation: no change/decreased |
|---|---|---|
| Increased | Learning towards consensus | Groupthink |
| Little‐to‐no change/decreased | Learning by contrast | No learning |