Jouni T Tuomisto, Mikko V Pohjola, Teemu J Rintala.
Abstract
BACKGROUND: Evidence-informed decision-making and better use of scientific information in societal decisions have been areas of development for decades but are still topical. Decision support work can be viewed from the perspective of information collection, synthesis and flow between decision-makers, experts and stakeholders. Open policy practice is a coherent set of methods for such work. It has been developed and utilised mostly in Finnish and European contexts.
Keywords: Collaboration; Decision support; Environmental health; Evaluation; Impact assessment; Knowledge crystal; Open assessment; Open policy practice; Policy-making; Shared understanding
Year: 2020 PMID: 32245481 PMCID: PMC7118856 DOI: 10.1186/s12961-020-00547-3
Source DB: PubMed Journal: Health Res Policy Syst ISSN: 1478-4505
Principles of open policy practice (collaboration, openness, causality, criticism, intentionality)
| Principle | Description |
|---|---|
| Collaboration | Knowledge work is performed together, with the aim of producing shared information. |
| Openness | All work and information are openly available at all times for anyone interested to read and contribute to. If there are exceptions, they must be publicly justified. |
| Causality | The focus is on understanding and describing the causal relations between the decision options and the intended outcomes. The aim is to predict what impacts will likely occur if a particular decision option is chosen. |
| Criticism | All information presented can be criticised based on relevance and accordance to observations. The aim is to reject ideas, hypotheses — and ultimately decision options — that do not hold against critique. |
| Intentionality | The decision-makers explicate their objectives and decision options under consideration. Additionally, values of other participants or stakeholders are documented and considered. |
Fig. 1 Information flows in open policy practice. Open assessments and web-workspaces have an important role as information hubs. They collect relevant information for particular decision processes and organise and synthesise it into useful formats, especially for decision-makers but also for anyone else. The information hub works more effectively if all stakeholders contribute in one place or, alternatively, facilitators collect their contributions there
Fig. 2 The three parts of open policy practice. The timeline runs roughly from left to right, but all work should be seen as an iterative process. Shared understanding, the main output, is in the middle; expert-driven information production is part of the execution. Evaluation and management give guidance to the execution
Test of shared understanding
| Question | Who is asked? |
|---|---|
| Is all relevant and important information described? | All participants of the decision process (including knowledge-gathering processes) |
| Are all relevant and important value judgements described (those of all participants, not just decision-makers)? | |
| Are the decision-maker’s decision criteria described? | |
| Is the decision-maker’s rationale from the criteria to the decision described? | |
Properties of good policy support. Here, ‘assessment’ can be viewed either as a particular piece of expert work producing a report about a specific question, or as a wider description of shared understanding about a whole policy process; assessment work is done before, during and after the actual decision
| Category | Description | Guiding questions | Related principles |
|---|---|---|---|
| Quality of content | Specificity, exactness and correctness of information; correspondence between questions and answers | How exact and specific are the ideas in the assessment? How completely does the (expected) answer address the assessment question? Are all important aspects addressed? Is there something unnecessary? | Openness, causality, criticism |
| Applicability (relevance) | | How well does the assessment address the intended needs of the users? Is the assessment question good in relation to the purpose of the assessment? | Collaboration, openness, criticism, intentionality |
| Applicability (availability) | | Is the information provided by the assessment available when, where and to whom it is needed? | Openness |
| Applicability (usability) | | Are the intended users able to understand what the assessment is about? Is the assessment useful for them? | Collaboration, openness, causality, intentionality |
| Applicability (acceptability) | | Is the assessment (both its expected results and the way it is planned to be made) acceptable to the intended users? | Collaboration, openness, criticism, intentionality |
| Efficiency | Resource expenditure of producing the assessment output either in one assessment or in a series of assessments | How much effort is needed for making the assessment? Is it worth spending the effort, considering the expected results and their applicability for the intended users? Are the assessment results useful for some other purpose? | Collaboration, openness |
Important settings for environmental health and other impact assessments within the context of public policy-making
| Attribute | Guiding questions | Example categories |
|---|---|---|
| Impacts | • Which impacts are addressed in the assessment? • Which impacts are the most significant? • Which impacts are the most relevant for decision-making? | Environment, health, cost, equity |
| Causes | • Which causes of impacts are recognised in the assessment? • Which causes of impacts are the most significant? • Which causes of impacts are the most relevant for decision-making? | Production, consumption, transport, heating, power production, everyday life |
| Problem owner | • Who has the interest, responsibility and/or means to assess the issue? • Who actually conducts the assessment? • Who has the interest, responsibility and/or power to make decisions and take actions upon the issue? • Who is affected by the impacts? | Policy-maker, industry, business, expert, consumer, public |
| Target users | • Who are the intended users of the assessment results? • Who needs the assessment results? • Who can make use of the assessment results? | Policy-maker, industry, business, expert, consumer, public |
| Interaction | • What is the degree of openness in the assessment (and its management)? (see the ‘Dimensions of openness in decision-making’ table below) • How does the assessment interact with the intended use of its results? (see the ‘Categories of interaction’ table below) • How does the assessment interact with other actors in its context? | Isolated, informing, participatory, joint, shared |
Dimensions of openness in decision-making
| Dimension | Description |
|---|---|
| Scope of participation | Who is allowed to participate in the process? |
| Access to information | What information about the issue is made available to participants? |
| Timing of openness | When are participants invited or allowed to participate? |
| Scope of contribution | Which aspects of the issue are participants invited or allowed to contribute to? |
| Impact of contribution | How much influence are participant contributions allowed to have on the outcomes? How much weight is given to participant contributions? |
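The five dimensions above can be treated as a simple profile of a decision process. The following is a minimal sketch, assuming a Python representation; the field values are illustrative assumptions, not taken from the paper.

```python
# Illustrative sketch: recording the openness of a decision process along the
# five dimensions of the table above. All example values are assumptions.
from dataclasses import dataclass

@dataclass
class OpennessProfile:
    scope_of_participation: str  # who is allowed to participate in the process
    access_to_information: str   # what information is made available to participants
    timing_of_openness: str      # when participants are invited or allowed to join
    scope_of_contribution: str   # which aspects of the issue are open to contributions
    impact_of_contribution: str  # how much weight participant contributions are given

profile = OpennessProfile(
    scope_of_participation="anyone interested",
    access_to_information="all working material, at all times",
    timing_of_openness="from the beginning of the process",
    scope_of_contribution="all aspects of the issue",
    impact_of_contribution="weighed by the merit of the argument",
)
```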
Categories of interaction within the knowledge–policy interaction framework
| Category | Description |
|---|---|
| Isolated | Assessment and use of assessment results are strictly separated; results are provided for intended use, but users and stakeholders cannot interfere with the making of the assessment |
| Informing | Assessments are designed and conducted according to specified needs of intended use; users and limited groups of stakeholders may have a minor role in providing information to the assessment, but mainly serve as recipients of assessment results |
| Participatory | Broader inclusion of participants is emphasised; participation is, however, treated as an add-on alongside the actual processes of assessment and/or use of assessment results |
| Joint | Involvement and exchange of summary-level information among multiple actors are emphasised in the scoping, management, communication and follow-up of an assessment; on the level of assessment practice, actions by different actors in different roles (assessor, manager, stakeholder) remain separate |
| Shared | Different actors engage in open collaboration upon determining assessment questions, seeking answers to them, and implementing answers in practice; however, the actors involved in an assessment retain their roles and responsibilities |
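Because the categories run from strictly separated work towards open collaboration, they can be read as an ordered scale. Below is a minimal sketch of that reading, assuming a Python encoding; the numeric ordering is an interpretation of the table, not stated in the original framework.

```python
# Illustrative sketch: the five knowledge-policy interaction categories as an
# ordered scale of increasing collaboration. The numbers are an interpretation.
from enum import IntEnum

class Interaction(IntEnum):
    ISOLATED = 1       # results handed over; users cannot interfere with the assessment
    INFORMING = 2      # designed for specified needs; users mainly recipients
    PARTICIPATORY = 3  # broad participation, but as an add-on to the actual process
    JOINT = 4          # multi-actor exchange in scoping and management; roles separate
    SHARED = 5         # open collaboration on questions, answers and implementation

# A shared process is more collaborative than an informing one:
assert Interaction.SHARED > Interaction.INFORMING
```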
Fig. 3 Insight network about dioxins, Baltic fish and health as described in the BONUS GOHERR project [31]. Decisions are shown as red rectangles, decision-makers and stakeholders as yellow hexagons, decision objectives as yellow diamonds, and substantive issues as blue nodes. The relations are written on the diagram as predicates of sentences, where the subject is at the tail of the arrow and the object is at the tip. For other insight networks, see Additional file 1: Appendix S2
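The node and edge conventions of Fig. 3 map naturally onto a labelled directed graph. Here is a minimal sketch, assuming Python with the networkx library; the node names and predicates are illustrative assumptions, not taken from the GOHERR diagram.

```python
# Illustrative sketch of an insight network as a directed graph, following the
# Fig. 3 conventions: typed nodes, and edges labelled with sentence predicates
# (subject at the tail of the arrow, object at the tip). Names are assumptions.
import networkx as nx

G = nx.DiGraph()

# Node types: decision (red rectangle), stakeholder (yellow hexagon),
# objective (yellow diamond), substantive issue (blue node).
G.add_node("fishing quota", kind="decision")
G.add_node("fisheries ministry", kind="stakeholder")
G.add_node("good public health", kind="objective")
G.add_node("dioxin intake", kind="issue")

# Each edge carries the predicate of a subject-predicate-object sentence.
G.add_edge("fisheries ministry", "fishing quota", predicate="decides on")
G.add_edge("fishing quota", "dioxin intake", predicate="affects")
G.add_edge("dioxin intake", "good public health", predicate="threatens")

# Reading the network back as plain sentences:
for subject, obj, data in G.edges(data=True):
    print(f"{subject} {data['predicate']} {obj}")
```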
The ‘attributes’ of a knowledge crystal
| Attribute | Description |
|---|---|
| Name | An identifier for the knowledge crystal; each page has a permanent, unique name and identifier or URL |
| Question | A research question that is to be answered; it defines the scope of the knowledge crystal; assessments have specific sub-attributes for questions |
| Answer | An understandable and useful answer to the question; it is the current best synthesis of all available data; typically, it has a descriptive easy-to-read summary and a detailed quantitative ‘result’ published as open data; an answer may contain several competing hypotheses, if they all hold against scientific critique; in this way, it may include an accurate description of the uncertainty of the answer, often in a probabilistic way |
| Rationale | Any information that is necessary to convince a critical rational reader that the answer is credible and usable; it presents to a reader the information required to derive the answer and explains how it is formed; it may have different sub-attributes depending on the page type, some examples are listed below • Data describe direct observations (or expert judgements) about the topic. • Dependencies describe what is known about how upstream knowledge crystals (i.e. causal parents) affect the answer; dependencies may describe functional or probabilistic relationships; in an insight network, dependencies are described as arrows pointing toward the knowledge crystal • Calculations are an operationalisation of how to calculate or derive the answer; they use algebra, computer code or other explicit methods where possible • Discussions are structured or unstructured discussions about the details of the substance, or about the production of substantive information; on a wiki, discussions are typically located on the talk page of the substance page |
| Other | In addition to attributes, it is practical to have clarifying subheadings on a knowledge crystal page; these include self-explanatory subheadings such as See also, Keywords, References, Related files |
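To make the attribute structure concrete, the following is a minimal sketch of a knowledge crystal as a data structure, assuming a Python representation; it is a reading aid for the table above, not the authors' implementation, and the example values are hypothetical.

```python
# Illustrative sketch: a knowledge crystal with the attributes of the table
# above. A reading aid only, not the authors' implementation.
from dataclasses import dataclass, field

@dataclass
class Rationale:
    data: list[str] = field(default_factory=list)          # direct observations or expert judgements
    dependencies: list[str] = field(default_factory=list)  # upstream crystals (causal parents) affecting the answer
    calculations: str = ""                                 # algebra or code that derives the answer
    discussions: list[str] = field(default_factory=list)   # links to discussions (e.g. wiki talk pages)

@dataclass
class KnowledgeCrystal:
    name: str             # permanent, unique name, identifier or URL of the page
    question: str         # research question defining the scope
    answer: str           # current best synthesis of all available data
    rationale: Rationale  # information that makes the answer credible and usable

# Hypothetical example (the URL and contents are assumptions):
crystal = KnowledgeCrystal(
    name="https://example.org/dioxin-intake",
    question="What is the dioxin intake of Baltic herring consumers?",
    answer="A probabilistic estimate published as open data.",
    rationale=Rationale(
        dependencies=["dioxin concentration in herring", "herring consumption"],
        calculations="intake = concentration * consumption",
    ),
)
```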
Fig. 4 System architecture of the Climate Watch web-workspace
Methods evaluated based on properties of good policy support
| Method | Quality of content | Relevance | Availability | Usability | Acceptability | Efficiency |
|---|---|---|---|---|---|---|
| Co-creation | + Participants bring new info (2, 3, 25, 26) | + New questions are identified during collaborative work (6, 11) | + Draft results raise awareness during the work (2, 8, 27) | ? Readers ask clarifying questions and build understanding through collaboration | + Participants are committed to the conclusions (2, 8, 27) | ? Collaboration integrates communication with decision-makers and stakeholders (users) into the making, which saves time and effort |
| Open assessment | + It combines the functionalities of other methods and enables peer-reviewed assessment models (4, 5, 16) | + End-user discussions improve the assessment (16, 26, 27) | ? It is available as a draft from the beginning | + A standard structure facilitates use (8, 9) | + Openness was praised (3, 8, 9, 21) | + The scope can be widened incrementally (12–16) |
| Insight network | + It brings structure to assessment and helps causal reasoning (8, 9, 11, 16, 17) | + It helps and clarifies discussions between decision-makers and experts (8, 9) | - | ? Readers see what is excluded | ? It helps to check whether important issues are missing | - |
| Knowledge crystal | + They streamline work and provide tools for quantitative assessments (e.g. 3, 23, 24) | + They clarify questions (1, 6) | ? It is mostly easy to see where information should be found | ? Summaries help to understand | ? They make the intentionality visible by describing the assessment question | + Answers can be reused across assessments (12–16, 23–24) |
| Web-workspace | + Its structure supports high-quality content production when moderated (8, 9) | + It combines user needs and open policy practice (8, 9) | + It offers easy access to and an archive of materials (16, 21, 23, 26) | + User needs guided the functions developed (8) | - | ? It offers a place to document shared understanding and distribute information broadly |
| Structured discussion | + It helps to moderate discussion and discourages low-quality contributions (2, 30) | + It guides focus on important topics (16, 30) | - | ? Threads help to focus reading | + User feedback has been positive: it helps to focus on key issues (8, 30) | ? Structure discourages redundancy |
| Open policy ontology | - | + It gives structure to insight networks and structured discussions (8, 16, 30) | - | ? Ontology clarifies issues and relations | - | - |
| Value profile and archetype | - | + Value profiles help to prioritise (8) | - | ? Voting advice applications may offer an example | ? Stakeholders’ values are better heard | ? Archetypes are effective summaries |
| Paradigm | ? It motivates clear reasoning | ? It systematically describes conflicting reasonings | - | - | ? Stakeholders’ reasonings are better heard | ? It helps to analyse inferences of different groups |
| Analysis of destructive policies | - | + It widens the scope (3, 8) | - | ? It emphasises mistakes to be avoided | ? Focus is on everyone’s problems | ? Lessons learned can be reused in other decisions |
| Suggestions by open policy practice | Work openly and invite criticism; use tools and moderation to encourage high-quality contributions | Acknowledge the need for and potential of co-creation, discussion and revised scoping; invite everyone to policy-support work; characterise the setting | Design processes and information to be open from the beginning; use open web-workspaces | Invite participation from the problem owner and user groups early on; use user feedback to visualise, clarify and target content | Be open; clarify reasoning; acknowledge disagreements; use the test of shared understanding | Combine information production, synthesis and use into a co-creation process to save time and resources; use shared information objects with an open licence, e.g. knowledge crystals |
In each cell, an actual benefit observed in the open policy practice materials is marked with '+', a potential benefit with '?' and no anticipated benefit with '-'. Numbers in parentheses refer to the assessments in Additional file 1: Appendix S1, Table S1-1. The last row contains general suggestions for improving policy support with respect to each property