Abstract
The histories of selected public and environmental hazards, from the first scientifically based early warnings about potential harm to the subsequent precautionary and preventive measures, have been reviewed by the European Environment Agency. This article relates the "late lessons" from these early warnings to the current debates on the application of the precautionary principle to the hazards posed by endocrine-disrupting substances (EDSs). Here, I summarize some of the definitional and interpretative issues that arise. These issues include the contingent nature of knowledge; the definitions of precaution, prevention, risk, uncertainty, and ignorance; the use of differential levels of proof; and the nature and main direction of the methodological and cultural biases within the environmental health sciences. It is argued that scientific methods need to reflect better the realities of multicausality, mixtures, timing of dose, and system dynamics, which characterize the exposures and impacts of EDSs. This improved science could provide a more robust basis for the wider and wise use of the precautionary principle in the assessment and management of the threats posed by EDSs. The evaluation of such scientific evidence requires assessments that also account for multicausal reality. Two of the often used, and sometimes misused, Bradford Hill "criteria," consistency and temporality, are critically reviewed in light of multicausality, thereby illustrating the need to review all of the criteria in light of 40 years of progress in science and policymaking.
Year: 2006 PMID: 16818262 PMCID: PMC1874173 DOI: 10.1289/ehp.8134
Source DB: PubMed Journal: Environ Health Perspect ISSN: 0091-6765 Impact factor: 9.031
Figure 1. Knowing and not knowing: a dynamic expansion.
Toward a clarification of key terms.
| Situation | State and dates of knowledge | Examples of action |
|---|---|---|
| Risk | Known impacts; known probabilities, e.g., asbestos | Prevention: action taken to reduce known hazards, e.g., eliminate exposure to asbestos dust |
| Uncertainty | Known impacts; unknown probabilities, e.g., antibiotics in animal feed and associated human resistance to those antibiotics | Precautionary prevention: action taken to reduce exposure to potential hazards |
| Ignorance | Unknown impacts and therefore unknown probabilities, e.g., the surprise of chlorofluorocarbons (CFCs), pre-1974 | Precaution: action taken to anticipate, identify, and reduce the impact of surprises |
From “Late Lessons” (EEA 2001).
Different levels of proof for different purposes: some examples and illustrations.
| Probability (%) | Quantitative descriptor | Qualitative descriptor | Illustrations |
|---|---|---|---|
| 100 | Very likely (90–99%) | Statistical significance | Part of strong scientific evidence for causation |
| | | Beyond all reasonable doubt | Most criminal law and the Swedish Chemical Law 1973 ( |
| | Likely (66–90%) | Reasonable certainty | |
| | | Sufficient scientific evidence | To justify a trade restriction designed to protect human, animal, or plant health under the World Trade Organisation (WTO) Sanitary and Phytosanitary (SPS) Agreement, Article 2.2 ( |
| | Medium likelihood (33–66%) | Balance of evidence | |
| | | Balance of probabilities | Much civil and some administrative law |
| | | Reasonable grounds for concern | European Commission communication on the precautionary principle ( |
| | | Strong possibility | British Nuclear Fuels occupational radiation compensation scheme (20–50% probabilities triggering different awards; ≥50% triggers full compensation) ( |
| | Low likelihood (10–33%) | Scientific suspicion of risk | Swedish Chemical Law 1973 ( |
| | | Available pertinent information | To justify a provisional trade restriction under the WTO SPS Agreement, Article 5.7, where scientific information is insufficient ( |
| | Very unlikely (1–10%) | Low risk | Household fire insurance |
| 0 | | Negligible and insignificant | |
Probability bands based on IPCC (2001).
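The probability bands in the table follow the IPCC (2001) likelihood scale. As a minimal sketch (the function name is hypothetical, and the labels are taken from the table above), a numeric probability can be mapped to its quantitative descriptor like this:

```python
def likelihood_band(p: float) -> str:
    """Map a probability in [0, 1] to the IPCC (2001)-style likelihood
    descriptor used for the probability bands in the table above.
    Illustrative sketch; band labels copied from the table."""
    if not 0.0 <= p <= 1.0:
        raise ValueError("probability must be in [0, 1]")
    # Bands are checked from the highest lower bound downward.
    bands = [
        (0.90, "Very likely (90-99%)"),
        (0.66, "Likely (66-90%)"),
        (0.33, "Medium likelihood (33-66%)"),
        (0.10, "Low likelihood (10-33%)"),
        (0.01, "Very unlikely (1-10%)"),
    ]
    for lower, label in bands:
        if p >= lower:
            return label
    return "Negligible and insignificant"
```

The point of the table, which the sketch makes concrete, is that different institutions act at very different points on this scale: criminal law near the top, precautionary provisions such as WTO SPS Article 5.7 much lower down.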
Figure 2. A framework for risk analysis and hazard control.
Twelve late lessons.

Identify/clarify the framing and assumptions:
1. Manage risk, uncertainty, and ignorance
2. Identify/reduce “blind spots” in the science
3. Assess/account for all pros and cons of action/inaction
4. Analyze/evaluate alternative options
5. Take account of stakeholder values
6. Avoid “paralysis by analysis” by acting to reduce hazards via the precautionary principle

Broaden assessment information:
7. Identify/reduce interdisciplinary obstacles to learning
8. Identify/reduce institutional obstacles to learning
9. Use lay, local, and specialist knowledge
10. Identify/anticipate real-world conditions
11. Ensure regulatory and informational independence
12. Use more long-term (i.e., decades-long) monitoring and research
Figure 3. CFCs: skin cancer and time lags (EEA 2001).
On being wrong: environmental and health sciences and their directions of error.
| Scientific studies | Some methodological features | Main direction of error |
|---|---|---|
| Experimental studies (animal laboratory) | High doses | False positive |
| | Short (in biological terms) range of doses | False negative |
| | Low genetic variability | False negative |
| | Few exposures to mixtures | False negative |
| | Few fetal–lifetime exposures | False negative |
| | High-fertility strains | False negative (developmental/reproductive end points) |
| Observational studies (wildlife and humans) | Confounders | False positive |
| | Inappropriate controls | False positive/negative |
| | Nondifferential exposure misclassification | False negative |
| | Inadequate follow-up | False negative |
| | Lost cases | False negative |
| | Simple models that do not reflect complexity | False negative |
| Both experimental and observational studies | Publication bias toward positives | False positive |
| | Scientific/cultural pressure to avoid false positives | False negative |
| | Low statistical power (e.g., from small studies) | False negative |
| | Use of the 5% probability level to minimize the chance of false positives | False negative |
Some features can go either way (e.g., inappropriate controls), but most of the features err mainly in the direction shown in this table.
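Two of the table's rows, low statistical power and the conventional 5% significance level, can be illustrated with a small Monte Carlo sketch. This is not from the article: the two-sample z-test, effect size, and sample sizes below are illustrative assumptions, chosen only to show that when a real effect exists, small studies tested at the 5% level miss it most of the time (i.e., they err toward false negatives).

```python
import math
import random

def false_negative_rate(n: int, delta: float = 0.5, sigma: float = 1.0,
                        trials: int = 2000, seed: int = 1) -> float:
    """Monte Carlo estimate of the false-negative rate of a two-sided,
    two-sample z-test (known sigma) at the 5% level, when a real effect
    of size `delta` exists. Illustrative sketch; all parameters are
    assumptions, not values from the article."""
    rng = random.Random(seed)
    z_crit = 1.959964  # two-sided 5% critical value of the standard normal
    se = sigma * math.sqrt(2.0 / n)  # standard error of the mean difference
    misses = 0
    for _ in range(trials):
        exposed = [rng.gauss(delta, sigma) for _ in range(n)]
        control = [rng.gauss(0.0, sigma) for _ in range(n)]
        diff = sum(exposed) / n - sum(control) / n
        if abs(diff) / se <= z_crit:  # real effect, but not "significant"
            misses += 1
    return misses / trials
```

With these assumed parameters, studies of n = 10 per group miss the effect far more often than studies of n = 100 per group, which is the table's point: designs with low power, evaluated against a threshold built to avoid false positives, systematically err in the false-negative direction.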