
Just follow the science: A government response to a pandemic.

Mathew Mercuri PhD1,2.   

Abstract

Year:  2020        PMID: 33043527      PMCID: PMC7675691          DOI: 10.1111/jep.13491

Source DB:  PubMed          Journal:  J Eval Clin Pract        ISSN: 1356-1294            Impact factor:   2.431


In response to questions about policy decisions related to the coronavirus, government officials around the world have invoked the importance of science. Their decisions, we are told, would “follow the science,” or something similar (eg, be “based on,” “led by,” “guided by”). As a practicing scientist, I would appear remiss to suggest that such a practice is not a good thing. I do not wish to downplay the value of science or promote an anti‐science agenda. However, a decision by government to emphasize the importance of science in policy is not without concerns. For one, it raises pragmatic (and perhaps philosophical) questions about what gets included as “science” and what it means to make a decision that is based on science. Perhaps an equally important question is “why?” That is, on what basis should science be given special attention in decisions about how we live and about social policy? An examination of such questions might provide insight into the role of science in policy and why it has now become so important to government.
From the beginning of the pandemic, the government appeal to “science” has been seemingly restricted to health fields, in particular virology, immunology, clinical medicine, epidemiology, and public health. The special privilege given to health fields might be seen in the daily press briefings, where a member of the scientific team producing the science in question (usually a model of the expected virus spread in a population; more on that later) might be called upon to comment or stand beside the government officials as they engage the media and public. Seemingly absent from the conversation and press briefings were (and are) experts and data from other fields, such as sociology, behavioural science, and economics.
The absence of such experts does not entail a lack of importance in government decision making on pandemic response, although the role of those disciplines is less clear: it is much easier to understand the role of epidemiology studies in the decision to lock down the population than that of economics or sociology, especially when data from the former are explicitly cited and members of that community are given visibility. An appeal to the health sciences might seem appropriate given the pandemic can be situated as ultimately a health issue, but doing so does have other implications, which I will briefly touch upon later. Occasionally, the public discourse might cite another field of science, such as physics (eg, dynamics of aerosols), although standards of evidence might still rest on those of the favoured fields unless it becomes politically untenable to do so. For example, consider the issue of face coverings as a means to mitigate virus transmission in public. Early on, mechanistic reasoning supported by laboratory studies or the physical sciences, evidence from observational studies on influenza, or small and/or non‐randomized studies on COVID‐19 were not considered an adequate basis for policy decisions on wide use of face coverings in public spaces. Rather, when government and public health officials were claiming such a strategy was not needed or that there was little data to support the use of face coverings outside of health care settings, they were appealing to a dearth of clinical trials and inconclusive evidence from systematic reviews (both staples of “good science” in the clinical medicine evidence base). Only when the political climate changed (and/or when personal protective equipment supplies became more secure), it would seem, did we see a shift toward an appeal to evidence derived from basic science or from “less rigorous” sources (those often shunned by practitioners of evidence‐based medicine).
We were often told that the “science was evolving,” but it seems that is more a rhetorical tactic to preserve the narrative that decisions are driven by science than a fact of the science itself. Models describing potential spread of the virus and the expected impact of mitigation strategies received much attention from governments and media in the early months of the pandemic. Such models appear to have been driving the government response in several jurisdictions, especially with respect to initiating lockdown policies in order to curb expected case growth and preserve hospital surge capacity. Models contain several elements that lay people associate with science: they were heavy on math, produced by academics, published in peer‐reviewed journals, and included several graphs and references to the literature. In short, they look like science. However, those elements do not make something science. While philosophers of science have long debated what makes something science, such models fail on several other criteria that are often associated with science. For example, presented models are not (and perhaps, cannot be) tested against experience in a controlled manner. Without such controls or testing against experience, it is difficult to hold models accountable. Accountability, in the form of testability, falsifiability, or independent assessment/reproduction of results, for example, is a commonly recognized characteristic of science. Models are theoretical. I suspect many would consider theoretical work to be scientific work. However, the purpose of theoretical work, for example in physics, is to lay a foundation for so‐called “laws of nature” that can be tested in empirical study. The assumptions that scaffold pandemic models are often known to be dynamic and susceptible to all kinds of difficult-to-predict social forces, unlike assumptions in physics that refer to what, to some extent, are thought to be stable features of the natural world.
Furthermore, unlike physics, presented pandemic models seem to be developed for reasons of decision making and not to support efforts to test in empirical study. Pandemic models are not scientific findings; they are ideas about what might happen under various scenarios or conditions. If “following the science” refers to basing policy on models, then government policy is only science based insofar as presented models are considered science. Modelling may be an instantiation of scientific reasoning, and thus, “following the science.” Where models become problematic in driving policy is when the chosen assumptions and outcomes do not align with the needs of the population, its values, and/or, with respect to what I am discussing in this paper, the public's perception of science and its membership. Consider two highly visible models: one produced by researchers at Imperial College, and another by those at the University of Toronto. Both models are heavily reliant on several assumptions related to properties of the virus and population dynamics (primarily based on limited empirical findings from clinical studies), and both focus on virus related cases, hospitalizations, and deaths as outcomes. Putting aside questions about the validity of the assumptions, such models clearly consider one perspective, that is, what impact selected strategies will have on the virus directly. Absent is the impact of those strategies on non‐virus related health and healthcare access. Also absent is the impact of those strategies on other aspects of society, such as the economy, social services, food security, social inequities, cultural practices, social life, and the environment: issues that are presumably important to the public, are matters of public policy, and, in several cases, are themselves targets of scientific inquiry.
Coming back to the issue of the assumptions, several scientific disciplines outside of clinical medicine and health sciences can play a role in helping us determine the validity of claims in and stemming from the model. For example, sociology and behavioural psychology can inform on how people interact in a pandemic; physics can inform on how face coverings or air exchange technologies impact formation and movement of aerosols. Is it fair to say one is “following the science” if only a small group of scientists or disciplines is included in the conversation? Again, the prominence of these models in government and media discourse on the pandemic does not entail that data from studies produced by other disciplines were not important in the decision making process by government officials. Certainly, the decision to ease lockdown measures was not driven by the Imperial College and University of Toronto models, both of which suggest a negative impact on the chosen outcomes with reduced measures. However, the decision to ease lockdown measures, for example, could be made without any consideration given to the science (which may or may not exist for several important outcomes), nor is it clear to me how any science played a role or how various sources of data derived from different scientific disciplines were integrated when making policy. In contrast, it is much clearer how the decisions to close schools and implement lockdowns were based on such models. If the science was being “followed” by government officials in making policy about how to respond to the pandemic, it was seemingly, for the most part, one kind of science and from one part of the science community.
Let us for a moment put aside the appropriateness of the claim that decisions “followed the science.” Paul Feyerabend, a prominent 20th century philosopher of science, once asked, “should the sciences be given the run of our educational institutions and of society as a whole or should they be treated like any other special interest group?” (p127). Certainly, a society is free to make policy decisions using any criteria its members choose. A decision that follows or is based on science does not entail a good decision or one that is better than what could be decided using something other than science. For example, it would be arrogant to suggest that pre‐science governments did not make good policy decisions for their society. It is perhaps ironic that several of the interventions we have used to respond to the virus (and which we believe are good decisions), such as quarantine and closing borders, were implemented by governments in response to disease outbreaks in a time well before (and irrespective of) the practice of science as we know it today. What is it that makes science special such that governments are now so careful to point out its importance in decision making? I can think of a few reasons. For one, a lay view of science is that it is objective and apolitical; that it is concerned with the facts and not with values. On that view, an appeal to science would empty policy decisions of subjectivity and party values. An appeal to science also has the political benefit of delegating responsibility to the science community, thus freeing government officials (they hope!) of accountability (ie, “it is not our fault, the science was bad”). Unfortunately, such views overlook the subjective and political nature of science (and facts).
For example, what a scientist decides to measure and how are both inherently subjective and framed by her values and experiences (both as a scientist and as a member of the public), as are observations and the interpretation of results, how a report is framed, where to publish, etc. A practical concern that might undermine the utility of the appeal to science is that scientists engage in studies that focus on technical questions of their field, which may or may not be relevant to or comprehensive of the issue at hand for the government. As Baruch Fischhoff, a highly respected researcher on risk and decision making, suggests, “scientists do not normally address decision makers' needs directly” (p140). The politician (or the expert delegated with such responsibility) must bridge the gap between what the studies show and how that relates to the issue or warrants the policy to be implemented. Science is subjective and political, and so is the judgement used in translating scientific findings to practice in dealing with real world problems and the decision to base policy on findings from scientific studies. Equally important, a decision to “follow the science” might overlook the importance of other considerations in decision making that are not within the purview of science (eg, cultural imperatives or religious values) or those that are not or have yet to be informed by scientific study. Science is a powerful tool, and what is derived from scientific study certainly has value in informing policy on how to respond to the current pandemic. However, as Nickson et al suggest in their report regarding the UK government's early response to the coronavirus pandemic, “Science advice should inform, not make, policy” (p7). Science does not have all the answers, and what answers it has produced regarding this pandemic are often preliminary and/or indirect with respect to the target policy.
That is not a fault of science—there are few historical precedents where the nature of a phenomenon was learned so quickly. When governments claim they are “following the science” it should be made clear how they are doing so and what the limitations are. A failure to do so can potentially lead to an erosion of trust in science, in particular when the science is portrayed (intentionally or unintentionally) by officials as objective or definitive, such as was often the case when governments (and in many cases, scientists themselves when engaging media) projected infection rates under various scenarios based on models. A simple extrapolation of research findings is rarely warranted due to a lack of fidelity between the study conditions and the real world. As such, policy decisions are complex activities, where judgement and deliberation play a key role. Thus, we might do better to embrace the fact that human judgement is involved in any policy decision‐making process than to rely on vague appeals to science. Making such judgement explicit when engaging the public could go far to engender trust, both in the political process and in science. We should also be vigilant to avoid letting the scientific evidence that does exist overwhelm other considerations that may warrant a change in policy, or letting one group of scientists dictate the discussion. As Feyerabend claims, “the sciences do not have the last word in humane matters, knowledge included” (p127). In a democratic society, the last word on such matters ought to be that of the people.
References:  9 in total

1.  Making Decisions in a COVID-19 World.

Authors:  Baruch Fischhoff
Journal:  JAMA       Date:  2020-07-14       Impact factor: 56.272

2.  Mathematical modelling of COVID-19 transmission and mitigation strategies in the population of Ontario, Canada.

Authors:  Ashleigh R Tuite; David N Fisman; Amy L Greer
Journal:  CMAJ       Date:  2020-04-08       Impact factor: 8.262

3.  Community Use Of Face Masks And COVID-19: Evidence From A Natural Experiment Of State Mandates In The US. (Review)

Authors:  Wei Lyu; George L Wehby
Journal:  Health Aff (Millwood)       Date:  2020-06-16       Impact factor: 6.301

4.  Face masks for the public during the covid-19 crisis.

Authors:  Trisha Greenhalgh; Manuel B Schmid; Thomas Czypionka; Dirk Bassler; Laurence Gruer
Journal:  BMJ       Date:  2020-04-09

5.  Physical distancing, face masks, and eye protection to prevent person-to-person transmission of SARS-CoV-2 and COVID-19: a systematic review and meta-analysis.

Authors:  Derek K Chu; Elie A Akl; Stephanie Duda; Karla Solo; Sally Yaacoub; Holger J Schünemann
Journal:  Lancet       Date:  2020-06-01       Impact factor: 79.321

6.  Cloth Masks May Prevent Transmission of COVID-19: An Evidence-Based, Risk-Based Approach.

Authors:  Catherine M Clase; Edouard L Fu; Meera Joseph; Rupert C L Beale; Myrna B Dolovich; Meg Jardine; Johannes F E Mann; Roberto Pecoits-Filho; Wolfgang C Winkelmayer; Juan J Carrero
Journal:  Ann Intern Med       Date:  2020-05-22       Impact factor: 25.391

7.  Lessons from the history of quarantine, from plague to influenza A.

Authors:  Eugenia Tognotti
Journal:  Emerg Infect Dis       Date:  2013-02       Impact factor: 6.883

8.  COVID-19 policy measures-Advocating for the inclusion of the social determinants of health in modelling and decision making.

Authors:  J Cristian Rangel; Sudit Ranade; Penny Sutcliffe; Eric Mykhalovskiy; Denise Gastaldo; Joan Eakin
Journal:  J Eval Clin Pract       Date:  2020-06-21       Impact factor: 2.336

9.  Face coverings for the public: Laying straw men to rest.

Authors:  Trisha Greenhalgh
Journal:  J Eval Clin Pract       Date:  2020-05-26       Impact factor: 2.336

Citing articles:  7 in total

1.  The brave new world of pandemic resilience.

Authors:  Mathew Mercuri; Brian Baigrie
Journal:  J Eval Clin Pract       Date:  2022-02-14       Impact factor: 2.336

2.  How Good is the Science That Informs Government Policy? A Lesson From the U.K.'s Response to 2020 CoV-2 Outbreak.

Authors:  Jessica Cooper; Neofytos Dimitriou; Ognjen Arandjelovíc
Journal:  J Bioeth Inq       Date:  2021-10-14       Impact factor: 2.216

3.  Follow *the* science? On the marginal role of the social sciences in the COVID-19 pandemic.

Authors:  Simon Lohse; Stefano Canali
Journal:  Eur J Philos Sci       Date:  2021-10-22       Impact factor: 1.602

4.  From "getting things right" to "getting things right now": Developing COVID-19 guidance under time pressure and knowledge uncertainty.

Authors:  Marjolein Moleman; Fergus Macbeth; Sietse Wieringa; Frode Forland; Beth Shaw; Teun Zuiderent-Jerak
Journal:  J Eval Clin Pract       Date:  2021-10-06       Impact factor: 2.336

5.  Scientific publication speed and retractions of COVID-19 pandemic original articles.

Authors:  Luisa Schonhaut; Italo Costa-Roldan; Ilan Oppenheimer; Vicente Pizarro; Dareen Han; Franco Díaz
Journal:  Rev Panam Salud Publica       Date:  2022-04-12

6.  A pandemic of nonsense.

Authors:  Mathew Mercuri
Journal:  J Eval Clin Pract       Date:  2022-08-04       Impact factor: 2.336

7.  The politics of the surgical mask: Challenging the biomedical episteme during a pandemic.

Authors:  Shane Neilson
Journal:  J Eval Clin Pract       Date:  2021-06-16       Impact factor: 2.336

