Dorie E Apollonio, Lisa A Bero.
Abstract
INTRODUCTION: Researchers advocating for evidence-informed policy have attempted to encourage policymakers to develop a greater understanding of research and researchers to develop a better understanding of the policymaking process. Our aim was to apply findings drawn from studies of the policymaking process, specifically the theory of policy windows, to identify strategies used to integrate evidence into policymaking and points in the policymaking process where evidence was more or less relevant.
Keywords: QUALITATIVE RESEARCH
Year: 2017 PMID: 28219958 PMCID: PMC5337675 DOI: 10.1136/bmjopen-2016-012738
Source DB: PubMed Journal: BMJ Open ISSN: 2044-6055 Impact factor: 2.692
Research findings were not relevant to the problem definition stream
| Subtheme | Illustrative quote |
| --- | --- |
| Legislators viewed problem definition as constituent driven | People will look at the evidence and say “Okay, that's all very fine, but my constituents want [this].” (P#04, legislator) |
| | My experience has been is that these decisions [about what is important] are seldom made on a study or review of the studies that's out there, on evidence that may be provided. It's based on a much simpler approach of gut or something someone read in an article or something like that. (P#15, legislator) |
| | I think [legislators] understand the concept of [evidence-informed policy], but they don't think that should be the only criterion. And the other criteria are their own personal beliefs, their own personal experience, something they heard about, what their father said worked one time, you know that kind of thing. So you always have to encounter that. (P#21, legislator/administrator) |
| Administrators also perceived problem definition as constituent driven | Legislators say to me “I deal with one constituent at a time…” The people I deal with don't care about numbers. (P#02, administrator) |
| | What I've come to find is that what really appeals to the legislators is anecdotal stuff. Passion. Rarely do I get a request from a legislative aide that asks, “What's the data on this?” Never do we get the question, “What's the research on this?” It's all about gut level feeling. That's what sells things in our legislature. (P#13, administrator) |
Appropriately presented research findings could be relevant in the policy alternatives stream
| Subtheme | Illustrative quote |
| --- | --- |
| Stories are used to make research compelling | It's important for people who use evidence, to understand that |
| | Now it helps to have, in addition to your statistical evidence |
| | I think most legislators are reasonable people if you can try to relate to them and get them to understand “this could be me”. (P#14, administrator) |
| | The health department would always come to me with… the research. And I always had to tell them, “Look you understand, if some jerk in the legislature has one anecdote that goes directly against this I could lose it.” The anecdote that tells you what happened to a person, only a person is just very powerful. So I used to make them go look for anecdotes on their side. The researcher thought I was a nut. |
| | It's good to combine things. You know, you take data that's good data and then you back it up with a human face on it. Because then you've got the logic and compassion going for you as a part of the argument. And I think that combination is powerful. (P#17, legislator) |
| Numbers are not persuasive | [Legislators] tend not to want to do numbers… when I testify I watch the eyes; use numbers, and they glaze. Because if you think about the background of most [legislators], they're not science people, they're mostly non-science, non-mathematicians, non-engineers. And so when I talk about five parts per billion, they have no concept. If you say one grain of sand on the beach at Waikiki, they kind of get it. (P#02, administrator) |
| | Most legislators don't understand cause and correlation. They don't have any clue about statistics. (P#03, legislator) |
| Simplified study assessment guidelines can guide decision making | [S]ometimes we answer fire with fire. We say “That's a great article, it's a great subject, we think that we would love to research this topic, or see more data and evidence on this topic when you get it in a peer reviewed journal, in a controlled study.” Sometimes we get it out of the press. Reporters typically ask “Doctor so and so says he's doing this study.” [We say that] |
| | When you talk to [certain advocacy groups] generally their sources are themselves. That's when you know that they've cooked the data. We went through some [training that said] |
| | We've had to continually go back to, you know and each meeting we go through another set of interventions that people have come up with. I mean these families… come up with these studies where there's like five kids. They'll come to the meeting, here's this study and then we go through it. It's like, “Okay, how many children? |
| | What you've got to find is the inconsistencies in that article and rip it right in front of their eyes. That's the only thing you can do. In the past at least no matter what the administration you'd be able to go back and say “Look in this area the CDC says this is the best way to do it.” And people would just shut up at that point. But now it's harder [and] you've got to be able to attack stupid research that isn't research. (P#16, legislator) |
| | One of our standard responses when a company comes and asks us to cover something, |
| Using research can make some policy alternatives more credible | They may say “Okay, this is a policy we want to adopt, but we want to bounce this off of somebody that really knows how to analyse and find evidence.” So they can say “All right, here are the reviews that we looked at, here's the policy that we're articulating. What are the weak points? What level of confidence can we have if we move forward with this?” (P#18, administrator) |
| | [W]hen you look at a study, and |
Scientific research findings were desired but inadequate when seeking to influence the political feasibility stream
| Subtheme | Illustrative quote |
| --- | --- |
| Lack of relevant, timely studies prevented research findings from being useful | The problem is when they're making decisions like that they're making them under a gun, meaning something has happened or something is happening like a budget crisis. They want quick results and you don't get quick results trying to transform a system and base it on evidence. (P#15, legislator) |
| | The next challenge is really having the needs of policymakers driving the research. (P#18, administrator) |
| | [Existing systematic reviews] really can't be tied to any policy issue that you can find. They're driven by either funding from pharmaceutical companies, or an investigator's whim of what's interesting. They're not tied to a political or policy question that has import and that we need evidence for. (P#09, administrator) |
| | The departments of health in particular, states frequently buy into and pay for non-evidence-based treatment programmes. Somebody's got some small programme in some state or in some town that they believe anecdotally has been eminently successful. [They] sell that programme, when in fact the studies are so small, and so poorly documented, that looking at successes is absolutely, purely anecdotal. It has no scientific basis whatsoever, and yet states are buying into those kinds of programme because there are no good studies that are conducted. (P#11, legislator) |