Peg Allen, Jean C O'Connor, Leslie A Best, Meenakshi Lakshman, Rebekah R Jacob, Ross C Brownson.
Abstract
BACKGROUND: Research shows that training can improve the skills needed for evidence-based decision making, but less is known about instituting organizational supports to build capacity for evidence-based chronic disease prevention. COMMUNITY CONTEXT: This case study assessed facilitators of and challenges to applying management practices that support evidence-based decision making in chronic disease prevention programs in Georgia's public health system through key informant interviews, and quantitatively tested for changes in perceived management practices and skills through a pre-post survey.
Year: 2018 PMID: 30004862 PMCID: PMC6053922 DOI: 10.5888/pcd15.170482
Source DB: PubMed Journal: Prev Chronic Dis ISSN: 1545-1151 Impact factor: 2.830
Figure 1. Organizational framework of the Chronic Disease Prevention Section, Georgia Department of Public Health, 2017. Abbreviations: BRFSS, Behavioral Risk Factor Surveillance System; YRBS, Youth Risk Behavior Survey; ATS, Adult Tobacco Survey; YTS, Youth Tobacco Survey.
Figure 2. Framework for training in public health evidence-based decision making. Source: Brownson et al (1).
Timeline of Steps to Support and Assess Evidence-Based Chronic Disease Prevention in Georgia, 2013–2017
| Date | Chronic Disease Prevention Section Activity | PRC-St. Louis EBDM Training, Support, and Assessment |
|---|---|---|
| Fall 2013 | CDPS director hired to move CDPS toward coordinated chronic disease approach | — |
| Spring 2014 | CDPS reorganization launched to promote coordinated approach | CDPS enrolled into EBDM study |
| June–July 2014 | — | Baseline pretraining survey of CDPS staff members and partners conducted by R.R.J. |
| July 2014 | Strategic Direction for Chronic Disease Prevention: 2014–2019 published by GDPH | — |
| August 18–21, 2014 | EBDM 3½-day training provided in Atlanta with CDPS staff and GDPH epidemiologists | August 2014 EBDM training provided by R.C.B. and other course faculty |
| September 2014 | EBDM training attendees provided input for next steps selected by CDPS management team | PRC-St. Louis–provided Qualtrics survey for prioritization input on steps brainstormed at August 2014 training |
| October 2014 | Staff meetings reorganized to incorporate EBDM and sharing of information across programs | Monthly collaborative calls of PRC-St. Louis and CDPS started for encouragement and support |
| November 2014 | Statewide Chronic Disease Council advisory body of 24 leaders from diverse sectors launched by CDPS | — |
| November 2014 and January 2015 | Summary presentations created by staff in each CDPS program for program review, revision, and communication to partners | Review of programs by L.A.B., P.A., R.R.J. |
| January 2015 | CDPS commitment to follow science and EBDM processes, including incorporation of EBDM in CDPS manager and program staff annual performance plans | — |
| January–May 2015 | 20 new CDPS staff members, many with MPH degrees and PhDs, hired and brought on board | — |
| May 2015 | CDPS annual meeting held with local health district chronic disease managers | — |
| May 2015–December 2016 | Statewide health assessment and health improvement plan led by CDPS as part of GDPH’s accreditation preparations | Supplementary 2-session webinar training, Using Data Sources for Public Health Practice, provided to CDPS and other GDPH staff |
| June 2015 | CDPS website relaunch completed, with posting of chronic disease data and program information, logic models, evaluation plans and reports, and links to resources to enhance partner access to information for evidence-based public health practice | — |
| July 2015 | Chronic Disease University monthly webinar series launched with J.C.O.’s EBDM overview | R.C.B. and P.A. contributed slides to J.C.O.’s introductory EBDM overview for the series |
| August 2015 | STAR site visit with the National Association of Chronic Disease Directors took place, with feedback to identify strengths and ways CDPS could improve | — |
| Fall 2015 | A plan for instituting the STAR recommendations developed by the CDPS leadership team | — |
| April–May 2016 | — | Post-training survey of CDPS staff and partners conducted by P.A. and M.L. |
| May 2016 | CDPS annual meeting held with local health district chronic disease managers | — |
| May–June 2016 | — | 11 post-training interviews conducted by L.A.B. and P.A. |
| August 2016 | Retreat held by CDPS leadership team to identify continuing implementation actions | — |
| Winter 2016–Spring 2017 | CDPS participated in a STAR follow-up site visit and identified follow-up items, including staff survey; CDPS facilitated GDPH enrollment in the Public Health Digital Library, providing all staff with journal access via the GDPH intranet | Quantitative data management, analyses by P.A.; qualitative coding, analyses by P.A., M.L. |
| May 2017 | CDPS annual meeting with local health district chronic disease managers; CDPS staff member wins the departmental award for excellence in science | — |
| September 2017 | All staff surveyed on organizational culture and opportunities for improvement; CDPS held an all-staff strategic planning retreat to evaluate progress; CDPS leadership team held a retreat to identify continuing implementation actions | — |
Abbreviations: CDPS, Chronic Disease Prevention Section; EBDM, evidence-based decision making; GDPH, Georgia Department of Public Health; PRC-St. Louis, Prevention Research Center in St. Louis, Brown School, Washington University in St. Louis; STAR, State Technical Review and Assistance Program provided by the National Association of Chronic Disease Directors on behalf of Centers for Disease Control and Prevention.
Descriptions of Management Practices Instituted to Support Evidence-Based Decision Making, Provided in 5 Interviews of CDPS Staff Members, 2016
| Domain | Management Practice | Sample of Quotes That Describe the Management Practice | Additional Comments |
|---|---|---|---|
| Leadership support in CDPS | Role modeled EBDM | Bringing together the art and science has raised our awareness of how evidence contributes to strengthening our programs and our collective work and that is something very different that the chronic disease director has done. | — |
| | Emphasized and expected EBDM | I think EBDM started with the expectation that it was part of your job, your job description, and part of your goals that you needed to meet every year. I think that was probably the most effective way to support EBDM. | Expectations communicated through repetition, both verbally and through an internal review process of program plans. |
| | Supported and protected staff | And protect them and support them when they are doing the right thing, but are being given a hard time by other people . . . addressing the safety issues for people about making some of these changes. | — |
| | Incorporated EBDM in staff meetings | I would often see staff in meetings with staff from a different unit, talking about a program or initiative, and that was the biggest change, seeing that interaction . . . working together on an initiative or an evaluation. | — |
| | Provided supportive tools | Providing “monthly webinars” [with posted slides], “making fact sheets and putting it up on our website, providing them [partners] with those materials.” | Posted chronic disease indicators and data source descriptions on section webpages. Collaborated with a university library to increase access to full-text journal articles. |
| Restructuring of CDPS | Restructured section by function | It has forced programs and forced people who have similar risk factors to deal with from a disease category to work together . . . so our current configuration has broken the silos. | — |
| | Restructured programs | We have completely redone some of our programs. | Restructured programs to prioritize evidence-based policy, environmental, and systems approaches. |
| | Restructured program planning | Shifted how we do program planning here . . . now the first question is what does the Community Guide say . . . the literature. | Changed program planning processes to be evidence-driven and assigned evaluators to help with planning. |
| Workforce development | Hiring qualified staff | We are hiring, everyone has at least a master’s degree or an MPH. And people are coming with a lot of experience. | — |
| | Job descriptions | All staff who have a programmatic job do have a science component to their job description . . . a requirement that they use the literature, the evidence, know how to cite and refer to those kinds of sources, and apply them in their work. | — |
| | Performance reviews | Performance management plan for the year . . . requires them to present or submit abstracts, for example, to conferences and so helps to promote the use of evidence-based processes . . . more than 20 abstracts were accepted and presented at different conferences. | — |
| | Chronic Disease Webinar Series | To showcase their work to other staff and external partners . . . a time for staff to talk about a facet of evidence-based public health and how it impacts their program. | — |
| | New employee orientation to EBDM | We made sure that all the staff that came on board after that training [EBDM] were exposed or given the opportunity to go and train. | EBDM course slides posted on staff intranet. Trained staff discussed the course and shared materials, and a few new staff attended the national Brownson EBDM course. |
| | External training opportunities | That plan [annual employee performance management plan] also, for most of our programmatic staff, requires them to present or submit abstracts, for example, to conferences, and so helps to promote the use of EBDM processes. | Staff encouraged to attend external trainings. If staff present at a conference, they can then attend the full conference as part of their continued learning. |
| | Evaluation training series | To explain what evaluation is and how it should be married to program development and how we should be evaluating in our partnerships. | Program logic models and evaluation plans posted as well. |
| Organizational climate | Acceptance of EBDM expectations | [After initial mixed views], the culture was that people accepted it. They understood it was something we all had heard about in grad school, we had been introduced to it, but now it was time to practice it. | — |
| | A pull to and away from EBDM | We do have a little bit of tension in that our CDC-funded work really requires evidence-based approaches . . . and then on the other hand we get requests . . . that aren’t evidence-based. | — |
| Relationships and partnerships | Participatory decision making | We present them [funded partners] with a list of options . . . so we have discussions about that . . . and it can go back and forth for some time . . . we try not to dictate what it is that they have to do. | — |
| Financial practices | Performance-based contracting | For any contract that we’re going to put out there for a program . . . the outcomes and objectives have to be based in the evidence. | — |
| | Transparency | [CDPS] has made a great effort to be as transparent as [CDPS director] can be with budget issues, with programming issues, and I really appreciate that . . . as transparent as possible with the requests for proposals, the timelines, the timeframes that we need to get things done. | — |
Abbreviations: CDC, Centers for Disease Control and Prevention; CDPS, Chronic Disease Prevention Section of the Georgia Department of Public Health; EBDM, evidence-based decision making.
Changes From Baseline to Post-Training in Skill Gaps in Evidence-Based Decision Making, Use of Research Evidence, and Organizational Supports, CDPS Staff Members (n = 30) and Staff Members From Partnering Organizations (n = 44), 2014–2016^a
| Survey Item | CDPS Staff (n = 30): Baseline Mean (SD) | CDPS Staff: Post-Training Mean (SD) | P Value | Partners^b (n = 44): Baseline Mean (SD) | Partners: Post-Training Mean (SD) | P Value |
|---|---|---|---|---|---|---|
| Skill gaps in EBDM^c | | | | | | |
| Prioritization | 2.2 (2.4) | 1.0 (1.2) | .01 | 1.2 (2.0) | 0.9 (1.3) | .41 |
| Adapting interventions | 2.4 (2.3) | 1.1 (1.9) | .005 | 1.4 (2.4) | 1.4 (2.4) | .99 |
| Quantifying the issue | 1.1 (2.1) | 1.1 (2.3) | .94 | 0.9 (2.2) | 0.8 (2.2) | .82 |
| Evaluation designs | 1.6 (2.4) | 0.9 (1.7) | .23 | 1.7 (2.2) | 1.0 (2.0) | .13 |
| Quantitative evaluation | 1.3 (1.8) | 0.7 (1.9) | .10 | 1.2 (1.8) | 1.3 (1.9) | .69 |
| Qualitative evaluation | 1.8 (2.6) | 0.7 (1.9) | .04 | 1.6 (2.7) | 1.4 (2.1) | .74 |
| Economic evaluation | 3.3 (3.4) | 3.2 (2.8) | .96 | 2.0 (2.2) | 2.3 (3.0) | .56 |
| Action planning | 1.5 (1.9) | 0.6 (1.0) | .01 | 0.9 (1.5) | 0.9 (1.4) | .95 |
| Community assessment | 1.7 (1.6) | 1.1 (1.2) | .15 | 1.0 (1.6) | 1.6 (2.3) | .16 |
| Communicating research to policymakers | 1.9 (2.6) | 1.6 (2.1) | .57 | 1.7 (2.0) | 1.7 (2.3) | .87 |
| Overall (10-item sum) | 19.0 (17.1) | 11.7 (11.9) | .02 | 13.8 (15.7) | 13.5 (15.0) | .94 |
| Use of research evidence^d | | | | | | |
| Write a grant application | 2.7 (0.6) | 2.8 (0.5) | .77 | 2.6 (0.6) | 2.7 (0.5) | .79 |
| Plan or conduct a needs assessment | 2.6 (0.6) | 2.5 (0.6) | .34 | 2.8 (0.4) | 2.6 (0.6) | .14 |
| Select an intervention | 2.6 (0.6) | 2.7 (0.4) | .20 | 2.8 (0.4) | 2.8 (0.4) | .71 |
| Justify intervention selection to funders and leadership | 2.4 (0.8) | 2.9 (0.2) | .002 | 2.9 (0.4) | 2.7 (0.5) | .29 |
| Evaluate interventions | 2.7 (0.6) | 2.6 (0.7) | .54 | 2.8 (0.4) | 2.6 (0.5) | .03 |
| Develop materials for partners | 2.5 (0.7) | 2.7 (0.5) | .21 | 2.8 (0.4) | 2.8 (0.4) | .53 |
| Organizational supports^e | | | | | | |
| My direct supervisor expects me to use EBDM | 5.4 (1.4) | 6.1 (1.1) | .006 | 5.5 (1.4) | 5.4 (1.4) | .74 |
| My direct supervisor recognizes the value of management practices that facilitate EBDM | 5.5 (1.5) | 5.9 (1.2) | .04 | 5.8 (1.2) | 5.3 (1.3) | .01 |
| My performance is partially evaluated on how well I use EBDM in my work | 4.1 (1.7) | 4.3 (1.7) | .52 | 4.9 (1.4) | 4.4 (1.4) | .04 |
| My work unit has access to current research evidence for EBDM | 5.1 (1.6) | 5.6 (1.4) | .18 | 5.7 (1.2) | 5.8 (1.0) | .71 |
| My work unit has the resources (eg, staff, facilities, partners) to support application of EBDM | 4.5 (1.5) | 5.2 (1.4) | .01 | 4.7 (1.7) | 4.7 (1.6) | .94 |
| The staff in my work unit has the necessary skills to carry out EBDM | 4.9 (1.6) | 5.6 (1.2) | .04 | 5.2 (1.5) | 5.2 (1.3) | .99 |
| Information is widely shared in my work unit for decision making | 4.9 (2.0) | 5.3 (1.6) | .12 | 5.6 (1.2) | 5.6 (1.1) | .91 |
| My work unit distributes intervention evaluation findings to other organizations | 5.0 (1.9) | 5.6 (1.6) | .09 | 5.4 (1.4) | 5.6 (1.4) | .58 |
| My agency is committed to hiring people with relevant training in core disciplines in public health | 5.0 (1.8) | 5.7 (1.5) | .01 | 5.4 (1.6) | 5.4 (1.2) | .99 |
Abbreviations: CDPS, Chronic Disease Prevention Section of the Georgia Department of Public Health; EBDM, evidence-based decision making; GDPH, Georgia Department of Public Health.
^a Baseline survey was conducted before training, in June and July 2014; of 124 potential participants (30 CDPS staff members and 94 partners) invited by email, 105 completed the baseline survey (84.7% response rate). Of those who completed the baseline survey, 74 (70.5%) completed the post-training survey in April and May 2016.
^b From other GDPH sections, district and local public health offices, universities, voluntary health agencies, community-based organizations, and other state agencies.
^c Calculated as the score for the perceived importance of the skill in the work unit minus the score for the perceived availability of resources for applying the skill in the work unit. Both importance and availability were scored on an 11-point Likert scale (0 = not important to 10 = very important; 0 = not available to 10 = very available). The survey question read, “Now, we would appreciate your help rating the importance and availability of each skill in the statements below. First, read the statements (skills in EBDM) below; then, use the first scale to rate the importance of each of the skills to you. Next, use the second scale to rate how available each skill is to you when you need it (either in your own skill set or among others in your agency).”
^d Participants provided responses on a 4-point Likert scale: 1 = seldom or never, 2 = sometimes, 3 = often, 4 = always.
^e Participants provided responses on a 7-point Likert scale: 1 = strongly disagree to 7 = strongly agree.
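The skill-gap scoring described above (perceived importance minus perceived availability, each rated 0–10, summed across the 10 EBDM skill items) can be sketched as follows. The ratings and item subset below are hypothetical illustrations, not study data:

```python
# Minimal sketch of the skill-gap scoring described above.
# Per item: gap = importance rating (0-10) minus availability rating (0-10).
# Overall score: sum of the per-item gaps across the 10 EBDM skill items.
# The ratings here are made-up example values, not data from the study.

def gap_score(importance: int, availability: int) -> int:
    """Per-item skill gap: perceived importance minus perceived availability."""
    return importance - availability

# Example ratings for 3 of the 10 EBDM skill items: (importance, availability)
ratings = {
    "Prioritization": (9, 6),
    "Adapting interventions": (8, 7),
    "Economic evaluation": (7, 3),
}

gaps = {skill: gap_score(imp, avail) for skill, (imp, avail) in ratings.items()}
overall = sum(gaps.values())  # with all 10 items, this is the 10-item sum

print(gaps)     # per-item gaps
print(overall)  # 3 + 1 + 4 = 8
```

A larger positive gap means a skill is seen as important but not readily available, which is why the declining means in the table indicate improvement.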