
Assessing Organizational Supports for Evidence-Based Decision Making in Local Public Health Departments in the United States: Development and Psychometric Properties of a New Measure.

Stephanie Mazzucca, Renee G Parks, Rachel G Tabak, Peg Allen, Maureen Dobbins, Katherine A Stamatakis, Ross C Brownson.

Abstract

CONTEXT: Fostering evidence-based decision making (EBDM) within local public health departments and among local health department (LHD) practitioners is crucial for the successful translation of research into public health practice to prevent and control chronic disease.
OBJECTIVE: The purpose of this study was to identify organizational supports for EBDM within LHDs and determine psychometric properties of a measure of organizational supports for EBDM in LHDs.
DESIGN: Cross-sectional, observational study.
SETTING: Local public health departments in the United States.
PARTICIPANTS: Local health department practitioners (N = 376) across the United States participated in the study.
MAIN OUTCOME MEASURES: Local health department practitioners completed a survey containing 27 items about organizational supports for EBDM. Most items were adapted from previously developed surveys, and input from researchers and practitioners guided survey development. Confirmatory factor analysis was used to test and refine the psychometric properties of the measure.
RESULTS: The final solution included 6 factors comprising 22 items: awareness of EBDM (3 items), capacity for EBDM (7 items), resource availability (3 items), evaluation capacity (3 items), EBDM climate cultivation (3 items), and partnerships to support EBDM (3 items). This factor solution achieved acceptable fit (eg, Comparative Fit Index = 0.965). Logistic regression models showed positive relationships between the 6 factors and the number of evidence-based interventions delivered.
CONCLUSIONS: This study identified important organizational supports for EBDM within LHDs. Results of this study can be used to understand and enhance organizational processes and structures to support EBDM to improve LHD performance and population health. Strong measures are important for understanding how LHDs support EBDM, evaluating interventions to improve LHD capacity, and guiding programmatic and policy efforts within LHDs.


Year:  2019        PMID: 31348160      PMCID: PMC6614014          DOI: 10.1097/PHH.0000000000000952

Source DB:  PubMed          Journal:  J Public Health Manag Pract        ISSN: 1078-4659


Local health departments (LHDs) in the United States are critical to public health efforts focused on reducing the significant burden of chronic diseases and are responsible for implementing interventions to benefit the population's health. The 2800 LHDs across the country are well suited to support chronic disease reduction and prevention because they have a deep understanding of the local needs, context, and available resources within their communities.1,2 Local health department staff frequently deliver interventions directly to the community but also deliver them in collaboration with community partners using LHD staff and partner organization staff and volunteers. These partners work across health care (eg, hospitals or health care providers), nonprofit (eg, churches), government sectors (eg, parks and recreation departments), and private sectors (eg, worksites).3–5 Organizations such as the Institute of Medicine (now the National Academy of Medicine) have called for practitioners' efforts to be focused on implementing evidence-based interventions (EBIs) in their communities,6 which are defined broadly as programs, practices, processes, policies, and guidelines proven to be efficacious or effective.7 Resources have been developed to support the choice and implementation of EBIs (eg, the Community Guide8). However, there is a gap between the dissemination of EBIs and their implementation into public health practice,9 and efforts are needed to improve the uptake of EBIs. 
Evidence-based public health is an approach for improving population health that integrates research-tested interventions with community preferences10,11 that can be used by LHD practitioners to shrink the gap between research and practice.9,11,12 In a public health context, evidence refers to some type of data (eg, quantitative epidemiologic data, results from program or policy evaluations, and qualitative data) that are used to identify a problem, what should be done about the problem, how to implement the solution, and how to evaluate progress.10,13 A key piece of the evidence-based public health framework is evidence-based decision making (EBDM), defined as the process by which organizations choose and implement an EBI.14 Evidence-based decision making is characterized by several components: reviewing the best available peer-reviewed evidence, using data and information systems, applying program planning frameworks, engaging the community in assessment and decision making, conducting sound evaluation, disseminating findings to key stakeholders and decision makers, and synthesizing scientific and communication skills with common sense and political acumen as decisions are made.14 Many of these EBDM components are featured in national accreditation standards, illustrating their importance in the functioning and performance of a public health department.15 In addition, EBDM is closely aligned with the field of dissemination and implementation science, which aims to understand what processes and factors are associated with widespread use of an EBI and how EBIs are successfully integrated into usual practice in different settings (eg, community health clinics, LHDs).7 For EBDM to occur successfully, individual practitioners must have the required skills and abilities, for example, knowledge of EBIs or evaluation principles,16,17 which can be improved through training and capacity building at the individual practitioner level.18 Also, the organizations in which the practitioners work must be supportive of EBDM.

Prior research has shown that organizations supportive of EBDM, for example, by dedicating financial resources for EBDM, demonstrate higher rates of EBI implementation and have higher agency performance (ie, the ability to carry out the 10 essential public health services, measured with setting-specific assessment instruments).19,20 Modifying organizational processes and capacity-building training efforts has the potential to promote uptake of EBDM and delivery of EBIs and improve agency performance.21–25 High-quality measures, such as those that are developed according to a theoretical model and empirically tested, are essential to understanding factors related to EBDM and efforts to implement EBDM in public health settings.26,27 Previously developed measures have focused on individual skills related to EBDM28 and organizational supports for EBDM within state health departments (SHDs).29 No measures of organizational supports have been rigorously validated for use in LHDs; 1 existing measure for use in LHDs has only limited reliability evidence.30 The nature of EBDM is likely to be different at the local versus the state level, thus organizational supports for EBDM may operate differently at LHDs.31–33 For example, the way that partnerships influence EBDM may be different at the local versus the state level, since LHD practitioners may work with partners in a more ongoing, collaborative manner than SHD practitioners, who direct funding to partners for evidence-based public health efforts. In addition, differences noted in the educational background of LHD practitioners, with fewer trained in public health compared with those in SHDs,32,33 may necessitate different organizational supports for EBDM.
As such, the purpose of this study was to identify important organizational supports for EBDM based on a theoretically driven framework within LHDs and evaluate the psychometric properties of a measure of these organizational supports for EBDM, including relationships between organizational supports and delivery of EBIs, that can be used by LHDs across the United States. Public health practitioners and researchers could use this measure to guide the development and evaluation of efforts to increase individual and organizational capacity for EBDM within LHDs.

Methods

This cross-sectional study used data from an online survey completed by LHD practitioners in the United States. The survey was part of a larger study to improve evidence-based diabetes management and chronic disease prevention and control within LHDs.34 The study was reviewed and approved by the Institutional Review Board (IRB no. 201705026) of Washington University in St Louis.

Participant recruitment

Eligible LHDs were those that reported implementing either diabetes or body mass index screening or population-based nutrition or physical activity efforts in the 2016 National Association of County & City Health Officials (NACCHO) National Profile. Of those 1677 LHDs, 200 LHDs were randomly sampled from each of 3 jurisdiction population size categories (small: <50 000, medium: 50 000-199 999, and large: ≥200 000). A stratified sampling frame was used to ensure adequate representation of medium and large LHDs, which make up about 27% and 16%, respectively, of all LHDs in the 2016 NACCHO National Profile. The lead practitioner working in chronic disease control at the LHD was invited to participate. After excluding nonvalid e-mail addresses, the final recruitment sample comprised 579 LHDs.
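The stratified draw described above (a simple random sample of 200 LHDs from each of the 3 jurisdiction-size strata) can be sketched as follows; the record structure, eligibility pool, and seed are illustrative placeholders, not the study's actual NACCHO extract.

```python
import random

def stratified_sample(lhds, strata, per_stratum=200, seed=2016):
    """Draw a fixed-size simple random sample from each stratum.

    `lhds` is a list of dicts with a 'population' key; `strata` maps a
    stratum name to a predicate over an LHD record. If a stratum has
    fewer members than `per_stratum`, all of its members are taken.
    """
    rng = random.Random(seed)
    sample = []
    for name, in_stratum in strata.items():
        members = [lhd for lhd in lhds if in_stratum(lhd)]
        k = min(per_stratum, len(members))
        sample.extend(rng.sample(members, k))
    return sample

# Jurisdiction-size strata as defined in the study.
strata = {
    "small": lambda d: d["population"] < 50_000,
    "medium": lambda d: 50_000 <= d["population"] < 200_000,
    "large": lambda d: d["population"] >= 200_000,
}

# Illustrative eligible pool (the study's pool had 1677 LHDs).
pool = [{"id": i, "population": random.Random(i).randrange(1_000, 2_000_000)}
        for i in range(1677)]
chosen = stratified_sample(pool, strata)
```

Because the strata are disjoint, each LHD can be drawn at most once, and no stratum contributes more than 200 records.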

Data collection

Data were collected with Qualtrics online survey software. Preinvitation e-mails were sent to the participants to inform them of the study purpose, and invitation e-mails with the study information and survey link were sent 1 week later. Those who had not completed the survey received up to 3 reminder e-mails and 2 phone calls over a 6-week period to encourage participation. The 376 respondents (65% of the invited sample) were offered a $20 Amazon.com gift card for completing the survey.

Measures

Survey development was guided by a theoretical understanding of public health departments and built on prior studies. These studies reviewed administrative evidence-based practices in SHDs and LHDs,19 assessed barriers to EBDM25 and stages of organizational readiness for implementing EBIs in community chronic disease prevention settings,35 and developed measures of administrative evidence-based practices in LHDs30 and organizational supports for EBDM in SHDs.29 Most survey items were taken from prior surveys developed and used by the project team;19,25,29,30,35 other items were taken from instruments identified by the project team through snowball sampling.36,37 The survey development process has been detailed elsewhere,34 and the full text of the survey items and response options used in this analysis is available in Supplemental Digital Content Appendix 1, available at http://links.lww.com/JPHMP/A559. Broadly, questions on the survey assessed use of EBIs, skills related to EBDM, and organizational supports for EBDM within LHDs. In addition to 3 rounds of input from researchers and practitioners, cognitive response testing interviews with 10 practitioners similar to those in the target audience and an assessment of test-retest reliability were conducted. Items assessing organizational support factors related to EBDM were grouped into 6 categories on the survey, as shown in Supplemental Digital Content Appendix 1: awareness of EBDM (4 items), use of EBDM (7 items), resources available for maintaining EBDM (3 items), EBDM climate cultivation (4 items), evaluation capacity (5 items), and partnerships to support EBDM (4 items). The respondents were asked to indicate how much they agreed with each item on a 7-point Likert scale (1 = strongly disagree to 7 = strongly agree).
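As a rough illustration of how responses on the 7-point scales can be summarized per category, the sketch below computes a simple mean score for each group of items. Note that the study itself derived standardized factor scores from the CFA in MPlus rather than raw item means; the item and factor names here are placeholders.

```python
def scale_scores(responses, factors):
    """Mean of 7-point Likert items per factor.

    `responses` maps an item name to its 1..7 rating;
    `factors` maps a factor name to the list of its item names.
    """
    return {factor: sum(responses[item] for item in items) / len(items)
            for factor, items in factors.items()}

# Hypothetical 3-item factor and one respondent's ratings.
factors = {"awareness": ["item1", "item2", "item3"]}
responses = {"item1": 6, "item2": 5, "item3": 7}
scores = scale_scores(responses, factors)  # {"awareness": 6.0}
```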
The respondents reported characteristics about their LHD (eg, jurisdiction population size, current status in Public Health Accreditation Board [PHAB] accreditation efforts) and themselves (eg, age group, years in current position, Table 1). To quantify the number of EBIs offered by the LHD, the respondents were shown a list of 4 EBIs in 1 of 5 categories depending on the program area in which they worked (ie, diabetes, nutrition, physical activity, obesity, tobacco). Evidence-based interventions were taken from those identified in The Community Guide8 and What Works for Health38 (eg, Diabetes Prevention Program; worksite programs, policies, or environmental changes to support nutrition/healthy food and physical activity; and reminders for clinic health care providers to discuss tobacco/nicotine cessation with clients). During cognitive response testing, listed EBIs were reviewed by LHD practitioners to confirm that they were the most relevant set of EBIs for each program area. The respondents who reported working in a single program area were given interventions for that program area. Those who reported working in multiple program areas received the diabetes interventions if diabetes was selected as one of their program areas. If a respondent worked in more than one of these areas outside of diabetes, they received a randomly assigned set of interventions for one of their program areas.
TABLE 1

Sample Characteristics (N = 376)

                                                          n (%)
Respondent characteristics
 Age group, y
  20-29                                                   14 (4)
  30-39                                                   86 (23)
  40-49                                                  111 (30)
  50-59                                                  107 (28)
  60+                                                     57 (15)
 Race/ethnicitya
  White                                                  315 (84)
  Black/African American                                  27 (7)
  Other races                                             28 (7)
  Hispanic or Latino                                       8 (2)
 Gender
  Male                                                    60 (16)
  Female                                                 312 (83)
 Master's degree or higher in any field
  No                                                     155 (42)
  Yes                                                    216 (58)
 Public health master's or doctorate
  No                                                     253 (68)
  Yes                                                    118 (32)
 Position
  Top executive, health director/officer/commissioner     97 (26)
  Administrator, deputy or assistant director             77 (20)
  Manager of a division or program                       138 (37)
  Program coordinator                                     33 (9)
  Technical expert position (evaluator, epidemiologist,
   health educator)/other                                 30 (8)
 Years in current position
  <5                                                     202 (54)
  5-9                                                     87 (23)
  10-19                                                   60 (16)
  20+                                                     25 (7)
 Years in public health
  <5                                                      41 (11)
  5-9                                                     66 (18)
  10-19                                                  118 (32)
  20+                                                    149 (40)
Local health department characteristics
 LHD jurisdiction population category
  Small (<50 000)                                        119 (32)
  Medium (50 000-199 999)                                128 (34)
  Large (200 000+)                                       128 (34)
 PHAB accredited or preparing to apply
  Currently accredited                                   113 (30)
  Recently applied but not yet accredited                 42 (11)
  Yes, but have not yet applied                           84 (22)
  No/unsure                                              136 (36)
 Currently participate in academic partnerships
  Yes                                                    272 (73)
  No/unsure                                               99 (27)

Abbreviations: LHD, local health department; PHAB, Public Health Accreditation Board.

aRespondents were allowed to select all races/ethnicities with which they identified.


Statistical analysis

Evidence-based decision making item means, standard deviations, and interitem correlations were calculated. A confirmatory factor analysis (CFA) was conducted to confirm the validity of the 6 factors and identify the most parsimonious (ie, simplest) and theoretically sound model. The analytic process was guided by Schumacker and Lomax39 and was performed in MPlus version 8.40 The base model was specified with 6 factors and all 27 items included in the survey, using a robust weighted least squares estimator. Items considered for removal were those that cross-loaded onto other factors on the basis of modification indices, for example, when the highest modification indices included 1 item and factors in which the item was not originally placed. In addition, items that were highly correlated with another item (>0.7) were considered for removal; in this case, the item with the stronger factor loading was retained. Covariance terms were added on the basis of the modification indices given by MPlus. Several fit indices were used to evaluate model fit: the χ2/df, comparative fit index (CFI), Tucker-Lewis index (TLI), and root-mean-square error of approximation (RMSEA) with its 90% confidence interval. CFI values of 0.90 and 0.95 or greater indicate adequate and good fit, respectively, and RMSEA values less than 0.08 and 0.06 indicate adequate and good model fit.41 Correlations between factors were also examined; factors with correlation coefficients of 0.85 or greater were deemed strongly related.42 Once a final factor structure was identified, standardized factor scores were obtained from MPlus. To examine construct validity of the factor structure, logistic regression models were fit in SAS version 9.4 (SAS Institute, Cary, North Carolina) to quantify the associations between continuous EBDM factor scores (independent variables) and delivery of EBIs.
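The RMSEA values reported in the Results can be approximated from the χ2 and degrees of freedom with the standard point-estimate formula, RMSEA = sqrt(max(χ2 − df, 0)/(df(N − 1))). This is a hedged back-of-the-envelope check: the robust weighted least squares estimator used in MPlus applies corrections, so the published values differ slightly from this plain calculation.

```python
import math

def rmsea(chi2, df, n):
    """Point estimate of the root-mean-square error of approximation."""
    return math.sqrt(max(chi2 - df, 0) / (df * (n - 1)))

# Final model reported in the Results: chi2 = 569, df = 192, N = 376.
approx = rmsea(569, 192, 376)  # ~0.072, vs the reported 0.073 under WLSMV
```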
The dependent variable, the number of EBIs delivered of the 4 presented to a respondent, was categorized into 2 levels: 0 to 2 (referent) versus 3 to 4. Odds ratios and 95% confidence intervals were calculated. Several characteristics were identified as potential confounders: jurisdiction population size, PHAB accreditation status, presence of an academic health department partnership, and the respondent's experience in public health. None of these covariates changed the point estimates of the association between the factor scores and EBI delivery or was associated with EBI delivery except PHAB accreditation; thus, the models presented are adjusted for PHAB accreditation status.
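The odds ratios and confidence intervals from these models follow the usual back-transformation of logistic regression coefficients: OR = exp(b) and 95% CI = exp(b ± 1.96 × SE). A minimal sketch, using a coefficient and a standard error back-solved from the factor 1 estimates in Table 5 purely for illustration:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Odds ratio and Wald 95% CI from a logit coefficient and its SE."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Illustrative inputs roughly reproducing factor 1 in Table 5
# (OR 1.52, 95% CI 1.11-2.09): beta = ln(1.52), se back-solved.
or_, lo, hi = odds_ratio_ci(math.log(1.52), 0.1614)
```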

Results

The majority of the 376 LHD practitioners were between 40 and 59 years of age (58%), were female (83%), and had been in public health for 10 or more years (72%, Table 1). While most practitioners held a master's degree or higher (58%), only one-third (32%) of all participants held a public health master's or doctoral degree. Most LHDs reported participation in an academic health department partnership (73%), and nearly one-third (30%) were accredited by the PHAB. Comparing the LHDs of responders (n = 376) with those of nonresponders (n = 206), jurisdiction population sizes and the proportions in rural jurisdictions were similar. A higher proportion of respondents were from LHDs that were PHAB accredited, were locally governed, had a local board of health, and used the Community Guide in some areas or consistently across program areas (data not shown).

A series of structural equation models were fit to conduct the CFA of 6 factors according to the categories of items on the survey (Table 2). The base model had poor fit according to all indices (χ2 = 1355, RMSEA = 0.096, CFI = 0.921, Table 3). Based on suggested modifications provided by MPlus (ie, items with high modification indices), 5 subsequent modification models were fit. In these models, individual items were removed because of cross-loading onto multiple factors, or covariance terms were added between individual items that were related. Details of individual modifications are provided in Supplemental Digital Content Appendix 2, available at http://links.lww.com/JPHMP/A560. The final measure had good fit (χ2 = 569, RMSEA = 0.073, CFI = 0.965) and comprised 6 scales with a total of 22 items: awareness of EBDM (3 items), capacity for EBDM (7 items), resource availability (3 items), evaluation capacity (3 items), EBDM climate cultivation (3 items), and partnerships to support EBDM (3 items).
TABLE 2

Factor Descriptions and Items

Factor 1: Awareness of culture supportive of EBDM
 Item 1: I am provided the time to identify evidence-based programs and practices.
 Item 2: My direct supervisor recognizes the value of management practices that facilitate EBDM.
 Item 3: My work group/division offers employees opportunities to attend evidence-based decision-making trainings.
 Item 4: Top leadership in my agency (eg, director, assistant directors) recognizes the value of evidence-based decision making.a
Factor 2: Capacity and expectations for EBDM
 Item 5: I use EBDM in my work.
 Item 6: My direct supervisor expects me to use EBDM.
 Item 7: My performance is partially evaluated on how well I use EBDM in my work.
 Item 8: My work group/division currently has the resources (eg, staff, facilities, partners) to support application of EBDM.
 Item 9: The staff in my work group/division has the necessary skills to carry out EBDM.
 Item 10: The majority of my work group/division's external partners support use of EBDM.
 Item 11: Top leadership in my agency encourages use of EBDM.
Factor 3: Resource availability
 Item 12: Informational resources (eg, academic journals, guidelines, and tool kits) are available to my work group/division to promote the use of EBDM.
 Item 13: My work group/division engages a diverse external network of partners that share resources to facilitate EBDM.
 Item 14: Stable funding is available for EBDM.
Factor 4: Evaluation capacity
 Item 15: My work group/division supports community needs assessments to ensure that evidence-based decision-making approaches continue to meet community needs.a
 Item 16: My work group/division plans for evaluation of interventions prior to implementation.
 Item 17: My work group/division uses evaluation data to monitor and improve interventions.
 Item 18: My work group/division distributes intervention evaluation findings to other organizations that can use our findings.
Factor 5: EBDM climate cultivation
 Item 19: My work group/division has access to evidence-based decision making information that is relevant to community needs.a
 Item 20: When decisions are made within my work group/division, program staff members are asked for input.a
 Item 21: Information is widely shared in my work group/division so that everyone who makes decisions has access to all available knowledge.
 Item 22: My agency is committed to hiring people with relevant training or experience in public health core disciplines (eg, epidemiology, health education, environmental health).
 Item 23: My agency has a culture that supports the processes necessary for EBDM.
Factor 6: Partnerships to Support EBDM
 Item 24: Our collaborative partnerships have missions that align with my agency.a
 Item 25: It is important to my agency to have partners who share resources (money, staff time, space, materials).
 Item 26: It is important to my agency to have partners in health care to address population health issues.
 Item 27: It is important to my agency to have partners in other sectors (outside of health) to address population health issues.

Abbreviation: EBDM, evidence-based decision making.

aItems 4, 15, 19, 20, and 24 were removed from the final model solution.

TABLE 3

Goodness of Fit Indices and Modificationsa

Model           χ2     df   P       RMSEA (90% CI)        CFI    TLI    Modification Note
Base model      1355   309  <.001   0.096 (0.091-0.101)   0.921  0.910
Modification 1  950    260  <.001   0.085 (0.079-0.091)   0.945  0.937  2 items dropped (20 and 24)
Modification 2  810    237  <.001   0.081 (0.075-0.087)   0.950  0.942  1 item dropped (4)
Modification 3  713    215  <.0001  0.079 (0.073-0.086)   0.954  0.946  1 item dropped (19)
Modification 4  637    213  <.001   0.074 (0.067-0.080)   0.961  0.954  2 covariance terms added (items 2 and 6; items 8 and 14)
Modification 5  569    192  <.001   0.073 (0.066-0.080)   0.965  0.958  1 item dropped (15)

Abbreviations: CI, confidence interval; CFI, comparative fit index; df, degrees of freedom; RMSEA, root-mean-square error of approximation; TLI, Tucker-Lewis index.

aGoodness of fit and comparative fit indices are shown for the base model and modifications made to the item structure within the confirmatory factor analysis. Good model fit is indicated by RMSEA <0.06 and CFI and TLI >0.95; adequate fit by RMSEA <0.08 and CFI and TLI >0.90.42

Factor loadings and cross-factor correlations for the final 6-factor model solution are presented in Table 4. Most items (20 of 22) had high factor loadings of greater than 0.7, with the factor loadings for the remaining 2 items greater than 0.6. This indicates that items fit well on their respective scales; low factor loadings would suggest that an item is out of place on a given factor. Two factors, awareness of EBDM (factor 1) and capacity for EBDM (factor 2), had a markedly higher correlation (r = 0.91) than the other pairs of factors. The lowest cross-factor correlation (r = 0.40) was noted between resource availability (factor 3) and partnerships to support EBDM (factor 6). All other correlations ranged from 0.46 to 0.77.
TABLE 4

Final Model Item–Specific Factor Loadings and Cross-Factor Correlations

Factor/Item                                     Loading   Cross-Factor Correlations
Awareness of culture supportive of EBDM (F1)              F2: 0.91, F3: 0.76, F4: 0.68, F5: 0.77, F6: 0.49
 Item 1                                         0.73
 Item 2                                         0.75
 Item 3                                         0.75
Capacity and expectations for EBDM (F2)                   F3: 0.77, F4: 0.73, F5: 0.72, F6: 0.47
 Item 5                                         0.78
 Item 6                                         0.80
 Item 7                                         0.74
 Item 8                                         0.80
 Item 9                                         0.79
 Item 10                                        0.71
 Item 11                                        0.84
Resource availability (F3)                                F4: 0.65, F5: 0.64, F6: 0.40
 Item 12                                        0.73
 Item 13                                        0.87
 Item 14                                        0.63
Evaluation capacity (F4)                                  F5: 0.70, F6: 0.46
 Item 16                                        0.87
 Item 17                                        0.92
 Item 18                                        0.80
EBDM climate cultivation (F5)                             F6: 0.57
 Item 21                                        0.67
 Item 22                                        0.74
 Item 23                                        0.92
Partnerships to support EBDM (F6)
 Item 25                                        0.75
 Item 26                                        0.90
 Item 27                                        0.94

Abbreviation: EBDM, evidence-based decision making.

Logistic regression models showed positive relationships between the 6 factors and the number of EBIs delivered (Table 5). Overall, these relationships were similar in strength across the 6 factors (odds ratios ranged from 1.31 to 1.52). The strongest relationship was found for resource availability (factor 3) and number of EBIs delivered, while the weakest relationship occurred between partnerships to support EBDM (factor 6) and number of EBIs delivered, which did not reach statistical significance.
TABLE 5

Relationships Between EBDM Factors and Delivery of Evidence-Based Interventions

                                                 Number of EBIs Delivered (0-2 vs 3-4)a
EBDM Factor                                      OR     95% CI       Pb
Awareness of culture supportive of EBDM (F1)     1.52   1.11-2.09    .01
Capacity and expectations for EBDM (F2)          1.45   1.08-1.95    .01
Resource availability (F3)                       1.48   1.07-2.06    .02
Evaluation capacity (F4)                         1.52   1.16-1.99    .002
EBDM climate cultivation (F5)                    1.51   1.07-2.14    .02
Partnerships to support EBDM (F6)                1.31   0.94-1.83    .12

Abbreviations: CI, confidence interval; EBDM, evidence-based decision making; EBIs, evidence-based interventions; OR, odds ratio.

aA summary-dependent variable was created as the number of evidence-based interventions (EBIs) that each local health department (LHD) practitioner reported being delivered by their LHD out of the 4 EBIs presented within the survey (range: 0-4). The variable was categorized into 2 levels: 0 to 2 (n = 154) and 3 to 4 (n = 222). Logistic regression models were fit to examine the relationship between EBDM factors and number of EBIs delivered. The 0 to 2 EBI category was the referent. Models were adjusted for Public Health Accreditation Board accreditation status (binary).

bP < .001.


Discussion

The purpose of this study was to develop a measure of organizational support for EBDM and to assess the psychometric properties of the measure. Results from the CFA show that the 6-factor model had good fit and that there is strong evidence of construct validity based on the relationships between the factors and delivery of EBIs. This measure can be used by public health practitioners and researchers while planning for, implementing, and evaluating efforts to increase individual and organizational capacity for EBDM within LHDs. For example, if an LHD completed the survey and scored lower on evaluation capacity than on other factors, it could seek opportunities for quality improvement focused on aspects of evaluation capacity (eg, planning for evaluation before implementing an EBI). Using the instrument to evaluate changes in organizational capacity would show whether those efforts were successful in improving the LHD's evaluation capacity. This study extends previous work by Stamatakis and colleagues29 to understand factors related to EBDM among SHD practitioners. There are notable differences between SHD and LHD structures and the practitioners within each setting that may need to be accounted for differently in measures.
For example, LHD practitioners have backgrounds that are more heterogeneous and are less likely to have formal public health training.32,33 Also, public health governance structures and the relationship between state health and regional or local health departments differ widely across states, which could influence how much autonomy LHDs have to modify their organization's EBDM supports or perhaps the level of support LHDs have been provided by the state to engage in EBDM.31 These differences could impact the way that EBDM operates within an LHD, what organizational supports are needed, how these LHD practitioners support and ensure fidelity of interventions implemented by community lay workers or contract with agencies to do so, and ultimately how EBIs are implemented. The need for a specific LHD measure is also highlighted by differences in the structure of the state versus local assessment. For example, the same leadership item grouped with the leadership support and commitment factor in the SHD survey but with items related to capacity for EBDM in the LHD sample. This study also builds on work by Reis and colleagues30 to develop a measure for LHDs, which was tested using a smaller sample (n = 90) to establish initial internal consistency (ie, Cronbach α) and test-retest reliability evidence. Building upon these 2 studies, this study was designed to understand the supports for EBDM in the specific context of LHDs using a larger sample and rigorous evaluation methods (eg, CFA) to develop a survey that incorporates our most up-to-date understanding of EBDM and establish construct validity of the survey (ie, relationships between factors and EBI delivery). The organizational supports for EBDM identified in this study are in line with other factors identified in prior literature.
Items in our factors related to evaluation capacity, access to evidence, resource availability, and organizational culture align with important characteristics of organizations identified by studies led by Allen et al and Kramer et al.43–45 In addition, Hu and colleagues46 reported increases in the likelihood of using research evidence with more favorable profiles of organizational supports in a longitudinal study of SHDs, with a particular emphasis on the impact of leadership support. Peirson and colleagues47 found that characteristics of leaders and access to and resources for using evidence are important in building capacity for evidence-informed decision making in Canadian public health units. Evidence-informed decision making is a term used in Canada and Australia to describe a process similar to EBDM while highlighting that public health decisions are based on evidence and real-world context (eg, organizational and political factors).7 Dobbins and colleagues48 demonstrated that an organizational culture supportive of evidence-informed decision making modifies the response of public health agencies to knowledge translation and exchange interventions. Capacity-building efforts should consider these differences and possibly tailor strategies on the basis of an agency's ability to support EBDM.

Several limitations should be considered in light of the findings of this study. Survey items were part of a self-report survey of LHD practitioners, which may not fully reflect the organizational attributes of an LHD. Response bias may influence the generalizability of our findings, as a higher proportion of PHAB-accredited LHDs were present in our sample compared with nonrespondents. In addition, our sampling methods may limit how generalizable the sample is to all LHDs in the 2016 NACCHO profile from which our sample was drawn.
A lower proportion of LHDs in our sample were from a rural jurisdiction or had a state-governed structure, and a higher proportion were PHAB accredited and locally governed, compared with other LHDs in the NACCHO profile (data not shown). The underrepresentation of rural LHDs likely resulted from our strategy of sampling equal numbers of small, medium, and large LHDs, thereby oversampling larger LHDs. Evidence-based decision making may operate differently in the LHDs in our sample than among nonrespondents and other LHDs around the United States. In addition, the high correlation between the awareness and capacity factors (r = 0.91) suggests that these may represent the same latent factor. While our results suggest a relationship between organizational supports for EBDM and delivery of EBIs, future studies should assess the construct validity of these factors by investigating relationships with other types of EBIs (eg, colorectal cancer screening) or whether changes in organizational supports can improve LHD performance and EBI delivery.

Despite these limitations, our study is strengthened by the theoretical development and empirical testing of the instrument, which allowed us to build upon prior research and knowledge of important organizational supports for EBDM. In addition, LHD practitioners in our sample represent a variety of LHDs across the country (ie, sampled from across the United States and from different jurisdiction sizes). The factors identified are potentially modifiable and could be incorporated into public health and research efforts to improve EBDM within LHDs. Currently, few strategies with demonstrated effectiveness exist for modifying organizational supports for EBDM. Brownson et al24 used EBDM training and supplemental technical assistance to improve EBDM within SHDs and found improvements on only 1 of 5 organizational factors (ie, access to evidence and skilled staff).
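The flagged overlap between the awareness and capacity factors can be checked directly by correlating factor scores; when two factors' scores correlate near r = 0.9, collapsing them into a single construct is worth testing in a follow-up CFA. A minimal sketch with simulated scores (the variable names, noise level, and reuse of N = 376 are illustrative assumptions, not study data):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 376  # matches the study's sample size, for illustration only

# Simulate two strongly overlapping factor-score columns
awareness = rng.normal(size=n)
capacity = 0.9 * awareness + 0.42 * rng.normal(size=n)  # population r ≈ 0.91

r = np.corrcoef(awareness, capacity)[0, 1]
print(f"r = {r:.2f}")
```

With n = 376, the sample correlation lands very close to its population value, so a printed r near 0.9 is expected under this simulation.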
Changing organizational-level factors is made more challenging by staff turnover, competing priorities, and a lack of incentives to institute changes.49 While it will require a significant, long-term commitment from LHD leaders,50 building organizational capacity for EBDM is crucial for health departments to fulfill their role in population-level chronic disease control. Future work should investigate what is needed to make meaningful changes to an organization's ability to support EBDM.

This study adds to the growing body of literature on measuring and promoting EBDM within public health settings so that evidence-based programs and policies can be translated into practice most efficiently and effectively. Measures with sound psychometric properties are critical for understanding how public health departments support EBDM, evaluating interventions aimed at improving the capacity of LHDs to support EBDM, and guiding the development of evidence-based policies to support EBDM within LHDs. These efforts can enhance the translation of research into public health practice, the overall performance of LHDs, and ultimately the health of the populations they serve.
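The construct-validity step reported in this study, relating factor scores to EBI delivery via logistic regression, can be sketched end to end. This is a simulation under an assumed effect size, not the study's data or model, and the tiny gradient-ascent fitter stands in for a standard statistics package:

```python
import numpy as np

def fit_logistic(x, y, iters=200, lr=0.1):
    """One-predictor logistic regression fit by gradient ascent on the mean log-likelihood."""
    b0, b1 = 0.0, 0.0
    for _ in range(iters):
        p = 1 / (1 + np.exp(-(b0 + b1 * x)))
        b0 += lr * np.mean(y - p)        # gradient wrt intercept
        b1 += lr * np.mean((y - p) * x)  # gradient wrt slope
    return b0, b1

rng = np.random.default_rng(1)
n = 376
factor = rng.normal(size=n)                # hypothetical EBDM support factor score
p_true = 1 / (1 + np.exp(-0.8 * factor))   # outcome probability rises with the factor
delivered = rng.binomial(1, p_true)        # binary "EBI delivered" indicator

b0, b1 = fit_logistic(factor, delivered)
print(f"odds ratio per 1-unit increase in factor score: {np.exp(b1):.2f}")
```

An odds ratio above 1 per unit increase in the factor score indicates a positive association, mirroring the direction of the factor-EBI relationships reported in the study.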
References (29 in total; first 10 shown)

1.  Barriers to evidence-based decision making in public health: a national survey of chronic disease practitioners.

Authors:  Julie A Jacobs; Elizabeth A Dodson; Elizabeth A Baker; Anjali D Deshpande; Ross C Brownson
Journal:  Public Health Rep       Date:  2010 Sep-Oct       Impact factor: 2.792

Review 2.  Evidence-based public health: a fundamental concept for public health practice.

Authors:  Ross C Brownson; Jonathan E Fielding; Christopher M Maylahn
Journal:  Annu Rev Public Health       Date:  2009       Impact factor: 21.981

3.  Examining the role of training in evidence-based public health: a qualitative study.

Authors:  Elizabeth A Baker; Ross C Brownson; Mariah Dreisinger; Leslie D McIntosh; Ajlina Karamehic-Muratovic
Journal:  Health Promot Pract       Date:  2009-07

4.  Characteristics of Academic Health Departments: Initial Findings From a Cross-Sectional Survey.

Authors:  Paul Campbell Erwin; Patrick Barlow; Ross C Brownson; Kathleen Amos; C William Keck
Journal:  J Public Health Manag Pract       Date:  2016 Mar-Apr

5.  Evidence-based public health practice among program managers in local public health departments.

Authors:  Paul Campbell Erwin; Jenine K Harris; Carson Smith; Carolyn J Leep; Kathleen Duggan; Ross C Brownson
Journal:  J Public Health Manag Pract       Date:  2014 Sep-Oct

6.  Use of evidence-based interventions in state health departments: a qualitative assessment of barriers and solutions.

Authors:  Elizabeth A Dodson; Elizabeth A Baker; Ross C Brownson
Journal:  J Public Health Manag Pract       Date:  2010 Nov-Dec

7.  Results from a psychometric assessment of a new tool for measuring evidence-based decision making in public health organizations.

Authors:  Katherine A Stamatakis; Adriano Akira Ferreira Hino; Peg Allen; Amy McQueen; Rebekah R Jacob; Elizabeth A Baker; Ross C Brownson
Journal:  Eval Program Plann       Date:  2016-08-12

8.  Improving public health system performance through multiorganizational partnerships.

Authors:  Glen P Mays; F Douglas Scutchfield
Journal:  Prev Chronic Dis       Date:  2010-10-15       Impact factor: 2.830

9.  Measurement properties of a novel survey to assess stages of organizational readiness for evidence-based interventions in community chronic disease prevention settings.

Authors:  Katherine A Stamatakis; Amy McQueen; Carl Filler; Elizabeth Boland; Mariah Dreisinger; Ross C Brownson; Douglas A Luke
Journal:  Implement Sci       Date:  2012-07-16       Impact factor: 7.327

10.  Promoting state health department evidence-based cancer and chronic disease prevention: a multi-phase dissemination study with a cluster randomized trial component.

Authors:  Peg Allen; Sonia Sequeira; Rebekah R Jacob; Adriano Akira Ferreira Hino; Katherine A Stamatakis; Jenine K Harris; Lindsay Elliott; Jon F Kerner; Ellen Jones; Maureen Dobbins; Elizabeth A Baker; Ross C Brownson
Journal:  Implement Sci       Date:  2013-12-13       Impact factor: 7.327

Similar articles (10 in total)

1.  How to "Start Small and Just Keep Moving Forward": Mixed Methods Results From a Stepped-Wedge Trial to Support Evidence-Based Processes in Local Health Departments.

Authors:  Rebekah R Jacob; Renee G Parks; Peg Allen; Stephanie Mazzucca; Yan Yan; Sarah Kang; Debra Dekker; Ross C Brownson
Journal:  Front Public Health       Date:  2022-04-28

2.  Perspectives on program mis-implementation among U.S. local public health departments.

Authors:  Peg Allen; Rebekah R Jacob; Renee G Parks; Stephanie Mazzucca; Hengrui Hu; Mackenzie Robinson; Maureen Dobbins; Debra Dekker; Margaret Padek; Ross C Brownson
Journal:  BMC Health Serv Res       Date:  2020-03-30       Impact factor: 2.655

3.  Local Health Department Accreditation Is Associated With Organizational Supports for Evidence-Based Decision Making.

Authors:  Peg Allen; Stephanie Mazzucca; Renee G Parks; Mackenzie Robinson; Rachel G Tabak; Ross Brownson
Journal:  Front Public Health       Date:  2019-12-17

4.  Force field analysis of driving and restraining factors affecting the evidence-based decision-making in health systems; comparing two approaches.

Authors:  Tahereh Shafaghat; Mohammad Kazem Rahimi Zarchi; Mohammad Hasan Imani Nasab; Zahra Kavosi; Mahammad Amin Bahrami; Peivand Bastani
Journal:  J Educ Health Promot       Date:  2021-11-30

5.  Impact of Contextual Factors on the Attendance and Role in the Evidence-Based Chronic Disease Prevention Programs Among Primary Care Practitioners in Shanghai, China.

Authors:  Xin Liu; Xin Gong; Xiang Gao; Zhaoxin Wang; Sheng Lu; Chen Chen; Hua Jin; Ning Chen; Yan Yang; Meiyu Cai; Jianwei Shi
Journal:  Front Public Health       Date:  2022-02-02

6.  Diabetes Prevention and Care Capacity at Urban Indian Health Organizations.

Authors:  Meredith P Fort; Margaret Reid; Jenn Russell; Cornelia J Santos; Ursula Running Bear; Rene L Begay; Savannah L Smith; Elaine H Morrato; Spero M Manson
Journal:  Front Public Health       Date:  2021-11-26

Review 7.  Developing criteria for research translation decision-making in community settings: a systematic review and thematic analysis informed by the Knowledge to Action Framework and community input.

Authors:  Marilyn E Wende; Sara Wilcox; Zoe Rhodes; Deborah Kinnard; Gabrielle Turner-McGrievy; Brooke W McKeever; Andrew T Kaczynski
Journal:  Implement Sci Commun       Date:  2022-07-16

Review 8.  Centering equity and lived experience: implementing a community-based research grant on cannabis and mental health.

Authors:  Pamela Obegu; Julia Armstrong; Mary Bartram
Journal:  Int J Equity Health       Date:  2022-08-20

9.  General practitioners' perceptions of their practice of evidence-based chronic disease prevention interventions: a quantitative study in Shanghai, China.

Authors:  Feng Fan; Zhaoxin Wang; Dehua Yu; Chen Chen; Delei Shen; Zhaohu Yu; Xin Liu; Huining Zhou; Jianwei Shi
Journal:  BMC Fam Pract       Date:  2020-07-22       Impact factor: 2.497

10.  Use and Awareness of The Community Guide in State and Local Health Department Chronic Disease Programs.

Authors:  Emily Rodriguez Weno; Stephanie Mazzucca; Renee G Parks; Margaret Padek; Peg Allen; Ross C Brownson
Journal:  Prev Chronic Dis       Date:  2020-10-22       Impact factor: 2.830
