Goldie MacDonald, Danyael Garcia, Stephanie Zaza, Michael Schooley, Don Compton, Terry Bryant, Lulu Bagnol, Cathy Edgerly, Rick Haverkate.
Abstract
The Steps to a HealthierUS Cooperative Agreement Program (Steps Program) enables funded communities to implement chronic disease prevention and health promotion efforts to reduce the burden of diabetes, obesity, asthma, and related risk factors. At both the national and community levels, investment in surveillance and program evaluation is substantial. Public health practitioners engaged in program evaluation planning often identify desired outcomes, related indicators, and data collection methods but may pay only limited attention to an overarching vision for program evaluation among participating sites. We developed a set of foundational elements to provide a vision of program evaluation that informs the technical decisions made throughout the evaluation process. Given the diversity of activities across the Steps Program and the need for coordination between national- and community-level evaluation efforts, our recommendations to guide program evaluation practice are explicit yet leave room for site-specific context and needs. Staff across the Steps Program must consider these foundational elements to prepare a formal plan for program evaluation. Attention to each element moves the Steps Program closer to well-designed and complementary plans for program evaluation at the national, state, and community levels.
Year: 2005 PMID: 16356372 PMCID: PMC1509370
Source DB: PubMed Journal: Prev Chronic Dis ISSN: 1545-1151 Impact factor: 2.830
Foundational Elements for Program Evaluation Planning, Implementation, and Use of Findings, Steps to a HealthierUS Cooperative Agreement Program
| Foundational element | Description | Key resource |
|---|---|---|
| 1. Distinguish between research and program evaluation. | Understanding the differences between research and program evaluation encourages consideration of more options for the evaluation of public health programs. Research and program evaluation differ along 10 critical dimensions: planning, decision making, standards, questions, design, data collection, analysis and synthesis, judgments, conclusions, and uses. | Guidelines for defining public health research and public health non-research |
| 2. Define program evaluation. | The Steps Program defines program evaluation as "the systematic collection of information about the activities, characteristics, and outcomes of programs to make judgments about the program, improve program effectiveness, and/or inform decisions about future program development." | Mathison |
| 3. Use the framework for program evaluation in public health. | The framework outlines six steps (engage stakeholders; describe the program; focus the evaluation design; gather credible evidence; justify conclusions; ensure use and share lessons learned) and four standards (utility, feasibility, propriety, and accuracy) that apply throughout the evaluation process. | Framework for program evaluation in public health |
| 4. Seek cultural competence in program evaluation planning, implementation, and use of findings. | Program evaluation should be responsive to cultural context, use appropriate frameworks and methodology, and rely on "stakeholder-generated, interpretive means to arrive at the results and further use of findings." | Frierson et al |
| 5. Prepare a program logic model as a platform for evaluation planning, implementation, and use of findings. | A logic model makes visible the underlying theory of the program or intervention and connects resources invested with expected results. A logic model includes inputs, activities, outputs, and outcomes (short-term, intermediate, and long-term). A well-designed logic model guides program planning, evaluation, management, and communications. | McLaughlin and Jordan |
| 6. Identify the purpose of the evaluation. | An explicit statement of the evaluation's purpose focuses and clarifies the planning process. After the purpose of the evaluation is agreed upon, subsequent decisions can be made more easily (e.g., allocation of resources, identification of evaluation questions, selection of data collection methods). | The program evaluation standards |
| 7. Identify intended users and uses of the evaluation. | Identification of intended users and uses is a necessary component of appropriate evaluation design. Users and uses must be prioritized so that resources for specific tasks can be allocated strategically. | Patton |
| 8. Identify key evaluation questions. | Evaluation questions follow from the stated purpose, users, and intended use of findings. Evaluation questions should be made explicit so that the data collected meet the information needs of program stakeholders. | Frechtling and Sharp |
| 9. Attend to process and outcome evaluation. | Program processes are linked to outcomes by the theory of change presented in a program logic model. Outcome measures cannot demonstrate why or how a program works or does not work. Knowing why and how a program brings about desired outcomes is as important as knowing whether a desired outcome occurred. | Health promotion evaluation: recommendations to policy-makers |
| 10. Maximize use of existing surveillance systems for outcome measurement. | Evaluation of public health programs is often more efficient when existing surveillance data are used for outcome measurement. Use of these data enhances consistency in measurement and comparability among participating sites and relevant national estimates. | Indicators for chronic disease surveillance |