Dayna Albert, Rebecca Fortin, Anne Lessio, Christine Herrera, Barbara Riley, Rhona Hanning, Brian Rush.
Abstract
Best practices identified solely on the strength of research evidence may not be entirely relevant or practical for use in community-based public health and the practice of chronic disease prevention. Aiming to bridge the gap between best practices literature and local knowledge and expertise, the Ontario Public Health Association, through the Toward Evidence-Informed Practice initiative, developed a set of resources to strengthen evidence-informed decision making in chronic disease prevention programs. A Program Assessment Tool, described in this article, emphasizes better processes by incorporating review criteria into the program planning and implementation process. In a companion paper, "Strengthening Chronic Disease Prevention Programming: The Toward Evidence-Informed Practice (TEIP) Program Evidence Tool," we describe another tool, which emphasizes better evidence by providing guidelines and worksheets to identify, synthesize, and incorporate evidence from a range of sources (eg, peer-reviewed literature, gray literature, local expertise) to strengthen local programs. The Program Assessment Tool uses 19 criteria derived from literature on best and promising practices to assess and strengthen program planning and implementation. We describe the benefits, strengths, and challenges in implementing the tool in 22 community-based chronic disease prevention projects in Ontario, Canada. The Program Assessment Tool helps put best processes into operation to complement adoption and adaptation of evidence-informed practices for chronic disease prevention.
Year: 2013 PMID: 23721789 PMCID: PMC3675807 DOI: 10.5888/pcd10.120106
Source DB: PubMed Journal: Prev Chronic Dis ISSN: 1545-1151 Impact factor: 2.830
Program Assessment Tool Review Criteria, the Toward Evidence-Informed Practice Initiative, Ontario, Canada
| Review Criterion | Description |
|---|---|
| Program need | |
| 1. Needs assessment | The program responds to demonstrated wants and/or needs of the primary audience. |
| 2. Duplication avoidance/environmental scan | The program fills a unique need in the community/setting that is not met by other programs/services. |
| Program content | |
| 3. Theory and literature evidence | The program is informed by appropriate theoretical concepts (eg, behavior change theory, social learning, risk reduction) and credible, relevant, and realistic sources of evidence (eg, critically appraised academic literature, credible gray literature, expert advice). |
| 4. Program objectives and logic model | The program has appropriate SMART objectives (Specific, Measurable, Appropriate, Realistic, and Timed) as part of its program logic model. |
| 5. Environmental support | The program creates physical and social environments that support healthy behaviors (eg, walking trails, bicycle racks at worksites, healthy food choices in restaurants and vending machines). |
| 6. Policy | The program develops and, as appropriate, implements policy, that is, changes to the formal or informal rules of governing bodies to support healthy behaviors. Policy efforts can be directed at the municipal level (eg, by-laws) and/or the institutional level (eg, school or worksite policy). |
| 7. Sequencing | The program is sequenced appropriately. Sequencing refers to the building of program activities on each other over time, to maximize population impact (eg, awareness, skill building, environmental support, policy development). |
| Program process | |
| 8. Collaboration | The program can be described as collaborative. There is active involvement of local individuals, groups, and intended audiences in program planning and implementation. The right partners are engaged. |
| 9. Mobilization of community resources | The program identifies and uses resources from within the community/setting. |
| 10. Community engagement | The program engages individuals from the community or setting with the objective of consulting, animating, or sensitizing them to the issue (ie, fostering community buy-in). |
| 11. Sustainability | The program, or aspects of the program, can be maintained in the community or setting over time, without dependence on “one-time” or special resources. The Heart Health Resource Centre’s Sustainability Model considers 4 components: issue, program, behavior change, and partnerships. |
| 12. Visibility | The program demonstrates widespread promotion of the program in the community or setting, and/or those delivering the program are highly visible. |
| 13. Opinion leader support | The program demonstrates or has the potential to elicit the active support and buy-in of formal or informal opinion leaders in the community or setting where it is delivered. |
| Program evaluation | |
| 14. Formative evaluation/pilot testing/focus testing | The program uses formative evaluation (eg, focus groups, structured surveys, key informant interviews, pretesting) to assess the relevance, comprehension, and acceptability of activities, materials, and methods for the intended audience or population of interest. |
| 15. Process evaluation | The program uses process evaluation to gather feedback, demonstrating that intended audiences were reached and the program was implemented as planned. |
| 16. Outcome evaluation | The program evaluates outcomes, assessing the extent to which the program met its stated goals and objectives through impact and/or outcome evaluations (eg, changes in behaviors, policies, or supportive environments). |
| 17. Program documentation | The program can be replicated at the same and/or new locations. The presence of program implementation guidelines is a necessary condition for meeting this criterion. |
| 18. Context documentation | The program fully documents its context: community-specific contextual factors have been analyzed and documented (eg, relevant municipal policies, local demographics, regional socioeconomic factors, profile of the intended audience). |
| 19. Cost-benefit | The program weighs the costs against the benefits and concludes that the potential impact (benefits) of the program is worth the estimated costs. |
Program Assessment Tool Guidelines, Worksheets, and Intended Results, the Toward Evidence-Informed Practice Initiative, Ontario, Canada
| Guidelines | Worksheets | Intended Results |
|---|---|---|
| 1. Select program and assign roles | Roles assignment worksheet | • Program needing improvement is identified |
| 2. Gather program information: program informant documents current program activities in 4 categories: program need, program content, program process, and program evaluation | Program information survey | • Complete set of program documentation is available |
| 3. Assess the program: independent reviewers assess the program using information collected in the survey against corresponding “best and promising practices” criteria | Program assessment worksheet | • Initial assessment of program against 19 evidence-informed criteria is completed |
| 4. Achieve consensus: conduct a consensus meeting, whereby reviewers discuss and reach consensus on a rating and suggestions for program enhancement for each criterion | Consensus summary sheet | • Areas for program improvement are agreed upon by all reviewers and understood by program staff |
| 5. Select suggestions to implement: program stakeholders select priority program enhancements and develop a work plan for implementation | Program enhancement work plan | • Feasible plan to implement priority program improvements is developed |
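The rate-then-reconcile workflow in steps 3 through 5 (independent reviewer ratings, a consensus meeting on divergent items, selection of priority enhancements) can be sketched as a minimal data model. This is an illustrative sketch only: the criterion subset, the 1-to-5 rating scale, the divergence threshold, and the function names are assumptions for the example, not part of the published TEIP tool.

```python
# Hypothetical sketch of TEIP steps 3-5: reviewers rate each criterion
# independently; criteria with divergent ratings are flagged for the
# consensus meeting; low consensus ratings become enhancement priorities.

CRITERIA = ["Needs assessment", "Theory and literature evidence", "Process evaluation"]

def flag_for_discussion(ratings, spread=1):
    """Return criteria whose reviewer ratings differ by more than `spread`."""
    flagged = []
    for criterion in CRITERIA:
        scores = [r[criterion] for r in ratings]
        if max(scores) - min(scores) > spread:
            flagged.append(criterion)
    return flagged

def enhancement_priorities(consensus, threshold=3):
    """Return criteria whose consensus rating falls below `threshold`."""
    return [c for c, score in consensus.items() if score < threshold]

# Two independent reviewer assessments (step 3).
reviewer_a = {"Needs assessment": 4, "Theory and literature evidence": 2, "Process evaluation": 3}
reviewer_b = {"Needs assessment": 4, "Theory and literature evidence": 5, "Process evaluation": 2}

# Step 4: flag divergent criteria for the consensus meeting,
# then record the agreed consensus ratings.
to_discuss = flag_for_discussion([reviewer_a, reviewer_b])
consensus = {"Needs assessment": 4, "Theory and literature evidence": 3, "Process evaluation": 2}

# Step 5: low-scoring criteria feed the program enhancement work plan.
priorities = enhancement_priorities(consensus)
```

Here `to_discuss` contains only "Theory and literature evidence" (reviewer ratings 2 vs 5), and `priorities` contains "Process evaluation" (consensus rating 2, below the assumed threshold of 3).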
Figure 1. The worksheet summarizes, for each role needed within an assessment, the person responsible and the person’s contact details.
Figure 4. The consensus summary worksheet summarizes the rankings of individual reviewers and their suggestions for enhancement, according to the 19 review criteria, as exemplified here for criteria 3 and 4.
Figure 5. The program enhancement work plan is used to summarize the actions, resource needs, accountability, and timelines for each priority for program enhancement.