| Literature DB >> 35461229 |
Lisa M H Sanetti1, Alexandra M Pierce2, Lauren Gammie2, Alicia G Dugan3, Jennifer M Cavallari3.
Abstract
BACKGROUND: Teachers have high rates of daily stress, and most available interventions are focused at the teacher level. Yet best practices in Total Worker Health® approaches indicate that organization-level interventions identified using a participatory approach are most effective. We conducted an exploratory scale-out pilot study to examine the adoption of the Healthy Workplace Participatory Program (HWPP), an evidence-based Total Worker Health approach that engages employees (e.g., teachers) and supervisory personnel (e.g., administrators) in the design and implementation of workplace well-being interventions, within two elementary schools.
Keywords: Fidelity; Scale-out; Teacher well-being; Total Worker Health®
Year: 2022 PMID: 35461229 PMCID: PMC9034693 DOI: 10.1186/s12889-022-13241-6
Source DB: PubMed Journal: BMC Public Health ISSN: 1471-2458 Impact factor: 4.135
The Healthy Workplace Participatory Program IDEAS Process
| IDEAS Step Objective | Activities |
|---|---|
| Identify the root causes of a top health and wellness concern by generating a list of factors that contribute to or cause the concern. | • The “All Employee Survey,” which measures physical and psychological health (e.g., sleep, stress), interpersonal relationships (e.g., leadership, affiliation, role conflict), and work-related factors (e.g., autonomy, workload), is completed by all employees to gather data on employee concerns. • Employee input can also be obtained through focus groups using HWPP-specific focus group protocols. • Design Team members (a) review the All Employee Survey results, focus group results, or both; (b) write their top three concerns individually on post-it notes; and (c) place the post-it notes on a flip chart. • The facilitator guides the Design Team through grouping the concerns into related areas. • Once all areas of concern are identified, each Design Team member is given three dot stickers, which they use to vote for their top concerns. Design Team members place each dot sticker on the flip chart near the area(s) of concern they want to vote for as most important. Each member may distribute their dot stickers however they would like (e.g., all three stickers on one concern, one sticker on each of three different concerns). • The Design Team (a) reviews the voting results, (b) identifies the top priority concern, and (c) creates a fishbone diagram to map the root causes of the concern. |
| Using the information from the root cause analysis, identify solutions that result in full or partial resolution of the top concern. | • The Design Team develops an objective that represents a realistic and meaningful improvement (e.g., reduce the number of teachers indicating low levels of personal accomplishment). • The Design Team brainstorms solutions (e.g., increased structured public and private noticing of accomplishments) and specific activities (e.g., daily recording of personal “wins,” staff “shout outs” submitted to the office and announced to the whole school daily) to enact each solution. The facilitator records all solutions and activities. |
| Identify criteria, or key performance indicators, to consider when evaluating any intervention. | • Led by the facilitator, the Design Team determines (a) what the scope of the intervention should be (e.g., how many employees will it reach?), (b) what measurable benefits should result from the intervention, (c) what resources are available for the intervention, and (d) what possible barriers to the intervention exist. Interventions that meet most or all selection criteria are deemed to have the greatest likelihood of success. |
| Group multiple solutions to form interventions, and then rate and prioritize three intervention options that can be considered for adoption by the Steering Committee. | • The Design Team reviews the brainstormed solutions and associated activities and groups them into multi-component interventions. These interventions may be distinct from one another, or the Design Team may choose to present a “basic essentials” intervention, a fully comprehensive intervention, and an intervention that includes the basic essentials and a few additional activities. |
| Rate the three prioritized intervention options using each key performance indicator. | • The facilitator leads the Design Team in analyzing each intervention against the key performance indicators (i.e., scope, benefits, resources, barriers) identified in Step 3. • The Design Team rates each intervention as low, medium, or high on each indicator. |
| The Design Team presents their work to the Steering Committee, which rates and approves the intervention(s) to be implemented. | • The Design Team presents the interventions to the Steering Committee and answers questions about its process and the resulting interventions. • The Steering Committee rates each intervention as low, medium, or high on scope, benefit, resources, and obstacles. The Steering Committee may provide feedback to the Design Team and request further revisions to the interventions, or select the intervention(s) as presented. |
| The Steering Committee develops an implementation plan for the selected intervention(s). | • It is generally recommended that the Steering Committee prioritize the sequence of intervention activities, identify personnel who should be involved, provide training if needed, and develop a communication plan. The IDEAS Tool provides few resources and little guidance for specific actions at this step, as intervention planning and implementation will be unique to each context and guided by implementation science. |
| Collect data on intervention implementation and effectiveness. | • The IDEAS Tool provides few resources and little guidance for specific actions in this step, as evaluation will be unique to each intervention and implementation context. |
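The dot-sticker voting described in the first step above amounts to a simple tally. As a minimal sketch, with hypothetical concern names and votes (not data from the study):

```python
from collections import Counter

# Hypothetical dot-vote data (NOT from the study): each Design Team member
# distributes 3 dot stickers across the grouped areas of concern.
votes = [
    ["workload", "workload", "stress"],   # member 1
    ["stress", "autonomy", "workload"],   # member 2
    ["workload", "workload", "stress"],   # member 3
]

# Tally all stickers and surface the top-priority concern.
tally = Counter(sticker for member in votes for sticker in member)
top_concern, top_votes = tally.most_common(1)[0]
print(top_concern, top_votes)  # workload 5
```

The top-voted concern is then carried into the fishbone (root cause) analysis.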
Alignment of HWPP Components across Sessions for Schools 1 and 2
| School 1 Session | HWPP Componenta | IDEAS Step | School 2 Session | HWPP-E Component for School 2 |
|---|---|---|---|---|
| Sessions 1–8 | Introductions, introductory questions | | Sessions 1–5 | Introductions kept for Session 1; removed for all other sessions |
| Session 1 | Reflection on TWH | Start-up 1 | Session 1 | Removed |
| | Definition of TWH | | | Brief discussion |
| | HWPP roles | | | Brief discussion |
| | HWPP processes and IDEAS tool | | | Brief discussion |
| | Ground Rules | | | Maintained |
| Session 2 | Brainstorming on health, safety, well-being in workplace | Start-up 2 | | Removed |
| | Ideal workplace brainstorming | | | Removed |
| Session 3 | Identify top 3–4 health and safety concerns individually | Start-up 3 | | Kept |
| | Group concerns by theme | Start-up 3 | | Kept |
| | Vote to identify top priorities | Start-up 3 | | Kept |
| Session 4 | Root Cause Analysis | Step 1 | Session 2 | Kept |
| Session 5 | Set Measurable Objective | Step 2 | Session 3 | Kept |
| | Brainstorm Solutions | Step 2 | | |
| Session 6 | Establish Criteria for Evaluating Interventions | Step 3 | Session 4 | Revised to a brief discussion |
| Session 7 | Create 3 intervention options | Step 4 | | Kept, but number of interventions not specified |
| | Apply selection criteria to solution activities | Step 5A | | Revised to describe the scope, benefits, and resources |
| Session 8 | Present to Steering Committee | Step 5B | Session 5 | Revised: presented to Principal; Steering Committee members identified based on intervention options selected |
a Per August 2017 version of HWPP Facilitator Manual
Alignment of Implementation Outcomes and Measures
| Implementation Outcome | Measure(s) |
|---|---|
| Fidelity | Dosage: duration and frequency of meetings; Exposure: extent of meeting attendance of Design Team members; Adherence: % of IDEAS Process Steps 1–5 completed |
| Acceptability | URP-IR Acceptability subscale |
| Understanding | URP-IR Understanding subscale |
| Feasibility | URP-IR Feasibility subscale |
| System Alignment | CPH-NEW Process Evaluation URP-IR System Climate subscale URP-IR System Support subscale |
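As an illustration of how the three fidelity indices above might be operationalized, the following sketch computes dosage, exposure, and adherence from an invented meeting log (all values are hypothetical, not from the study):

```python
# Hypothetical meeting log (NOT data from the study) illustrating how the
# three fidelity indices in the table could be computed.
meetings = [
    {"minutes": 120, "attended": 7, "members": 9},
    {"minutes": 110, "attended": 6, "members": 9},
    {"minutes": 60,  "attended": 8, "members": 9},
]
steps_completed = 4  # IDEAS Process Steps 1-5 finished so far

# Dosage: duration and frequency of meetings.
dosage_minutes = sum(m["minutes"] for m in meetings)
dosage_frequency = len(meetings)

# Exposure: mean % of Design Team members attending each meeting.
exposure_pct = 100 * sum(m["attended"] / m["members"] for m in meetings) / len(meetings)

# Adherence: % of IDEAS Process Steps 1-5 completed.
adherence_pct = 100 * steps_completed / 5

print(dosage_minutes, dosage_frequency, round(exposure_pct, 1), adherence_pct)
```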
Fig. 1 HWPP and HWPP-E Fidelity Data
CPH-NEW Process Evaluation Ratings by School
| Subscalea | School 1 M (SD) | School 2 M (SD) | ANOVA p-value |
|---|---|---|---|
| Organizational Support and Engagement | 13.4 (6.15) | 17.1 (1.68) | 0.18 |
| Design Team Engagement | 17.0 (3.32) | 18.7 (1.11) | 0.25 |
| Program Facilitation Effectiveness | 18.8 (1.79) | 19.1 (0.90) | 0.72 |
a All items of the CPH-NEW Process Evaluation Rating were assessed on a 5-point Likert scale (1 = Strongly Disagree to 5 = Strongly Agree); each subscale includes 4 items for possible score range of 4 to 20
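As the footnote describes, each subscale score is the sum of four 1–5 items, giving a possible range of 4 to 20. A minimal sketch of that scoring, using invented responses (not study data):

```python
from statistics import mean, stdev

# Hypothetical responses (NOT study data) for one 4-item subscale, scored
# 1 = Strongly Disagree ... 5 = Strongly Agree, one row per respondent.
responses = [
    [4, 5, 4, 4],
    [3, 4, 4, 3],
    [5, 5, 4, 5],
]

# A subscale score is the sum of its 4 items, so scores range from 4 to 20.
scores = [sum(r) for r in responses]
print(round(mean(scores), 1), round(stdev(scores), 2))  # 16.7 2.52
```

Table cells such as 13.4 (6.15) report the mean and standard deviation of these 4–20 subscale sums across respondents.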
Usage Rating Profile-Intervention Revised (URP-IR) Ratings by School
| URP-IRa Subscale | School 1 M (SD) | School 2 M (SD) | ANOVA p-value |
|---|---|---|---|
| Acceptability | 4.19 (1.13) | 5.12 (0.72) | 0.02 |
| Understanding | 4.08 (0.79) | 5.67 (0.48) | 0.002 |
| Feasibility | 3.88 (0.99) | 5.33 (0.53) | 0.003 |
| System Climate | 4.15 (1.14) | 4.54 (0.56) | 0.30 |
| System Support | 3.67 (1.15) | 2.57 (0.68) | 0.008 |
a All items of the URP-IR were assessed on a 6-point Likert scale (1 = Strongly Disagree to 6 = Strongly Agree)
Summary of HWPP Adaptations and Modifications for School 2 based on Implementation Outcome Data from School 1
| Who Participated | Goal | Rating Scale Data | Design Team Feedback | Study 1 Fidelity Data | What is modified | At what level of delivery | Nature of modification | Relationship to fidelity / core elements |
|---|---|---|---|---|---|---|---|---|
| Researchers | Improve fit with recipients | Total time required to implement the process was the lowest-rated item on the URP-IR Feasibility subscale. | ∙ Too many meetings. ∙ It took too long to identify and implement an intervention. ∙ It was demoralizing to reflect on TWH and the Ideal Workplace when they could not be attained. ∙ It took too long to identify and implement a stress-reduction intervention. | Exposure: Design Team members increasingly skipped meetings or left early. Adherence: Data were variable for some meetings in which content did not take the projected amount of time. | Decreased number of meetings: combined 8 meetings into 5 (see Table). Maintained only core components of the IDEAS process. | Group | ∙ Removing elements ∙ Substituting content ∙ Shortening / condensing some components | Fidelity consistent |
| Researchers, Administrator | Increase retention & satisfaction | Total time required to implement the process was the lowest-rated item on the URP-IR Feasibility subscale. | ∙ The duration of meetings was not sustainable. | Dosage / Exposure: Design Team members increasingly skipped meetings or left early. | Decreased duration of meetings from 2 h to 1 h. | Group | ∙ Shortening / condensing (pacing) | Fidelity consistent |