Enola Proctor, Alex T Ramsey, Matthew T Brown, Sara Malone, Cole Hooley, Virginia McKay.
Abstract
BACKGROUND: Effective leadership for organizational change is critical to the implementation of evidence-based practices (EBPs). Because organizational leaders in behavioral health organizations are often promoted from within the agency for their long-standing, effective work as counselors, they may lack formal training in leadership, management, or practice change. This study assesses a novel implementation leadership training designed to promote leadership skills and successful organizational change specific to EBP implementation.
Keywords: Behavioral health; Evaluation; Implementation; Leadership; Practice; Practice change; Training
Year: 2019 PMID: 31221201 PMCID: PMC6585005 DOI: 10.1186/s13012-019-0906-2
Source DB: PubMed Journal: Implement Sci ISSN: 1748-5908 Impact factor: 7.327
Participant demographics
| | All | Quantitative | Qualitative |
|---|---|---|---|
| Participants | 16 | 12 | 9 |
| Agencies | 8 | 7 | 5 |
| % Female | 75% | 67% | 78% |
| % Caucasian | 94% | 92% | 89% |
| Mean age | 37 | 35 | 41 |
| Professional roles | Clinical managers, quality improvement coordinators, and program directors | Clinical managers, quality improvement coordinator, program directors, Associate CEO, addiction counselor | Clinical managers and program directors |
Curriculum at a glance
| Session 1—setting the stage/planning and engaging in change | Session 2—making it happen/executing the change | Session 3—keep improving/evaluating and reflecting on the change |
|---|---|---|
| Who is involved in practice improvement | Review of work since session 1 and share experiences from your site | Principles of a learning organization |
| Organizational factors in the implementation of EBPs | Theories, models, and frameworks in EBP | Break-out sessions reporting on successes and challenges |
| Key steps in implementation | Leadership for change | Economics/cost evaluation |
| Team activity and reporting | Strategies for implementing change | Team activity and reporting |
| Discuss plans for small-scale implementation trialing and interim coaching | Team activity and reporting | Quality improvement and performance management |
| | | Post-training evaluations and ‘What is next’ |
Qualitative codebook of evaluation dimensions
| Dimension | Original description | Sub-category | Application to TRIPLE | Relevant quantitative data |
|---|---|---|---|---|
| Reaction | How participants feel about the training | | | 78.6% of respondents rated high levels of acceptability and appropriateness |
| | | Receptivity to the training | Participants’ reactions to the training prior to attending | |
| | | Positive about the training | Positive comments participants made about the training. | |
| | | Negative or neutral about the training | Negative or neutral comments participants made about the training. | |
| Learning | Increase in participant knowledge | | Participant comments regarding things they learned in the training, or how the training changed their way of thinking. | Significant increase in self-competence (knowledge subscale of ILS), with large effect size |
| Behavior | Application of knowledge in job | | Participant reports of applying their knowledge in their agency or changing how they approach their job based on what they learned in the training. | Significant increase in resilient behaviors (perseverant subscale of ILS), with large effect size |
| Results | Effect on the business or environment | | | Significant increases in supporting, rewarding, and valuing expertise in EBP implementation (educational support, recognition, and selection subscales of ICS), with medium to large effect sizes |
| | | Changes in service delivery or practice | Attempting to implement or successfully implementing new EBPs; considering the evidence base for current practices and making adjustments to achieve fidelity. | |
| | | Changes in agency culture or climate regarding EBP | Changes in how other administrative staff view EBP or implement services. Also includes enhanced collaboration among staff. | |
| | | Improved staff knowledge/training | Changes made to help frontline staff learn more about EBPs, or understand why the agency uses EBPs. | |
| | | Changes in evaluative approaches | Changes in data collection forms and how evaluation is approached within the agency. | |
Pre-test/post-test change in the mean of sum scores (SD) for key outcomes
| | ILS | ICS | ORIC | TAAS |
|---|---|---|---|---|
| Pre-test | 31.31 (7.94) | 40.00 (10.37) | 33.83 (9.09) | |
| Post-test | 37.31 (4.44) | 46.67 (8.00) | 36.75 (5.28) | 45.71 (8.10) |
| Mean diff (95% CI) | 6.00 (1.31–10.69) | 6.67 (1.46–11.87) | 2.92 (−2.32–8.15) | |
| p value | 0.0165 | 0.0167 | 0.2457 | |
Note: All items are based on a sum score. The total points possible for each scale: ILS = 48 points, ICS = 72 points, ORIC = 48 points, and TAAS = 56 points. TAAS was administered post-training only, so no pre-test, mean difference, or p value is reported for it.
ILS Implementation Leadership Scale
ICS Implementation Climate Scale
ORIC Organizational Readiness for Implementing Change scale
TAAS Training Acceptability and Appropriateness Scale
Qualitative codebook
| Main categorical code | Thematic code | Description | Example |
|---|---|---|---|
| Reaction—overall perception of the course | Positive receptivity to the training | Reactions to the opportunity to participate in the training. | |
| | Positive about training | Positive comments about the course, the materials, and the structure of the course. Can also include comments about things that were done well, and not necessarily be overtly positive. | “I really enjoyed it.” “I liked the time frame.” “You were pretty thorough about getting out the information.” |
| | Negative/or suggested change | Negative comments about the course, the material, and the structure of the course. Can also include comments about things that could be done differently or improved. | “I did not always know, or felt like it was completely applicable to the work we do here.” |
| Learning—knowledge gained; attitude and belief changes; change in awareness | | Knowledge gained, changed perceptions about EBP, or more consideration of EBPs as part of service delivery. Changes in thinking specific to the individual. | “My knowledge base was certainly improved.” “I think I’m more aware of it. For example…” “It made me more mindful of buy-in.” |
| Behavior—changes in how the participants behave in their job role | | Changes within the participant in communication style, efforts to improve buy-in, planning, strategizing, etc. | “It helped me to think more deeply…and to plan for some of the barriers and some of the things that get in the way of successfully implementing a practice.” “As I was training people, there were some people that did not really think it was a good idea…it has changed how I approach some people in training, to work more not so much on the ins and outs, but why we are doing it.” “Having more intentional conversations.” |
| Results—changes at the agency level | Changes in service delivery or practice | Attempting to implement or successfully implementing new EBPs. Considering the evidence base for current practices and making adjustments to achieve fidelity. | “We’re looking a little more critically at some of the stuff that we implement…” “Revamping what is already existing.” |
| | Changes in agency culture or climate regarding EBP | Changes in how other administrative staff view EBP or implement services. Enhanced collaboration among staff. | “I think there is more buy-in.” “We all just have a heightened awareness of how we are approaching and how we are implementing things rather than just jumping in.” |
| | Improved staff knowledge/training | Changes made to help frontline staff learn more about EBPs, or understand why the agency uses EBPs. | “Helped new staff get training in different evidence-based practices starting at hiring has been improved.” “We have been spending more time on different tips and techniques in our weekly staff meetings.” |
| | Changes in evaluative approaches | Changes in data collection forms and how evaluation is approached within the agency. | “We have streamlined our registration forms.” “The way that we evaluate our programs, and look at our data, in terms of outcomes and what is the best things for our clients.” |
| | Other | All other changes that do not fit into the other categories. | |
| Barriers | | Factors that inhibited, slowed, or made change difficult either at the participant or organizational level. | “I was a little frustrated by the (lack of) resources for client strategies.” “There is just so many moving parts that sometimes making a change…is a hard thing to do….” |