Victoria C. Scott, Zara Jillani, Adele Malpert, Jenny Kolodny-Goetz, Abraham Wandersman.
Abstract
BACKGROUND: Although the benefits of evidence-based practices (EBPs) for advancing community outcomes are well-recognized, challenges with the uptake of EBPs are considerable. Technical assistance (TA) is a core capacity building strategy that has been widely used to support EBP implementation and other community development and improvement efforts. Yet despite growing reliance on TA, no reviews have systematically examined the evaluation of TA across varying implementation contexts and capacity building aims. This study draws on two decades of peer-reviewed publications to summarize the evidence on the evaluation and effectiveness of TA.
Keywords: Capacity building; Scoping review; TA; Technical assistance; Technical assistance effectiveness; Technical assistance evaluation
Year: 2022 PMID: 35765107 PMCID: PMC9238031 DOI: 10.1186/s43058-022-00314-1
Source DB: PubMed Journal: Implement Sci Commun ISSN: 2662-2211
Eligibility criteria
| Inclusion criteria | Exclusion criteria |
|---|---|
| • Individuals, organizations, or communities receiving TA services • Peer-reviewed studies with a specific focus on the formative, process, or summative evaluation of TA that include both a description of the TA approach (activities or core elements) and TA output or outcome data • Settings where TA is an intervention for capacity building or improvement | • Articles published in languages other than English • Articles published before 2000 • Non-peer-reviewed articles • Peer-reviewed studies identified as a validation study, review, study protocol, trial registration, or any non-empirical study |
Data charting form: sample attributes
| Study characteristics | TA evaluation attributes |
|---|---|
| • First author’s last name • Publication year • Study location • TA aim and activities • Area of practice (e.g., child welfare, education) | • Type of evaluation • Measurement approach (e.g., survey, interview) • Type of data (e.g., qualitative, quantitative) • Data perspective (e.g., subjective, objective) • TA outputs (e.g., dosage, reach) • TA outcomes (e.g., individual, organization) |
Fig. 1. PRISMA flow diagram
Fig. 2. Trend line of TA articles published between January 2000 and June 2020
Definitions of technical assistance (TA) used in TA evaluation research studies
| Article author(s) and year | Definition of technical assistance (TA) |
|---|---|
| Bonney et al. (2019) | A multi-tiered approach to build the capacity of individuals or organizations to achieve substantial change (as cited in Fixsen, Blasé, Horner, and Sugai, 2009) |
| Cerully et al. (2016) | Support to help community-partner organizations execute their efforts (as cited in Mitchell, Florin, …) |
| Chiappone et al. (2018) | TA is defined as targeted or tailored support given to an individual or organization to help assist with successful development, implementation, and evaluation of a program, policy, intervention, or service through shared knowledge, resources, and expertise (as cited in National Association for the Education of Young Children and National Association of Child Care Resource and Referral Agencies, 2011) |
| Chilenski et al. (2018) | TA involves external expertise and guidance designed to support the effective translation of EBIs into real-world settings (as cited in Forman, Olin, Hoagwood, Crowe, & Saka, 2009) |
| Chilenski et al. (2016) | TA, or the support and assistance that a prevention effort receives from someone or some organization that is not a part of a community team, has been theorized as very important in supporting high quality implementation of prevention programs specifically, and prevention systems more generally (as cited in Chinman et al., 2005) |
| Duffy et al. (2012) | Individualized and hands-on intervention intended to address specific barriers in the context of a single individual or organization (as cited in Wandersman, Chien, & Katz, 2012) |
| Hunter et al. (2009) | TA has been used to describe different types of activities, including community-friendly manuals, on-site consultation, regional workshops, train-the-trainers models, and interactive Web-based systems (as cited in Stevenson, Florin, Mills, & Andrade, 2002) |
| Livet et al. (2018) | Planned instructional activity to facilitate knowledge and skill acquisition (as cited in Leeman et al., 2015) |
| Moreland-Russell et al. (2018) | For the purposes of this work, we considered “TA” for HPV and CRC as a multicomponent strategy consisting of in-person sessions supported by subject matter experts, facilitated development of action plans by state team members, and follow-up support calls, which included webinars with team members and partners that were involved in the implementation of the specific activities in their respective action plans |
| Olson et al. (2020) | TA has been defined as an individualized approach that provides implementation support to, and increases capacity for, continuous quality improvement (CQI) among comparative effectiveness research (as cited in Wandersman, Chien, and Katz, 2012) |
| Segre, O'Hara, & Fisher (2013) | TA consultations are sessions in which practitioners and host organizations gain the information, tools, and support to implement new practices (as cited in Sullivan, 1991) |
| Spadaro et al. (2011) | Providing guidance, support, and expertise (as cited in Anderson, Bruner, & Satterfield, 1995) |
| Rushovich et al. (2015) | TA is a broad term that has been used to describe services that an outside entity provides to an agency or organization to help build its capacity to implement an innovation or improvement to its current operations (as cited in Sokol & Stiegert, 2010) |
| Yazejian & Iruka (2015) | On-site TA refers to any individualized professional development strategy that supports the application of skills to practice, such as coaching or professional development advising |
| Young et al. (2020) | The formal or informal engagement of an entity to one or more additional entities for the purpose of improving their capacity to accomplish their public health objectives (e.g., training, resources) |
Reasons for and frequency of applications of TA
| Characteristic | Frequency (percent) |
|---|---|
| *Reason for TA* | |
| Implement EBI | 51 (41%) |
| Other (e.g., program development) | 26 (21%) |
| Combination | 25 (20%) |
| Evaluation capacity building | 9 (7%) |
| Improvement | 5 (4%) |
| Coalition building | 5 (4%) |
| Workforce development | 4 (3%) |
| *Area of practice* | |
| Other | 46 (37%) |
| Substance use | 18 (14%) |
| Mental health | 15 (12%) |
| Public education | 13 (10%) |
| Child welfare and youth development | 13 (10%) |
| HIV prevention | 11 (9%) |
| Healthcare improvement | 7 (6%) |
| Housing | 2 (2%) |
| *TA activities* | |
| Combination | 62 (49%) |
| Coaching | 26 (21%) |
| Not specified | 23 (18%) |
| Other | 10 (8%) |
| Training | 4 (3%) |
Frequency of measurement approaches for assessing technical assistance
| Characteristic | Frequency (percent) |
|---|---|
| *Type of evaluation* | |
| Summative | 90 (72%) |
| Process | 17 (14%) |
| Combination | 16 (13%) |
| Formative | 2 (1%) |
| *Measurement approach* | |
| Combination | 47 (38%) |
| Survey | 33 (26%) |
| Documentation review | 20 (16%) |
| Interview | 19 (15%) |
| Not reported | 4 (3%) |
| Natural observation | 2 (2%) |
| *Type of data* | |
| Quantitative | 64 (51%) |
| Mixed methods | 32 (26%) |
| Qualitative | 28 (22%) |
| Not reported | 1 (1%) |
| *Data perspective* | |
| Subjective | 52 (42%) |
| Both | 48 (37%) |
| Objective | 26 (21%) |
Frequency of technical assistance output variables
| Characteristic | Frequency (percent) |
|---|---|
| *Reach* | |
| Reported | 98 (78%) |
| Not reported | 27 (22%) |
| *Modality* | |
| Combination | 68 (54%) |
| Not reported | 29 (23%) |
| In-person | 21 (17%) |
| Virtual/phone | 7 (6%) |
| *Frequency of contact* | |
| Not reported | 34 (27%) |
| Combination | 32 (26%) |
| As needed | 28 (22%) |
| Fixed number | 21 (17%) |
| Routinely | 10 (8%) |
| *Duration of engagement* | |
| Reported | 82 (66%) |
| Not reported | 43 (34%) |
| *Directionality* | |
| Not reported | 62 (50%) |
| Provider-initiated | 27 (21%) |
| Bi-directional | 25 (20%) |
| Recipient-initiated | 11 (9%) |
| *Dosage* | |
| Not reported | 103 (82%) |
| Reported | 22 (18%) |
Frequency of individual, organizational, and community level outcomes
| Characteristic | Frequency (percent) |
|---|---|
| *Individual-level outcomes* | |
| | |
| Increased | 15 (12%) |
| Decreased | 7 (6%) |
| No change | 2 (1%) |
| Not reported | 101 (81%) |
| | |
| Increased | 14 (11%) |
| Not reported | 111 (89%) |
| | |
| Increased | 8 (6%) |
| No change | 1 (1%) |
| Not reported | 116 (93%) |
| | |
| Increased | 5 (4%) |
| Decreased | 1 (1%) |
| Not reported | 119 (95%) |
| | |
| Increased | 3 (2%) |
| Not reported | 122 (98%) |
| | |
| Increased | 2 (2%) |
| Not reported | 123 (98%) |
| | |
| Increased | 2 (2%) |
| Not reported | 123 (98%) |
| *Organizational-level outcomes* | |
| Organizational program | 21 (17%) |
| Combination of outcomes | 15 (12%) |
| Staff capacities | 5 (4%) |
| Organizational structure | 3 (2%) |
| Resource utilization | 3 (2%) |
| Other | 20 (16%) |
| Not reported | 58 (46%) |
| *Community-level outcomes* | |
| Reported | 20 (16%) |
| Not reported | 105 (84%) |
Summary of recommendations to advance the evaluation and effectiveness of TA
| Main insight | Recommendation |
|---|---|
| 1. TA lacks a standard, widely shared definition. | Use a consensus method (e.g., Delphi Technique) that includes a panel of expert TA practitioners, researchers, and recipients to develop a standard definition of TA. Consider the following defining features of TA when establishing a standard definition: ○ Aim is to increase capacity ○ Services target the systems level (organization, community) ○ Supports are targeted and tailored ○ Supports are provided by a subject matter expert or specialist |
| 2. Evidence for causal links between TA and outcomes is limited by the evaluation designs used to date. | • Use more robust evaluation research designs (e.g., experimental designs) to identify causal links between TA implementation and outcomes. • Increase the use of longitudinal study designs to understand the sustainability of TA; include control and matching techniques to compare outcomes over time. • Consider approaches rooted in design research (formative experiments occurring in real-world settings) to examine the downstream effects of TA. |
| 3. TA evaluations rely heavily on subjective, self-report measures. | • Use self-report measures to assess TA recipient attitudes and beliefs, particularly TA satisfaction, self-efficacy, and commitment to change. • Prioritize objective data for measuring outcomes related to knowledge, skills, behavior change, and system-level change. • When feasible, use a mixed-methods approach that captures both subjective and objective data to enable triangulation. • Develop and use psychometrically sound instruments to assess TA. |
| 4. TA activities, outputs, and outcomes are inconsistently documented and reported. | • Use a TA logic model to guide the systematic documentation of TA inputs, processes, outputs, and outcomes. • Develop reporting standards for TA evaluation research studies. Consider the following items for a reporting checklist: ○ Provide an explicit conceptual and operational definition of TA; once a standard definition is available, use it. ○ State the specific aim(s) of the TA and the targeted direct and indirect outcomes (e.g., implementing an evidence-based practice/intervention, coalition building, workforce development). ○ Provide detailed descriptions of TA activities (e.g., coaching, training, tools, or a combination), including data on the core mechanics of TA (e.g., modality, reach, duration of engagement, directionality, frequency of contact), and describe how TA activities were measured (e.g., measurement tools, procedures). ○ Where possible, report (i) the effects of specific TA activities, in addition to the total effect, so that attributions can be disaggregated; (ii) both direct and indirect outcomes of TA; and (iii) longitudinal outcomes. |