Joanne R Beames, Raghu Lingam, Katherine Boydell, Alison L Calear, Michelle Torok, Kate Maston, Isabel Zbukvic, Kit Huckvale, Philip J Batterham, Helen Christensen, Aliza Werner-Seidler.
Abstract
INTRODUCTION: Process evaluations provide insight into how interventions are delivered across varying contexts and why interventions work in some contexts and not in others. This manuscript outlines the protocol for a process evaluation embedded in a cluster randomised trial of a digital depression prevention intervention delivered to secondary school students (the Future Proofing Study). The purpose is to describe the methods that will be used to capture process evaluation data within this trial.

METHODS AND ANALYSIS: Using a hybrid type 1 design, a mixed-methods approach will be used with data collected in the intervention arm of the Future Proofing Study. Data collection methods will include semistructured interviews with school staff and study facilitators, automatically collected intervention usage data and participant questionnaires (completed by school staff, school counsellors, study facilitators and students). Information will be collected about: (1) how the intervention was implemented in schools, including fidelity; (2) school contextual factors and their association with intervention reach, uptake and acceptability; (3) how school staff, study facilitators and students responded to delivering or completing the intervention. How these factors relate to trial effectiveness outcomes will also be assessed. Overall synthesis of the data will provide school cluster-level and individual-level process outcomes.

ETHICS AND DISSEMINATION: Ethics approval was obtained from the University of New South Wales (NSW) Human Research Ethics Committee (HC180836; 21st January 2019) and the NSW Government State Education Research Applications Process (SERAP 2019201; 19th August 2019). Results will be submitted for publication in peer-reviewed journals and discussed at conferences. Our process evaluation will contextualise the trial findings with respect to how the intervention may have worked in some schools but not in others.
This evaluation will inform the development of a model for rolling out digital interventions for the prevention of mental illness in schools.

TRIAL REGISTRATION NUMBER: ACTRN12619000855123; https://www.anzctr.org.au/Trial/Registration/TrialReview.aspx?id=377664&isReview=true.

© Author(s) (or their employer(s)) 2020. Re-use permitted under CC BY-NC. No commercial re-use. See rights and permissions. Published by BMJ.
Keywords: child & adolescent psychiatry; depression & mood disorders; preventive medicine
Year: 2021 PMID: 33436468 PMCID: PMC7805380 DOI: 10.1136/bmjopen-2020-042133
Source DB: PubMed Journal: BMJ Open ISSN: 2044-6055 Impact factor: 2.692
Figure 1. Logic model. The model shows that CFIR constructs, including school context characteristics, school organisational characteristics and individual characteristics, will influence how staff engage with the implementation strategy. The intervention itself, which includes the core cognitive–behavioural therapeutic components, is conceptualised as standardised across individuals because it is delivered digitally, follows a fixed schedule and does not incorporate tailored content. The yellow input factors are expected to vary across schools and individuals, thus influencing engagement and flexibility of the implementation strategy and, in turn, implementation outcomes and student-level outcomes. The logic model and implementation plan were peer reviewed, during an implementation workshop, by an experienced and internationally recognised implementation scientist external to the team. For details on assessment of these factors, see table 1. CFIR, Consolidated Framework for Implementation Research; FPS, Future Proofing Study.
Table 1. Process evaluation details, including process data, outcome data, data type and source
| CFIR and RE-AIM constructs | Research aim | Process or outcome data | Data type and source |
| --- | --- | --- | --- |
| Outer setting | | | |
| School contextual characteristics | What was the broad context of the schools in which the SPARX intervention was delivered? (aim 2) | School socioeconomic index | Publicly available information (ICSEA, GPS) |
| Inner setting | | | |
| School organisational characteristics | What were the characteristics of the delivery environments (schools)? (aim 2) | School size, type, composition, funding | Publicly available information (school size, type, funding) |
| School leadership | How supportive of delivering SPARX were school principals, deputy principals and executives? (aim 2) | Level of support and buy-in from school leaders | Interviews with school staff |
| Individual characteristics | | | |
| School staff | What were the characteristics (including attitudes, beliefs, traits) of school staff supporting the delivery of the intervention? (aim 2) | Age, gender, current employment, role, etc | Demographics questionnaire |
| Study facilitators | How well did study facilitators attending schools support the delivery of the intervention? (aim 2) | Age, employment | Demographics questionnaire |
| Students | What were the characteristics of young people that affected intervention uptake and effectiveness? (aim 2) | History of mental illness | Reported by year 8 students as part of the online FPS survey |
| Intervention characteristics | | | |
| SPARX | Were there any barriers to intervention use? (aims 2 and 3) | Technical issues | Logs of technical issues sent through schools, parents and participants |
| Implementation processes | | | |
| Normalisation and integration | How did school staff perceive the implementation processes? (aim 2) | Coherence, cognitive participation and collective action | NoMAD |
| Fidelity to the implementation strategy | To what extent was the intervention implemented as planned? (aim 2) | School delivery of the FP programme, including changes to the plan | Completed implementation checklists, emails and feedback forms |
| Implementation outcomes | | | |
| Reach | What was the extent to which those who were eligible to receive SPARX used it? (aim 1) | Proportion of eligible participants who consented to participate; proportion who opened, used and completed the SPARX intervention | Administrative data about consent; digital analytic data including usage (app downloads, installs, opens), completion rate (number of modules completed) and time spent using SPARX |
| Uptake | How many eligible schools participated in the study? Within those schools, how many staff supported the delivery of SPARX? (aim 1) | Proportion of eligible schools that were onboarded to the study; proportion of school staff (in intervention schools) who supported SPARX | Administrative data |
| Acceptability/appropriateness | How satisfied were participants with the intervention? (aim 1) | Acceptability/appropriateness of the intervention, expectations | Reported by year 8 students as part of the online FPS survey |
| Across domains | | | |
| | How might the relationship between the intervention, the staff supporting the programme and context of each school shape variation in outcomes (implementation strength metric)? (aims 2 and 3) | | |
The process data and outcomes are mapped onto figure 2.
CFIR, Consolidated Framework for Implementation Research; FPS, Future Proofing Study; GPS, Global Positioning System; ICSEA, Index of Community Socio-Educational Advantage; IT, information technology; NoMAD, Normalisation Measure Development questionnaire; RE-AIM, Reach, Effectiveness, Adoption, Implementation, Maintenance; SSPESH, Survey of School Promotion of Emotional and Social Health.
Figure 2. Details of implementation strategy training and delivery structure.
Summary of data forms (and collection point) provided by each of the participant groups
| Participant group | Questionnaire | Individual interview | Digital analytics |
| --- | --- | --- | --- |
| Year 8 students | ✓ (Post-intervention) | | ✓ (Ongoing) |
| School staff | ✓ (Post-intervention) | ✓ (Post-intervention) | |
| Facilitators | ✓ (Before first school visit and after final post-intervention visit) | ✓ (After final school post-intervention visit) | |