Josie Dickerson, Philippa K Bird, Maria Bryant, Nimarta Dharni, Sally Bridges, Kathryn Willan, Sara Ahern, Abigail Dunn, Dea Nielsen, Eleonora P Uphoff, Tracey Bywater, Claudine Bowyer-Crane, Pinki Sahota, Neil Small, Michaela Howell, Gill Thornton, Kate E Pickett, Rosemary R C McEachan, John Wright.
Abstract
Many interventions delivered within public health services have little evidence of effect. Evaluating interventions that are being delivered as part of usual practice offers opportunities to improve the evidence base of public health. However, such evaluation is challenging and requires the integration of research into system-wide practice. The Born in Bradford's Better Start experimental birth cohort offers an opportunity to efficiently evaluate multiple complex community interventions to improve the health, wellbeing and development of children aged 0-3 years. Based on the learning from this programme, this paper offers a pragmatic and practical guide for researchers, public health commissioners and service providers to integrate research into their everyday practice, thus enabling relevant and robust evaluations within a complex and changing system. Using the principles of co-production, the key challenges of integrating research and practice were identified, and appropriate strategies to overcome them were developed, across five key stages: 1) Community and stakeholder engagement; 2) Intervention design; 3) Optimising routinely collected data; 4) Monitoring implementation; and 5) Evaluation. As a result of our learning we have developed comprehensive toolkits ( https://borninbradford.nhs.uk/what-we-do/pregnancy-early-years/toolkit/ ) including: an operational guide through the service design process; an implementation and monitoring guide; and an evaluation framework. The evaluation framework incorporates implementation evaluations to enable understanding of intervention performance in practice, and quasi-experimental approaches to infer causal effects in a timely manner.
We also offer strategies to harness routinely collected data to enhance the efficiency and affordability of evaluations that are directly relevant to policy and practice. These strategies and tools will help researchers, commissioners and service providers to work together to evaluate interventions delivered in real-life settings. More importantly, however, we hope that they will support the development of a connected system that empowers practitioners and commissioners to embed innovation and improvement into their own practice, thus enabling them to learn, evaluate and improve their own services.
Keywords: Early intervention; Integration; Pragmatic evaluation; Public health; Service evaluation; Systems change
Year: 2019 PMID: 30832626 PMCID: PMC6399808 DOI: 10.1186/s12889-019-6554-2
Source DB: PubMed Journal: BMC Public Health ISSN: 1471-2458 Impact factor: 3.295
The Better Start Bradford interventions
| Intervention | Description | Service Provider | Recipients per yr. (% BiBBS)b | Main Outcome / Domain | Evaluation planned: Implementation | Evaluation planned: Before & After | Evaluation planned: Effectiveness | Proposed Method for Evaluation |
|---|---|---|---|---|---|---|---|---|
| Antenatal Support | | | | | | | | |
| Personalised Midwifery | Continuous midwife care | BDHFT Midwifery Services | 500 (300) | Maternal mental health | X | X | | Propensity Score (Control: BiBBS women receiving standard midwifery care) |
| Family Links Antenatal | Universal antenatal parenting skills programme | Local Authority | 200 (120) | Maternal mental health (PHQ8) | X | X | | Pre- and post-study of difference in main outcome for participants |
| ESOL+ | English language course for women with little or no English during pregnancy | Shipley FE College | 90 (54) | Socio-emotional / language | X | | | Validation of logic model |
| Antenatal & Postnatal Support | | | | | | | | |
| Family Nurse Partnershipa | Intensive home visiting for vulnerable women aged < 25 | BDCT | 100 (60) | Monitoring only | | | | National evaluation currently underway |
| Baby Steps | Parent education programme for vulnerable parents | VCS – Action For Children | 100 (60) | Parent–infant relationship | X | X | X | Propensity Score (Control: BiBBS women receiving standard midwifery care) |
| Doula | Late pregnancy, birth and post-natal support for vulnerable women | VCS – Action For Community Ltd | 82 (50) | Implementation | X | X | | Implementation evaluation using monitoring data and interviews with women |
| HAPPY | Healthy eating & parenting course for mums with a BMI over 25 | VCS – Barnardo’s | 120 (72) | BMI age 2 | X | X | X | Trial within Cohort (TwiCs) (Control: eligible BiBBS women not selected to receive HAPPY) |
| Perinatal Support Service | Perinatal support for women at risk of mild/moderate mental health issues | VCS – Family Action | 140 (84) | Maternal mental health (PHQ9) | X | X | | Implementation evaluation |
| Postnatal Support | | | | | | | | |
| Breastfeeding (BF) support service | Universal practical and emotional support for breastfeeding mums and their families | VCS – Health For All (Leeds) | TBC | BF duration | X | | | Validation of logic model |
| Early Years Support | | | | | | | | |
| Home-Start | Peer support for vulnerable women | VCS – Home-Start | 45 (27) | Socio-emotional | X | | | Validation of logic model |
| Little Minds Matter | Support and nurturing of parent–infant relationships for those at risk of relationship problems | BDCT / Family Action | 40 (24) | Socio-emotional | X | | | Validation of logic model |
| HENRY | Universal group programme to improve healthy eating and physical activity in young children | VCS & Schools / HENRY | 186 (111) | BMI age 5 | X | X | X | Propensity Score (Control: matched BiBBS women not attending HENRY) |
| Incredible Years Parentinga | Universal parenting programme for parents with toddlers | VCS – Barnardo’s | 160 (96) | Socio-emotional | X | X | X | Propensity Score (Control: matched BiBBS women not attending) |
| Cooking for a Better Start | Universal cook-and-eat sessions | VCS – HENRY | 72 (43) | Implementation | X | | | Validation of logic model |
| Pre-schoolers in the Playground | Pre-schoolers’ physical activity in the playground | Schools | 108 (65) | Physical activity / obesity | X | X | | Trial within Cohort (cluster randomised) |
| Forest Schools | Outdoor play in the natural environment for young children & parents | VCS – Get Out More CiC | 90 (54) | Physical activity / obesity | X | | | Trial within Cohort (cluster randomised) |
| Better Start Imagine | Book gifting & book-sharing sessions | VCS – BHT Early Education and Training | 1015 (609) | Parent attitudes and behaviours at 2 yrs | | | | Validation of logic model for sharing sessions; acceptability of book gifting in different cultures |
| I CAN Early Talk | Strengthening parents’ and practitioners’ knowledge of improving language development | VCS – BHT Early Education and Training | 115 (69) | Staff / parental knowledge | X | | | Implementation evaluation |
| Talking Together | Universal screening for language delay in 2-year-olds; in-home programme for parents of children at risk of delay | VCS – BHT Early Education and Training | 954 (572) | Language assessment at 3-month follow-up | X | X | X | Trial within Cohort (Control: waiting-list comparison group) |
a Evidence-based interventions; all others are science-based.
b Intervention participation figures are based on the current service design. BiBBS participation figures assume a 60% recruitment rate. Actual numbers may vary.
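Several interventions in the table (e.g. HAPPY, Talking Together) use the Trials within Cohorts (TwiCs) design, in which eligible members of an existing cohort are randomly selected to be offered the intervention, while the remaining eligible members continue with usual care and serve as controls. The allocation step can be sketched as follows; this is a minimal illustration, and the function name, data layout and eligibility rule are assumptions, not taken from the paper:

```python
import random

def twics_allocate(cohort_ids, is_eligible, offer_fraction, seed=42):
    """Trials within Cohorts (TwiCs) allocation: from the eligible members
    of an existing cohort, randomly select a fraction to be offered the
    intervention; the remaining eligible members are not approached and
    serve as usual-care controls."""
    rng = random.Random(seed)
    pool = [pid for pid in cohort_ids if is_eligible(pid)]
    n_offered = round(len(pool) * offer_fraction)
    offered = set(rng.sample(pool, n_offered))
    controls = [pid for pid in pool if pid not in offered]
    return sorted(offered), controls

# Illustrative use: offer the intervention to half of the eligible members
# of a 1,000-person cohort (the eligibility rule here is arbitrary).
cohort = list(range(1000))
offered, controls = twics_allocate(cohort, lambda pid: pid % 2 == 0, 0.5)
```

Because both arms are drawn from the same consented cohort, this design avoids recruiting a separate control group, which is one reason the BiBBS cohort makes such evaluations efficient.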
A summary of the challenges, their causes and strategies to resolve them
| Challenge | Possible Causes | Strategies |
|---|---|---|
| 1. Researchers, communities and stakeholders often have different priorities and timeframes for research outputs. | | • Identify and involve relevant communities and stakeholders at all stages. |
| 2. It can be difficult to accommodate the requirements of the evaluation, implementation and delivery of the intervention within service design. | | • Use and adapt the toolkits presented in this paper to aid service design and ensure the needs of commissioners, providers and evaluators are all considered in a structured and efficient way. |
| 3a. There may be gaps in the collection or entry of routine data that are required for evaluation. | | • Develop training sessions and manuals for practitioners to empower them to collect data that are useful for research. |
| 3b. Services may use non-validated measures to assess outcomes. | | • Co-produce the selection of validated measures, involving practitioners, service providers, community members and researchers. |
| 3c. Organisations may be concerned about sharing data. | | • Build good relationships with key stakeholders. |
| 4. It may be difficult to identify early successes and challenges in intervention implementation. | | • Use the toolkits presented in this paper to ensure the right data are collected. |
| 5. Service providers and commissioners are pressured to find quick answers, but rigorous evaluation can take much longer. | | • Use the evaluation framework presented in this paper to set expectations, ensure that the necessary groundwork is completed, and answer important implementation questions before embarking on effectiveness evaluations. |
Fig. 1 The Better Start Bradford Innovation Hub process of integrating research into practice
Fig. 2 An example of the service design toolkit
Challenges of routine data: An example from maternal mental health data
| National Institute for Health and Clinical Excellence guidelines [1] recommend that the Whooley questions [2] are used to assess maternal mental health, with a full mood assessment completed if the woman answers positively. In the health data system in Bradford we discovered that the code for the Whooley questions records that the questions were asked, but not the woman’s responses. This is very challenging for evaluations because we can only infer the outcome of the assessment from subsequent actions: for example, if no other action was taken we assume a negative response to the Whooley questions, but this might not be the case. |
Implementing validated objective outcomes into routine practice
| In Bradford, Health Visitors complete a visit at 3–4 months to assess the mother–child relationship. The National Institute for Clinical Excellence guidance in the UK [1] recommends that the mother–child relationship is assessed, but does not recommend any particular measure for use with babies; consequently the assessment in Bradford (and elsewhere across the UK) is based on subjective observations. To allow us to evaluate the impact of interventions on attachment, we needed to implement an objective, validated measure. |
Co-production of validated and acceptable outcome measures
Using routine data to inform practice and policy
| Bradford health and education organisations use local data to inform their planning and their work. The Better Start Bradford programme has encouraged a breakdown of data at ward level and a search for more up-to-date local data. In the past, estimates of the ethnic composition of the population have been taken at a city-wide level from the UK Census completed in 2011. The Better Start Bradford work allowed us access to maternity records, which indicated a different ethnic composition among pregnant women and young children in the Better Start Bradford areas than that reported in the Census. Similarly, the maternity data highlighted that one-third of pregnant women had little or no English. This has informed practice across the city and has led to a focus on enhancing service provision and accessibility for these women within the service design and monitoring processes. |
An example of the benefits of using progression criteria
| One of the Better Start Bradford interventions is a locally developed project that offers universal language screening of two-year-olds in the Better Start Bradford area, and an in-home intervention for those identified as at risk of language delay. The progression criteria were agreed with the service provider and commissioner. Early review of these criteria revealed higher demand for the in-home intervention than originally anticipated, prompting an early review of the project’s capacity and resources to ensure successful delivery. The reach criteria indicated challenges in engaging one particular ethnic group, which encouraged the service provider to focus engagement activities on this group and to ensure interpreting services were available. |
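The kind of progression-criteria review described above can be supported by a simple traffic-light (RAG) check of observed figures against agreed targets. The sketch below is illustrative only; the function name and the amber tolerance band are assumptions, not thresholds from the paper:

```python
def rag_rating(observed, target, amber_fraction=0.85):
    """Red/Amber/Green progression-criteria rating: Green if the observed
    figure meets or exceeds the target, Amber if it falls within a
    tolerance band (here 85% of target), Red otherwise."""
    if observed >= target:
        return "green"
    if observed >= amber_fraction * target:
        return "amber"
    return "red"

# Illustrative use: reach of 90 families against a target of 100 would be
# flagged Amber, prompting an early review of engagement activities.
rating = rag_rating(observed=90, target=100)
```

Routine reporting of such ratings is one way a service provider and commissioner can spot implementation problems early, as happened with the in-home intervention's demand and reach.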
An example of a staged approach to evaluation
| One of the Better Start Bradford interventions is a personalised midwifery model adapted from the evidence-based continuity-of-care model [1]. The adaptation was the removal of continuity at delivery, because of local concerns about the high burden this places on midwives. The removal of a key component means there is no existing evidence of implementation or effect for this intervention. The first stage of evaluation we undertook was an implementation evaluation examining the feasibility, fidelity and acceptability of the model using midwifery data, complemented by structured interviews with midwives and with women who had recently received midwifery care. The implementation results demonstrated that the intervention was feasible and acceptable, and helped to identify the key components and outcomes that can be rolled out to other midwifery teams in the area. The next step will be an effectiveness evaluation using routinely collected data to explore the benefits of continuity of care without the birth element. This evaluation will use propensity score matching within the BiBBS cohort. |
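Propensity score matching, as proposed for the midwifery effectiveness evaluation, estimates each individual's probability of receiving the intervention from observed covariates and then compares outcomes between matched treated and control individuals. The sketch below is a minimal, self-contained illustration on simulated data; the single-covariate logistic model, the function names and the 1:1 nearest-neighbour matching rule are assumptions for demonstration, not the study's actual analysis plan:

```python
import math
import random

def fit_propensity(X, t, lr=0.1, steps=2000):
    """Fit a logistic-regression propensity model P(treated | x) by
    batch gradient descent; returns one propensity score per individual."""
    n, d = len(X), len(X[0])
    w, b = [0.0] * d, 0.0
    for _ in range(steps):
        gw, gb = [0.0] * d, 0.0
        for xi, ti in zip(X, t):
            p = 1.0 / (1.0 + math.exp(-(sum(wj * xj for wj, xj in zip(w, xi)) + b)))
            err = p - ti
            for j in range(d):
                gw[j] += err * xi[j]
            gb += err
        w = [wj - lr * gj / n for wj, gj in zip(w, gw)]
        b -= lr * gb / n
    return [1.0 / (1.0 + math.exp(-(sum(wj * xj for wj, xj in zip(w, xi)) + b)))
            for xi in X]

def matched_effect(scores, t, y):
    """1:1 nearest-neighbour matching on the propensity score (with
    replacement); returns the average treated-minus-control outcome
    difference across matched pairs."""
    treated = [i for i, ti in enumerate(t) if ti == 1]
    controls = [i for i, ti in enumerate(t) if ti == 0]
    diffs = [y[i] - y[min(controls, key=lambda c: abs(scores[c] - scores[i]))]
             for i in treated]
    return sum(diffs) / len(diffs)

# Simulated example: a single confounder x raises both the chance of
# receiving the intervention and the outcome; the true effect is 1.5,
# so a naive treated-vs-control comparison would be biased upwards.
random.seed(0)
X = [[random.gauss(0.0, 1.0)] for _ in range(200)]
t = [1 if random.random() < 1.0 / (1.0 + math.exp(-x[0])) else 0 for x in X]
y = [2.0 * x[0] + 1.5 * ti + random.gauss(0.0, 0.5) for x, ti in zip(X, t)]

scores = fit_propensity(X, t)
att = matched_effect(scores, t, y)
```

Matching on the estimated score balances the confounder between the compared groups, which is what allows a causal effect to be inferred from routinely collected, non-randomised data.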