Protocol for a process evaluation of Family Planning Elevated: a statewide initiative to improve contraceptive access in Utah (USA).

Jami Baayd, Rebecca G Simmons.

Abstract

INTRODUCTION: Many individuals in the USA do not have access to the contraceptive methods they desire. Contraceptive initiatives have emerged at the state and national levels to remove barriers to access, and many initiatives have reported success. Other initiatives may want to build on or replicate that success, but data are scarce on the details of how and why certain interventions work. This paper describes the protocol for the planned process evaluation of Family Planning Elevated (FPE), a statewide contraceptive initiative in Utah.
METHODS: FPE will conduct a process evaluation during the planning and implementation phases of the programme. The process evaluation will document (1) the community, state and national contexts in which the programme is implemented, (2) how FPE is implemented and (3) the mechanisms by which FPE creates impact. We will collect qualitative data via interviews with FPE staff, providers and staff participating in the programme, and key stakeholders and policy-makers throughout the state. The team's process evaluator will record FPE decision making and implementation activities by taking field notes during weekly FPE meetings. Quantitatively, we will collect monthly data reports from FPE-participating clinics, analytics reports from the media campaign and survey results from patients in FPE-participating clinics. The findings of the process evaluation will allow other contraceptive initiatives to learn from FPE's efforts and replicate successful components of the programme.

ETHICS AND DISSEMINATION: The study received approval from the University of Utah's Institutional Review Board. Findings from the process evaluation and outcome evaluation will be published, shared with other contraceptive initiatives and presented at conferences.

TRIAL REGISTRATION NUMBER: NCT03877757.

Keywords:  Contraception; family planning services; process evaluation

Year:  2020        PMID: 33004395      PMCID: PMC7534679          DOI: 10.1136/bmjopen-2020-038049

Source DB:  PubMed          Journal:  BMJ Open        ISSN: 2044-6055            Impact factor:   2.692


Strengths and limitations of this study

To the authors’ knowledge, this is the first published protocol describing the process evaluation of a US-based contraceptive initiative. The process evaluation will collect qualitative and quantitative data from many sources, including programme implementers, participants, policy-makers, community board members and patients. Integration of the implementation and evaluation teams may improve the quality of the intervention and increase transparency in reporting, but may also bias the findings of the process evaluation. Elements of Family Planning Elevated will evolve during implementation; adherence to the original implementation plan may therefore be variable.

Introduction

Ensuring every individual has access to high-quality, person-centred contraceptive care is both a public good1–7 and a protection of human rights.8–11 While data on the benefits of contraceptive access grows, barriers remain for many individuals who are seeking contraceptive care.12–18 An increasing number of contraceptive initiatives have emerged in the USA, each with the goal of removing barriers and improving access.19–23 While there is considerable variance about the best way to measure success in contraceptive initiatives,24 25 outcome evidence from several initiatives suggests that their efforts have made an impact19–23 26–28 which other initiatives may hope to replicate. However, replicability of these complex interventions is difficult from outcome data alone. For the work of contraceptive initiatives to be replicable, they must be able to share not only if an intervention was successful, but also how and why it was successful. An evaluation of how an intervention is implemented in various settings, how participants react to and interact with an intervention, and the context in which an intervention occurs is crucial to a robust understanding of the success or failure of complex interventions.29–36 This type of evaluation—called a process evaluation—allows researchers and implementers to identify key mechanisms for the success or failure of their intervention,37–40 and allows those outside the project to determine if a similar intervention might be successful in their setting.41 42

This paper describes the planned protocol for a process evaluation of a statewide family planning initiative in Utah. While several contraceptive initiatives have provided in-depth descriptions of key components of their implementation process,23 27 43 to our knowledge, this is the first contraceptive initiative in the USA to publish a formal process evaluation.

Family planning elevated initiative

Family Planning Elevated (FPE) is a statewide initiative with the goal of increasing individuals’ access to high-quality, person-centred family planning services. Using the tenets of the Human Rights-based Approach to Family Planning,8 FPE aims to accomplish this goal by targeting four levels:

Individual level: Launch a marketing and education campaign that directs patients to where they can access free contraceptive care (either through FPE or Medicaid) and educates individuals on the full range of contraceptive strategies from which they can choose.

Community level: Form a Reproductive Justice Advisory Board (RJ CAB) to ensure FPE understands the needs of historically underserved populations and is addressing access barriers specific to those populations.

Clinic level: Enrol three cohorts of clinical organisations into the FPE Contraceptive Access Programme (FPE CAP). FPE CAP clinics will receive cash grants for personnel, equipment and supplies. Clinics will receive reimbursement for all contraceptive services and methods provided to individuals whose declared income is between 101% and 250% of the federal poverty line (FPL, an income measure developed by the US government to determine eligibility for federal and state programmes aimed at supporting low-income people), and to any undocumented individuals (those who reside in the USA without legal residency) with incomes under 250% of the FPL. Providers and staff at FPE CAP clinics will receive training and technical assistance on contraceptive care. All FPE CAP clinics will be enrolled in the programme for at least 2 years.

Policy level: Support existing and emerging legislative policy that expands family planning services to individuals in Utah. A primary goal of FPE is to demonstrate the unmet need for contraceptive coverage among those who fall in the ‘contraceptive coverage gap’ (uninsured or underinsured individuals, including those between 100% and 250% of the federal poverty level).
FPE launched in January of 2019 and will continue through early 2023. Enrolment of FPE CAP clinics is anticipated to end by December 2020.

Methods

Process evaluation objectives

FPE is a complex intervention that will be implemented into varied and multifaceted healthcare organisations. Following the recommendations of the Medical Research Council29 regarding complex interventions, our process evaluation aims to accomplish the following objectives (see figure 1 for key components of the process evaluation):
Figure 1

FPE process evaluation objectives: context, implementation and mechanisms of impact. FPE, Family Planning Elevated; FPE CAP, FPE Contraceptive Access Programme.

(1) Understand both the state and local context in which FPE operates, and how that context impacts the intervention (as well as how the intervention impacts the context).

(2) Document FPE as it is implemented. Explore how the actual implementation of FPE differs from the planned implementation, and identify where changes were an intentional adjustment to better meet programme needs (innovation) and where they were ‘unintentional drift’35 from the plan.

(3) Identify the programme’s mechanisms of impact (how did clinicians and clients interact with FPE’s programming, were clients satisfied with care at FPE CAP clinics, what were the facilitators and barriers to implementation).

For the purpose of this process evaluation, we will consider policy-makers, FPE CAP clinic staff and the clients they serve to be recipients of the intervention. Our intervention is aimed at improving contraceptive access to the public; however, our programmatic approach to this improvement is mainly conducted through health systems, with clinics and providers as proximal recipients of FPE activities, while clients are distal beneficiaries of the programme. As such, we consider clinics, providers and policy-makers as recipients of the intervention, rather than intermediaries. FPE’s logic model (see figure 2) shows the theoretical path towards FPE’s ultimate goal of improving contraceptive access throughout the state. The process evaluation will document FPE’s adherence to the activities detailed in the logic model (implementation) as well as test the assumptions that those activities will lead to the documented outputs and outcomes (mechanisms of impact).
Figure 2

Family Planning Elevated’s (FPE) logic model. FPE CAP, FPE Contraceptive Access Programme.

Design

FPE’s evaluation team consists of the director of evaluation, who will oversee both the process and outcome evaluations; a data analyst, who will oversee the data analysis for the quantitative portion of the process evaluation and the outcome evaluation; and a process evaluator, who will conduct and analyse the qualitative portions of the process evaluation, as well as integrate the quantitative portions for the final analysis of the process evaluation. The implementation team is composed of a programme director, a project facilitator, an FPE CAP programme manager, a clinical training specialist and a medical director. FPE also has a research and policy advisor who is part of both the implementation and evaluation teams.

While the evaluation team is distinct from the implementation team, its members will not be passive observers in the implementation process. The two teams have opted to create feedback loops that will allow findings from the process evaluation to inform the ongoing implementation, and will meet regularly. While information will typically flow from the implementation team to the evaluation team, information regarding process indicators and intermediate clinic data will also be shared in the opposite direction.

Before beginning the process evaluation, FPE implementers and evaluators developed a foundational logic model to represent the projected activities (figure 2). This logic model charts how FPE aims to improve access to family planning by targeting efforts at each of the levels described in the Voluntary, Human Rights-based Approach to Family Planning Framework.8 At project midline, and again at end line, the team will revisit the logic model and make any changes required to represent the project’s evolution. This will allow the team to visualise how implementation shifted, as well as present the most accurate logic model at project end.

Data collection and analysis

We will collect and analyse both qualitative and quantitative data for this process evaluation. Figure 1 diagrams the main components of the process evaluation, and how they fit into the larger programme evaluation. Table 1 details how our team will collect data on each of the primary components of the process evaluation.
Table 1

Data collection

Objective | Measure | Research questions | Data source | Records kept
Understand the context in which FPE was implemented | Community | What is the local context of sexual and reproductive health (SRH) in the communities served by FPE? How does the context impact implementation? | Reproductive Justice Advisory Board (RJ CAB) meeting minutes; RJ CAB focus groups; HER Salt Lake Contraceptive Initiative21 ongoing data collection | Fieldnotes capturing SRH context; transcripts of audio recordings; findings from 3 years of prospective surveys and qualitative interviews
| State | What is the context of sexual and reproductive health in Utah? How does the context impact implementation? | Statewide polling and surveillance data of sexual and reproductive health in Utah; survey responses from FPE stakeholders regarding the state of SRH in Utah; meetings with state policy-makers and advocates; interviews and informal conversations with the FPE Research and Policy Advisor | Fact sheets on Utah policy; emailed responses indexed in secure cloud content platform; fieldnotes; transcripts of audio recordings and fieldnotes
| National | What is the nationwide SRH context? How does the context impact implementation? | Policy analysis of SRH in the USA; communications with national partners | Articles and policy briefs regarding current SRH policy; emailed responses indexed in secure cloud content platform
Understand the intervention as implemented | Fidelity | To what extent were the essential elements of FPE delivered as intended? What were the reasons behind any changes made to the implementation plan? Were changes intentional adaptations or unintentional drift? | FPE staff meeting minutes; interviews with FPE staff; applications submitted by clinics when they apply to FPE CAP and quarterly clinic updates; FPE CAP member staff (administrative and clinical) | Fieldnotes organised according to periodic reflections codebook; transcripts of audio recordings; completed applications and checklists from quarterly update calls; transcripts of audio recordings
| Dose | When and how often were the components of FPE implemented? Did the number of trainings vary among sites? How often and where did the media campaign run ads? | Programme management software (Trello) activities; training reports; media analytics report | End-of-month summary of implementation activities; details about when training occurred, training topics and location; detailed analytical data regarding media coverage
| Adaptation | How did the study team change the intervention to meet the context needs? Were the adaptations harmful, neutral or helpful to the success of the implementation? | FPE staff meeting minutes; interviews with FPE staff; group interviews with FPE CAP clinic staff; log frame | Fieldnotes coded to specific adaptation codes; transcripts of audio recordings; transcripts of audio recordings; monthly changes to logframe recorded as ‘tracked changes’
| Reach | How many people did our intervention reach? | Media analytics report; monthly data reports; training reports | Data on how many people interacted with our media campaign; clinic data indicating number of clients served; attendance records for each FPE training
Understand the mechanisms of impact | How participants react to, and interact with, FPE | How do FPE CAP clients feel about the contraceptive care they received? How do the FPE CAP clinic staff feel about participation in the programme? | Client exit surveys; individual interviews with FPE CAP members (clinical and administrative staff) | Transcripts of audio recordings; transcripts of audio recordings
| Mediators | What factors of the intervention either helped or hindered implementation? | Interviews with FPE staff; group interviews with FPE CAP clinic staff | Transcripts of audio recordings; transcripts of audio recordings
| Unexpected effects | | FPE staff meeting minutes; clinic monthly data reports; exit interviews with FPE CAP clinic staff | Fieldnotes organised according to periodic reflections codebook; monthly data input into spreadsheet; transcripts of audio recordings

FPE, Family Planning Elevated; FPE CAP, FPE Contraceptive Access Programme.

Table 1 details the types of data that will be collected for each of the process evaluation objectives. Table 2 describes in detail how we will analyse each of the types of data.
Table 2

Data analysis

Data sources | How we process the information
Qualitative data |
Notes from FPE team meetings | Real-time coding of the data using periodic reflections categories
Interviews with individual members of FPE staff | Code interview transcripts according to periodic reflections categories
Team Trello board calendar | Running log of team activities
Exit interviews with FPE CAP clinic staff | Inductively code transcripts to identify emerging themes
Process evaluation questions via email to FPE stakeholders and partners | Inductively code email responses to identify emerging themes regarding FPE context and partner perceptions
Quarterly update calls with FPE CAP clinics | Checklists of process indicators will be used for both process and outcome evaluations
Process evaluation interviews with FPE CAP clinic staff | Deductively code interview transcripts using the modified CFIR codebook (see description in mechanisms of impact section) to understand context, barriers and facilitators to implementation, and impact
Monthly process evaluation reports | Compiled by the process evaluator, these reports will summarise several data sources
Monthly revisions to log frame | The evaluation team will review the programme log frame to update any changes to process or outcome indicators being collected. Changes will be tracked to create a monthly snapshot of project changes.
Focus group interviews with RJ CAB | Deductively code interview transcripts to identify emerging themes regarding context, barriers and facilitators to implementation for specific populations
Sexual and reproductive health policy documents: drafted resolutions, white papers, policy briefs | Documents related to sexual and reproductive health policy in Utah will be indexed in a secure cloud content folder and time stamped to create an archive of evolving policy context
Quantitative data |
Client exit surveys | Surveys completed by clients at FPE CAP clinics will be used for both the process and outcome evaluations. Indicators regarding client satisfaction will be used to measure mechanisms of impact for the process evaluation.
Utah BRFSS survey data | Four questions were proposed and added to the statewide Behavioural Risk Factor Surveillance System survey, a weighted survey of health behaviours. The new questions ask individuals about contraceptive access, experiences of contraceptive counselling and how they pay for their contraception. Survey responses will be analysed to understand how contraceptive access changes during the course of the intervention.
Media analytics report | Analytical data from the media campaign (eg, cost-per-valued-view, view-through rates, positive earned media, platform engagement) will be collected to determine the number of individuals reached by the campaign and the effectiveness of the messaging.
Service delivery data | Clinics provide FPE with a monthly service delivery data report obtained through their electronic health record systems. These deidentified data include all contraceptive service codes and will allow FPE to assess level and trend changes in contraceptive service delivery over time.

BRFSS, Behavioural Risk Factor Surveillance System; CFIR, Consolidated Framework for Implementation Research; FPE, Family Planning Elevated; FPE CAP, FPE Contraceptive Access Programme; RJ CAB, Reproductive Justice Advisory Board.

Objective 1: context

Understand both the state and local context in which FPE operates, and how the context influences the intervention (as well as how the intervention influences the context).

FPE plans to support a wide variety of clinics, including those in both rural and urban areas, clinics that primarily serve undocumented individuals, and clinics run by county health departments. To ensure our process evaluation captures each community’s unique context, the process evaluator will attend regular meetings of Utah’s RJ CAB. The RJ CAB was formed in 2019 and its members are reproductive justice advocates and individuals from historically underrepresented communities. FPE selected the chairs and board members from a pool of interested applicants; existing RJ CAB chairs will select board members for subsequent years of participation. RJ CAB members will provide community-level context data and highlight the needs of historically underserved groups. The process evaluator will take fieldnotes during the regular RJ CAB meetings and conduct focus groups with the board members at project baseline, midline and end line.

To monitor the state-level sexual and reproductive health (SRH) context, our process evaluator will establish a close working relationship with FPE’s Research and Policy Advisor. The Research and Policy Advisor will meet frequently with Utah’s policy-makers and community advocates to consult and provide data for SRH policies that support reproductive autonomy, access to contraception, and education about the range of contraceptive methods. Additionally, all reports, white papers, internal analyses and notes from stakeholder meetings will be shared with the process evaluator. These policy documents will be stored in a secure cloud-based platform and indexed with the date they were created, to create a timeline of changes in the political landscape. The process evaluator will send a short list of questions to FPE stakeholders and partners identified by the FPE implementation staff. These questions will ask stakeholders and partners to identify barriers to SRH care in Utah and their communities and potential solutions, as well as their opinions on FPE’s role. The questions will be sent at baseline, midline and end line of the intervention, and responses will be indexed.

FPE will look to the National Family Planning and Reproductive Health Association (NFPRHA) and the Society of Family Planning (SFP) as guides to understanding the national context of SRH. NFPRHA works to improve the quality of family planning services throughout the country and supports administrators and providers to deliver quality care and advocate for improved healthcare policy. The SFP funds and disseminates high-quality research on best practices in contraceptive and abortion care.

The qualitative data for the context objective will include meeting notes, interview transcripts and email responses, which will be uploaded into qualitative analysis software and coded deductively according to predetermined process evaluation themes, such as contextual barriers and facilitators to implementing FPE, how FPE research influenced policies around SRH, how other programmes working towards similar goals affect and are affected by FPE, and trends in the political landscape surrounding SRH. The results of the qualitative analysis will be combined with the literature reviews, policy briefs and resolution documents to create a comprehensive picture of the SRH contexts.

Quantitatively, FPE will analyse findings from Utah’s Behavioral Risk Factor Surveillance System (BRFSS). In 2018, the Family Planning Division at the University of Utah submitted four family planning questions for inclusion in BRFSS. Those questions first appeared in the 2019 survey, and we hope to include them through 2024. Responses to the BRFSS questions will be analysed by FPE’s data analyst and included in the process evaluation report to understand the larger context in which FPE is implemented.
For additional information on FPE’s use of BRFSS data, and information about the outcome evaluation more broadly, see FPE’s outcome evaluation protocol (Rebecca G. Simmons et al. Evaluating a longitudinal cohort of clinics engaging in a contraceptive access initiative in Utah: The Family Planning Elevated Contraceptive Access study protocol, under review).
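Because BRFSS is a weighted survey, population-level estimates are computed from weighted rather than raw response counts. As a purely illustrative sketch (the record format, weight values and the yes/no item below are hypothetical, not FPE's actual BRFSS variables or analysis code), a weighted prevalence estimate works as follows:

```python
# Illustrative only: BRFSS-style records are (survey_weight, answered_yes) pairs.
# The weights and the question wording here are hypothetical, not FPE data.

def weighted_prevalence(records):
    """Estimate the population proportion answering 'yes' to a survey item.

    Each record is a (weight, response) pair, where weight is the respondent's
    final survey weight and response is True/False for the item of interest
    (e.g. 'were you able to access your preferred contraceptive method?').
    """
    total_weight = sum(w for w, _ in records)
    yes_weight = sum(w for w, answered_yes in records if answered_yes)
    return yes_weight / total_weight

# Hypothetical mini-sample: respondents with weights 2.0 and 1.0 said yes,
# one respondent with weight 1.0 said no.
sample = [(2.0, True), (1.0, True), (1.0, False)]
print(weighted_prevalence(sample))  # 0.75
```

Note that an unweighted calculation on the same sample would give 2/3; the survey weights are what make the estimate representative of the state population.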

Objective 2: implementation

Document how FPE is implemented, and explore how the actual implementation of FPE differs from the planned implementation.

The logic model (figure 2) outlines FPE’s primary activities. The activities are designed to address contraceptive access at the individual, community, clinic and policy levels. Because of the complex nature of this intervention, and the need to tailor decisions to the individual needs of each clinic and community, the actual implementation will inevitably differ from the planned implementation. The goal of the implementation objective of the process evaluation is to understand what elements of the intervention are delivered (and to whom and how often), where and why changes are made to the implementation plan, and how the plan is adapted to fit the changing needs of the project.

To capture the programme implementation as it unfolds, we will collect both qualitative and quantitative data. The qualitative data on the programme implementation will include minutes from weekly FPE staff meetings, transcripts of interviews with FPE staff, summaries of monthly activities on the FPE team calendar, monthly changes to the project’s log frame, and narrative summaries from training reports. Quantitative data on the programme implementation will include monthly data reports from FPE CAP clinics showing the number of contraceptive services provided, quantitative responses to the applications submitted by FPE CAP members, progress indicators from the FPE CAP quarterly calls, quantitative aspects of training reports and media analytics reports.

The process evaluator will analyse the qualitative implementation data on an ongoing basis. The fieldnotes from weekly team meetings will be coded during or immediately following the meetings, with any clarifying questions put to the team members. The meeting notes will be organised into the codes derived from Finley et al’s article on periodic reflections as a tool for process evaluations.44 Monthly interviews with a member of the FPE staff will also follow the template of these periodic reflections. The clustered codes from the fieldnotes and FPE staff interview for each month will be combined into a monthly process evaluation report, along with the summaries of the month’s implementation activities and the revised logframe for that month. Portions of the implementation process will also be coded and analysed using constructs from the Consolidated Framework for Implementation Research (CFIR).45 The CFIR constructs will be modified to fit FPE’s programme (see the mechanisms of impact section for detail on analysis using CFIR). The monthly reports will also incorporate the quantitative data collected on implementation.

Whereas the qualitative data best answer the questions regarding fidelity and adaptation, the quantitative data will illuminate the dose and reach of FPE’s activities. The question of dose will be answered with information from the FPE CAP quarterly calls (questions such as: how many of your staff have been trained to insert IUDs or implants? Have you recently been out of stock of any methods?), which will be compared with the responses on the initial FPE CAP applications to track changes in those process indicators over time. The media analytics reports will also help us track how often and where the media campaigns have run. To measure how many people our intervention reaches, we will track the number of individuals receiving services at FPE CAP clinics each month (from each clinic’s monthly data report), the number of views of and interactions with the media campaign each month (from the media analytics report), and the number of individuals who attended any FPE CAP trainings that month (from training reports).
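The monthly reach measure described above combines counts from three separate sources. A minimal sketch of that bookkeeping (the clinic counts, media totals and attendance figures below are invented for illustration; FPE's actual reports are far richer than simple monthly totals):

```python
# Illustrative only: the per-source counts below are hypothetical, not FPE data.
# Real inputs would come from clinic data reports, media analytics and
# training reports, as described in the protocol.

def monthly_reach(clinic_clients, media_interactions, training_attendees):
    """Summarise one month of programme reach from the three tracked sources."""
    return {
        "clinic_clients": sum(clinic_clients),         # per-clinic client counts
        "media_interactions": media_interactions,      # campaign views/interactions
        "training_attendees": sum(training_attendees), # per-training attendance
    }

march = monthly_reach(
    clinic_clients=[120, 85, 40],   # three clinics' monthly data reports
    media_interactions=15000,       # from the media analytics report
    training_attendees=[12, 9],     # two trainings held this month
)
print(march["clinic_clients"])  # 245
```

Keeping the three sources as separate fields, rather than a single combined total, preserves the distinction between dose delivered through clinics, media and trainings.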

Objective 3: mechanisms of impact

Identify the programme’s mechanisms of impact (how did clinicians and clients interact with FPE’s programming, were clients satisfied with care at FPE CAP clinics, what were the facilitators and barriers to implementation).

The goal of objective 3 is to understand what happens when FPE is implemented, and to learn how clinicians and clients respond to the programme. To answer this question, the process evaluation will focus on the perceptions of those who received the intervention (FPE CAP staff, clients served at FPE CAP clinics, individuals who interacted with the media campaign and policy-makers) rather than the perspectives of those who implemented the intervention (the focus of objective 2).

Quantitatively, we will collect responses from the client exit surveys to measure how clients who visit FPE CAP clinics feel about the care they receive. These exit surveys will be administered for month-long periods several times during the intervention and will be offered, by clinic staff, to all clients who receive contraceptive care at FPE CAP clinics. We will also compile data from each clinic’s monthly data reports, tracking the number of patients served at each clinic.

Qualitatively, we will collect transcripts from interviews conducted during implementation (individual interviews and focus groups) with FPE CAP clinic staff, and will conduct exit interviews with FPE CAP members at the end of their clinic’s enrolment in the programme. All FPE CAP clinical and administrative staff will be invited to attend focus group interviews, including care providers (physicians, nurse practitioners and physician assistants), nurses, medical assistants, on-site pharmacists, clinic administration and management, and front desk staff. While all will be invited, attendance is not mandatory, introducing the possibility of self-selection bias. Our documentation of the focus groups will therefore record both who was present and who was absent from each group, to aid us in assessing the degree of potential bias. Additionally, we will use the same interviewer (JB) throughout the study to build trust with our clinics and thus attempt to reduce social desirability bias throughout our interviews. The extent and impact of these biases will be assessed and addressed in our process evaluation outcome paper.

The transcripts from baseline and year 1 interviews with each clinic, as well as interviews with FPE staff members, will be coded using modified CFIR constructs.45 See online supplemental appendix A for a list of the CFIR constructs revised for FPE’s process evaluation and how those constructs align with the process evaluation. Once coded, the constructs from each clinic will be analysed using the process detailed on the CFIR website (https://cfirguide.org/evaluation-design/qualitative-data/), allowing us to view barriers and facilitators to implementing FPE across each implementation site.

We will analyse the quantitative data to determine trends in the number of contraceptive visits at FPE CAP clinics, as well as the experiences of contraceptive clients. By comparing these findings with the patterns regarding implementation, we will be able to identify which components of the intervention led to the greatest change and which did not create the expected impact. Additionally, by linking what was implemented with how participants interacted and reacted, we can understand the barriers and facilitators to implementation and answer questions such as: why did clinic A observe an increase in patients after FPE was implemented, while the number at clinic B decreased?
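One simple way to operationalise the clinic A versus clinic B comparison above is to fit a least-squares slope to each clinic's monthly visit counts before and after enrolment and compare the two trends. The sketch below is illustrative only: the visit counts are invented, and the outcome evaluation's actual models of level and trend change will be more sophisticated than a two-segment slope comparison.

```python
# Illustrative only: monthly contraceptive-visit counts here are hypothetical.

def ols_slope(ys):
    """Least-squares slope of a series against its 0-based time index."""
    n = len(ys)
    mean_x = (n - 1) / 2                      # mean of 0, 1, ..., n-1
    mean_y = sum(ys) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(ys))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den

def trend_change(pre, post):
    """Change in monthly visit trend after a clinic enrols in the programme."""
    return ols_slope(post) - ols_slope(pre)

# Hypothetical clinic A: flat before enrolment, growing ~5 visits/month after.
clinic_a = trend_change(pre=[100, 100, 100, 100], post=[105, 110, 115, 120])
# Hypothetical clinic B: flat before enrolment, declining ~2 visits/month after.
clinic_b = trend_change(pre=[80, 80, 80, 80], post=[78, 76, 74, 72])
print(clinic_a, clinic_b)  # 5.0 -2.0
```

A positive trend change for clinic A alongside a negative one for clinic B is exactly the pattern that would prompt the qualitative follow-up described above, linking the quantitative divergence to barriers and facilitators identified in the CFIR-coded interviews.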

Ethics and dissemination

The study received approval from the University of Utah’s Institutional Review Board. The service delivery data FPE collects from FPE CAP sites are deidentified before FPE receives them, so consent is not required, and this component of the study was deemed exempt by the Institutional Review Board. Before every process evaluation interview, the researcher will explain the purpose of the process evaluation, how data from the interview will be used, and the protections for interviewees’ privacy. Interviewees will be assured of their right to refuse to participate in the interview or to decline to answer any question. Those who do not wish to be recorded will be able to participate without being recorded.

Findings from the process evaluation and outcome evaluation will be published in peer-reviewed journals, shared with other contraceptive initiatives and presented at conferences.

The project received product donations from several pharmaceutical companies. These donations were not tied to research or other development aspects of the intervention, and corporate funders did not have any input into project or evaluation design. As the project provides no-cost access to all reversible methods, donated products represent significant financial support for the programme but do not translate into downstream influence over clinics or clients.

Patient and public involvement

Family planning care is an area of healthcare that is particularly personal and highly individualised for each patient. The decision to conduct a process evaluation, and the methods we selected for that evaluation, were largely motivated by a desire to understand the experience of patients receiving care in FPE CAP clinics. While patients were not involved in the design of the process evaluation or in recruitment, we will involve patients as we collect and disseminate our results. Specifically, we will share results of the process evaluation with members of Utah’s RJ CAB. The board members will provide feedback on the results, help contextualise them from the viewpoint of some of Utah’s marginalised communities and help direct dissemination activities.

Discussion

To our knowledge, this is the first published protocol describing the process evaluation of a comprehensive contraceptive initiative. If FPE succeeds in achieving its goals of expanding contraceptive access in Utah and informing state-level policy, the findings from the process evaluation will act as a ‘how-to’ guide for other states looking to implement similar programmes. This map of programme implementation and impact will allow FPE evaluators to share important lessons on which elements of FPE were most successful, which assumptions proved faulty, and how implementation and impact varied across clinics. Finally, the process evaluation will document the consensus-building process FPE staff use to revise the programme logic models over the course of the project. By documenting the logic model at the beginning, middle and end of the project, we will have a clear record of how the project evolved over time.

Strengths

FPE has a well-resourced evaluation team, with an evaluation director, a data analyst and a process evaluator. The evaluation team works closely with the implementation team and FPE Directors, which means it has full access to the day-to-day information and decisions that are important to document for the process evaluation. The integrated team, and the decision to create feedback loops from the process evaluation to the programme implementers, mean that learning from the process evaluation will be used to inform implementation, thereby improving the programme’s effectiveness over time. Additionally, because FPE is being implemented in clinics that represent a wide variety of organisational styles, client populations, sizes and locations, the process evaluation will be able to track and measure responses in a range of settings. Finally, FPE’s established relationships with policy-makers and key political figures in Utah mean our process evaluation will include a well-informed understanding of the state and local context of our implementation.

Limitations

While the integration of the evaluation team with the implementation team may lead to improved programme implementation, it may also bias the findings of the process evaluation as the lines between implementation and evaluation become blurred. According to the Medical Research Council’s recommendations,33 there are significant pros and cons to integrating the evaluation and implementation teams. The guidelines advise that the evaluation and implementation teams decide on their desired level of integration ahead of the programme’s start date and clearly describe the relationship between the two teams when presenting the results of the study.

Another limitation is that many of FPE’s programme elements and design are not fully formulated at the programme outset. These as-yet-unformulated elements include how and when FPE’s media campaign will be delivered, the type and frequency of training each clinic will receive, and the specific selection criteria for each cohort of clinics. Because these elements are still being developed, it may be difficult for the process evaluation to measure deviation from the original plan, since some of the plan is still being written during implementation. This complication is unavoidable, however, because FPE must be flexible enough to respond to changes in the sexual and reproductive landscape that occur during implementation. While the evolving nature of the intervention adds complexity to its evaluation, it also makes the process evaluation particularly important for distinguishing why changes to the intervention occur and whether those changes helped or hindered the programme’s success.