
Enhancing the reporting of implementation research.

Paul M Wilson, Anne Sales, Michel Wensing, Gregory A Aarons, Signe Flottorp, Liz Glidewell, Alison Hutchinson, Justin Presseau, Anne Rogers, Nick Sevdalis, Janet Squires, Sharon Straus.

Abstract

In the 10 years since the inception of Implementation Science, we have witnessed a continued rise in the number of submissions received, reflecting the continued global interest in methods to enhance the uptake of research findings into healthcare practice and policy. We receive over 750 submissions annually, and there is now a large gap between what is submitted and what gets published. In this editorial, we restate the journal scope and current boundaries. We also identify some specific reporting issues that, if addressed, will help enhance the quality and transparency of the scientific reporting in the manuscripts we receive. We hope that this editorial acts as a further guide to researchers seeking to publish their work in Implementation Science.

Year: 2017 | PMID: 28178987 | PMCID: PMC5299701 | DOI: 10.1186/s13012-017-0546-3

Source DB: PubMed | Journal: Implement Sci | ISSN: 1748-5908 | Impact factor: 7.327


Background

In the 10 years since the inception of Implementation Science, we have witnessed a continued rise in the number of manuscripts submitted. We now receive over 750 submissions annually (see Fig. 1), reflecting the continued interest from researchers, funders and health professionals and policy makers in promoting the uptake of research findings into healthcare practice and policy. The number of manuscripts published in Implementation Science remains steady at around 150 per year.
Fig. 1

Manuscripts submitted to and accepted for publication in Implementation Science

The large gap between what is submitted and what gets published is driven by two key issues, namely scope and scientific quality. This editorial aims to address both of these issues and to act as a further guide to researchers seeking to publish their work in Implementation Science.

Scope and boundaries

In 2015, we reviewed and provided a detailed explanation and elaboration of our journal scope [1]. We have no plans to expand those boundaries further at this point. Our focus therefore remains on the publication of studies examining the implementation of evidence-based healthcare interventions, practices or policies, or the de-implementation of those demonstrated to be of low or no clinical benefit, or even harmful. For implementation effectiveness, we seek to publish studies that employ rigorous experimental or quasi-experimental designs, regardless of whether they report effects or no effects. By rigorous, we mean those designs that would be eligible for inclusion in Cochrane EPOC reviews [2]. This can include type 2 or type 3 hybrid designs, where there is a dual a priori focus on assessing clinical effectiveness and implementation strategies [3], but only where there is a clear justification and a major element of implementation research. Type 2 designs have a dual focus on effectiveness and implementation outcomes; one example tested both the effectiveness of brief cognitive behavioural therapy and the strategies used to implement it [4]. In a type 3 design, the primary emphasis is on evaluating implementation, in this instance of a diabetes prevention programme, but data on clinical outcomes are also collected [5]. We continue to receive a considerable number of studies testing novel clinical or population health interventions where the effectiveness of the intervention or practice has yet to be established. As our scope focuses on the implementation of interventions of demonstrated effectiveness, we routinely reject these manuscripts (and offer transfer to other BMC journals). These exclusion criteria also extend to type 1 hybrid designs, where the focus is on testing the effects of a clinical intervention on relevant outcomes whilst observing and gathering information on implementation [3].
For instance, a clinical trial of primary care management of survivors of sepsis focused on patients’ quality of life as the primary outcome but also comprised a range of measures of implementation aspects [6]. Studies of this type fall outside our journal scope.

Alongside effectiveness, the journal scope also includes economic evaluation and qualitative research that examines the different aspects of interventions and context that contribute to effectiveness. This includes the study of adaptation and fidelity, mechanisms of impact, contextual influences on implementation and outcomes, and sustainability and scalability, as well as the study of influences on provider, patient and organisational behaviour. Crucially, we expect the methods employed in such studies to be an appropriate fit for the question(s) being addressed and to be informed by relevant conceptual frameworks. We also welcome articles that present new methods, and articles that question or challenge existing implementation policies, practices, evidence or theory and suggest modifications or alternatives. However, there is no shortage of frameworks and theories relevant to implementation research [7, 8]. So rather than adding to the current pot, our preference is for empirical studies that build on and advance the existing theoretical base. With debate papers, we reject those that fail to ground their central argument within the existing implementation research literature; many would be of greater relevance if the arguments posed were based upon systematic reviews of the relevant evidence. Table 1 presents the types of manuscripts likely to be accepted by or rejected from Implementation Science. This should assist prospective authors in judging whether the journal is a suitable home for their research.
Table 1

Factors promoting the likelihood of acceptance or rejection from Implementation Science by manuscript type

For each manuscript type: factors promoting the likelihood of acceptance; factors promoting the likelihood of rejection; preferred reporting methods (where specified).

Debate
Acceptance: Papers which question or challenge existing implementation policies, practices, evidence or theory and suggest modifications or alternatives.
Rejection: Papers which fail to contextualise their argument in, or demonstrate how they build upon, the existing implementation research literature.
Preferred reporting: N/A

Effectiveness
Acceptance: Studies that fit our journal scope and employ rigorous experimental or quasi-experimental designs (i.e. designs eligible for inclusion in Cochrane EPOC reviews), and that evaluate the implementation of an evidence-based practice or policy, or the de-implementation of those demonstrated to be of low or no clinical benefit.
Rejection: Studies which lack a rigorous study design, such as quality improvement reports, service evaluations or uncontrolled before-after studies; studies evaluating the effectiveness of novel clinical, organisational, public health or policy interventions.
Preferred reporting: CONSORT for trials

Economic evaluation
Acceptance: Any cost-effectiveness analysis that compares the costs and outcomes of two or more implementation strategies; cost and cost-consequence analyses where disaggregated costs and outcomes are presented.
Preferred reporting: CHEERS

Intervention development reports
Acceptance: Prepared and submitted prior to the reporting of the effectiveness of the intervention; plans for (robust) evaluation are made explicit; empirical and/or theoretical rationale is provided.
Rejection: Post hoc submission (submitted after the reporting of the effectiveness of the intervention); no plans for (robust) evaluation.

Methodology
Acceptance: Articles that present methods which may either be completely new or offer an improvement to an existing method; articles reporting empirical comparisons of one or more methodological approaches, or which clearly state what they add to the existing literature.
Rejection: Descriptive accounts of largely established methods without any associated novel methodological insights.
Preferred reporting: N/A

Pilot and feasibility studies
Acceptance: Studies that fit our journal scope and are conducted with the explicit purpose of assessing feasibility and planning for an intervention that is expected to contribute to existing knowledge; studies indicating how a subsequent study will draw from the pilot study; clear plans for further evaluation, or clear reasons for their absence.
Rejection: No justification for conduct; overclaiming on the basis of results.

Process evaluation
Acceptance: Studies that fit our journal scope, are submitted contemporaneously with or following reports of intervention effectiveness, and take account of the main evaluation outcomes; studies evaluating fidelity of implementation, mechanisms of impact and/or contextual influences on implementation and outcomes.
Rejection: Process evaluations submitted in advance of the conduct of the main effectiveness analysis (it cannot be clear whether they are explaining an effect or the absence of an effect); process evaluations that do not take account of the main evaluation outcomes.

Protocols
Acceptance: Protocols that fit our journal scope and inclusion criteria for rigorous study designs; that have been through a competitive peer review process to receive funding from a nationally or internationally recognised research agency; that have received appropriate ethics review board approval; and that have been submitted within one of three possible time points: (1) within 3 months of ethics approval, (2) prior to enrolment of the first participant/cluster, or (3) before the end of participant/cluster recruitment (i.e. prior to the commencement of data cleaning or analysis).
Rejection: Protocols that have not been the subject of peer review by a national or international research agency; protocols that have not received ethics review board approval; protocols for quality improvement or service evaluations, which lack a rigorous study design; protocols for pilot or feasibility studies; protocols for systematic reviews and other types of synthesis (we usually refer these to the BMC journal Systematic Reviews); protocols submitted for studies where data cleaning and analysis have begun.
Preferred reporting: As SPIRIT was developed for clinical trials, we prefer authors to complete, as far as they can, the CONSORT checklist or appropriate extension.

Qualitative studies
Acceptance: Studies that fit the journal scope and meet applicable criteria for quality and validity.
Rejection: Studies where there are doubts whether planned data saturation has been achieved; single-site case studies with limited typicality; studies that fail to link to relevant theory, or lack contextualisation and reference to previous relevant qualitative studies or reviews.

Short reports
Acceptance: Brief reports of data from original research which present relatively modest advances in knowledge or methods.
Rejection: Reports of meetings, ‘doing implementation’ or ‘lessons learned’.
Preferred reporting: N/A

Systematic reviews and other syntheses
Acceptance: Systematic reviews and other types of synthesis (such as rapid, realist or scoping) that fit our journal scope and which may cover issues such as the effects of implementation interventions and/or influences on the uptake of evidence.
Rejection: Non-systematic or narrative literature reviews that fail to use explicit methods to identify, select and critically appraise relevant research; reviews and syntheses that fail to adhere to recognised quality and reporting standards.
Preferred reporting: PRISMA; RAMESES for realist reviews

Enhancing reporting

Alongside failure to meet scope requirements, poor scientific quality remains a common reason for rejection. Promoting the development, refinement and quality of implementation research was a key aim of the founding editors [9] and remains so today. We therefore continue to support and promote efforts to improve research quality and transparency.

Prospective trial registration

Implementation Science supports initiatives to improve the reporting of randomised trials. We have adopted the ICMJE recommendation [10] and will normally consider for publication only trials that have been registered with an appropriate publicly available trials database prior to enrolment of the first participant/cluster. We will consider retrospectively registered trials on a case-by-case basis but will require authors to explain the reason(s) for the delayed registration. Whilst there are no fixed rules about the registration of other study designs, we strongly encourage authors of systematic reviews to prospectively register their review with PROSPERO or other publicly accessible registries.

Enhancing research reporting

Over the last decade, we have routinely required authors of manuscripts reporting trials to complete the CONSORT checklist or relevant extension. Similarly, authors submitting systematic reviews have been required to complete the PRISMA checklist. No other checklists have been routinely or uniformly enforced. As a journal that receives manuscripts covering a wide range of study designs, this has resulted in variation in the standards of reporting of the research that we publish. Because our aim is to promote research quality and transparency, and as an aid to our readers, reviewers and editors, we now require authors, regardless of study design, to complete and include a design-appropriate reporting checklist with their submission. The website of the EQUATOR Network provides details of all available reporting guidelines (www.equator-network.org). Table 1 includes details of our preferred reporting formats; for research types where consensus on a reporting format is lacking (for example, qualitative research), we encourage authors to select their preferred checklist.

Improving the quality of intervention description is as much an issue for implementation research as it is for other evaluations of complex interventions. Without sufficient detail, it is difficult for readers to determine what was actually implemented and/or for other researchers to use or replicate the intervention in other studies. Whilst TIDieR has been proposed for use in conjunction with the CONSORT guidelines for trials [11], improved intervention description is relevant across all evaluative study designs.
Other relevant standards have been developed and are available for reporting implementation interventions (Standards for Reporting Implementation Studies, StaRI) [12] and behaviour change interventions (Workgroup for Intervention Development and Evaluation Research, WIDER) [13]. We encourage authors to select their preferred guideline to enhance the reporting of interventions. With all submissions, we expect authors to clearly articulate what is already known and what their work adds to existing knowledge, theory and thinking in the field. Many submissions currently fail to set the work in the context of the existing literature, and so we will continue to reject manuscripts that do not clearly build on current knowledge and understanding, or that appear to provide limited contributions.

Open Science

As an open access journal (with open peer review), we are committed to making research, and the datasets upon which it is based, publicly accessible. A number of different data sharing approaches have now been adopted across the health and medical literature [14]. At Implementation Science, we have adopted the policies on data availability of our publisher, BMC. As part of online article submission, we now ask authors to include an “Availability of Data and Materials” section in their manuscript detailing the conditions under which the data supporting their findings can be accessed. Authors who do not wish to share their data must include a formal statement that data will not be shared and give the reason why. Full details of BMC policies can be found under the Instructions for Authors section of our website.

Conclusion

In this editorial, we have identified some specific reporting issues that, if addressed, will help enhance the quality and transparency of the scientific reporting in the manuscripts we receive. We also encourage prospective authors to familiarise themselves with the journal scope and boundaries before making a submission. We look forward to the next 10 years as the field continues to grow and evolve, and to receiving research that continues to enhance the uptake of evidence-based practices and policies to improve the quality and delivery of healthcare.
References (11 in total)

1. Curran GM, Bauer M, Mittman B, Pyne JM, Stetler C. Effectiveness-implementation hybrid designs: combining elements of clinical effectiveness and implementation research to enhance public health impact. Med Care. 2012.

2. Tabak RG, Khoong EC, Chambers DA, Brownson RC. Bridging research and practice: models for dissemination and implementation research. Am J Prev Med. 2012.

3. Schmidt K, Worrack S, Von Korff M, Davydow D, Brunkhorst F, Ehlert U, Pausch C, Mehlhorn J, Schneider N, Scherag A, Freytag A, Reinhart K, Wensing M, Gensichen J. Effect of a primary care management intervention on mental health-related quality of life among survivors of sepsis: a randomized clinical trial. JAMA. 2016.

4. Hoffmann TC, Glasziou PP, Boutron I, Milne R, Perera R, Moher D, Altman DG, Barbour V, Macdonald H, Johnston M, Lamb SE, Dixon-Woods M, McCulloch P, Wyatt JC, Chan AW, Michie S. Better reporting of interventions: template for intervention description and replication (TIDieR) checklist and guide. BMJ. 2014.

5. Cully JA, Armento MEA, Mott J, Nadorff MR, Naik AD, Stanley MA, Sorocco KH, Kunik ME, Petersen NJ, Kauth MR. Brief cognitive behavioral therapy in primary care: a hybrid type 2 patient-randomized effectiveness-implementation design. Implement Sci. 2012.

6. Pinnock H, Epiphaniou E, Sheikh A, Griffiths C, Eldridge S, Craig P, Taylor SJC. Developing standards for reporting implementation studies of complex interventions (StaRI): a systematic review and e-Delphi. Implement Sci. 2015.

7. Foy R, Sales A, Wensing M, Aarons GA, Flottorp S, Kent B, Michie S, O'Connor D, Rogers A, Sevdalis N, Straus S, Wilson P. Implementation science: a reappraisal of our journal mission and scope. Implement Sci. 2015.

8. Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci. 2015.

9. Albrecht L, Archibald M, Arseneau D, Scott SD. Development of a checklist to assess the quality of reporting of knowledge translation interventions using the Workgroup for Intervention Development and Evaluation Research (WIDER) recommendations. Implement Sci. 2013.

10. Barbui C. Sharing all types of clinical data and harmonizing journal standards. BMC Med. 2016.
