| Literature DB >> 22839967 |
Martin P Eccles, Robbie Foy, Anne Sales, Michel Wensing, Brian Mittman.
Abstract
Implementation Science has been published for six years, and over that time has gone from receiving 100 articles in 2006 to receiving 354 in 2011; our impact factor has risen from 2.49 in June 2010 to 3.10 in June 2012. Whilst our article publication rate has also risen, it has risen much more slowly than our submission rate: we published 29 papers in 2006 and 134 papers in 2011, and we now publish only around 40% of submissions. About one-half of submitted manuscripts are rejected without being sent out for peer review; it has become clear that a number of common issues result in manuscripts being rejected at this stage. We hope that by publishing this editorial on our common reasons for rejection without peer review we can help authors to better judge the relevance of their papers to Implementation Science.
Year: 2012 PMID: 22839967 PMCID: PMC3443070 DOI: 10.1186/1748-5908-7-71
Source DB: PubMed Journal: Implement Sci ISSN: 1748-5908 Impact factor: 7.327
Summary of issues that influence the likelihood of rejection without review of articles submitted to Implementation Science
| Issue | Less likely to be rejected without review | More likely to be rejected without review |
|---|---|---|
| Field of interest | Healthcare and public health | Anything else |
| Effectiveness studies | Evaluating the introduction of an intervention/evidence-based practice of known effectiveness | Evaluating the effectiveness of a clinical, organizational, public health, or policy intervention |
| Outcome | Health or health-related | Anything else |
| Implementation | Researching implementation | Doing implementation |
| Validity | Maximizes internal and external validity as appropriate to the chosen study design | |
| Patient decision aids | Evaluations of the introduction of patient decision aids (of known effectiveness) into healthcare settings; involvement of healthcare providers | Initial development or pilot testing of patient decision aids |
| Implementation (knowledge translation) direct to patients | Outcomes referring to evidence-based practice with some involvement of healthcare providers | Other types of outcomes |
| Intervention development reports | Prepared and submitted prior to the reporting of the effectiveness of the intervention | |
| | Going to be (robustly) evaluated | Not going to be (robustly) evaluated |
| Process evaluations | Submitted contemporaneously with, or following, the report of intervention effectiveness | Submitted in advance of the conduct of the main effectiveness analysis (it cannot be clear whether they are explaining an effect or the absence of an effect) |
| | Take account of the main evaluation outcomes | Do not take account of the main evaluation outcomes |
| Pilot studies | Appropriate criteria for conduct | No justification for conduct |
| | Appropriate degree of inference | Overclaiming on the basis of results |
| | Plans for further evaluation | |
| Protocols | Have been through (inter)national-level peer review as part of their funding | Have not been through national-level peer review as part of their funding |
| | Have received ethics review board approval | Have not received ethics review board approval |
| | Submitted prior to data cleaning or analysis | Have begun data cleaning or analysis (may not apply to some qualitative studies) |