Ben Van Calster, Laure Wynants, Richard D Riley, Maarten van Smeden, Gary S Collins.
Abstract
Covid-19 research made it painfully clear that the scandal of poor medical research, as denounced by Altman in 1994, persists today. The overall quality of medical research remains poor, despite longstanding criticisms. The problems are well known, but the research community fails to properly address them. We suggest that most problems stem from an underlying paradox: although methodology is undeniably the backbone of high-quality and responsible research, science consistently undervalues methodology. The focus remains more on the destination (research claims and metrics) than on the journey. Notwithstanding, research should serve society more than the reputation of those involved. While we notice that many initiatives are being established to improve components of the research cycle, these initiatives are too disjointed. The overall system is monolithic and slow to adapt. We assert that top-down action is needed from journals, universities, funders and governments to break the cycle and put methodology first. These actions should involve the widespread adoption of registered reports, balanced research funding between innovative, incremental and methodological research projects, full recognition and demystification of peer review, improved methodological review of reports, adherence to reporting guidelines, and investment in methodological education and research. Currently, the scientific enterprise is doing a major disservice to patients and society.
Keywords: Methodology; Reporting; Research quality
Year: 2021 PMID: 34077797 PMCID: PMC8795888 DOI: 10.1016/j.jclinepi.2021.05.018
Source DB: PubMed Journal: J Clin Epidemiol ISSN: 0895-4356 Impact factor: 6.437
Issues resulting from the current organization of science that lead to research waste
| Problem | Description |
|---|---|
| Research incentives focus on quantity, rather than methodological quality | Scientists are rewarded for rapidly churning out publications that are often poorly designed or use poor-quality data |
| Funders and journals prioritize novelty over incremental and replication research | Funding calls often focus on innovative (though high-risk) ideas, sometimes requiring a guarantee that the project will succeed. Such a guarantee may come from (often unfunded) preliminary results. Such requirements encourage researchers to run before they can walk. Often, funders and journals do not prioritize incremental and replication research due to a perceived lack of novelty. Yet incremental and replication research is essential to confirm, expand, or refute reported breakthroughs |
| Researchers’ agendas are dictated by short-term deadlines | Researchers are confronted with numerous deadlines related to grant proposals, conference submissions, training requirements, and doctoral dissertations. For all of these deadlines, it is commonplace to present some study findings. To fulfill this demand, methodological quality is often compromised. Examples include premature end of patient recruitment, unplanned interim analyses, use of poorly cleaned data, and small and poorly conceived studies. Such shortcuts lead to the dissemination of misleading or premature results |
| Peer review remains unacknowledged | Peer review is one of the only stages in the scientific process where the quality of research plans and findings can be evaluated in detail, yet this work receives little formal recognition or reward |
| Methodological illiteracy is still accepted | It is a persisting problem that many researchers know too little about methodology and many studies are conducted with no or little involvement of adequately trained methodologists/statisticians for the research at hand |
| Transparent and complete reporting remains rare | While such reporting is vital for understanding and reproducibility, systematic reviews repeatedly indicate that reporting remains incomplete |
Practices resulting from prioritizing publication appearance over publication quality
| Practice | Description |
|---|---|
| Poor study preparation and design | Many studies are poorly designed and ill-prepared, with an insufficiently detailed or inaccessible research protocol (if one exists at all) |
| Data or analysis tweaking (e.g. p-hacking) | Many publications contain results that are not fully honest, obtained by tweaking the data or analysis procedures, or even by fabricating data |
| Incomplete reporting | Key information needed to understand how a study was carried out and what was found is often simply not mentioned in publications |
| Selective reporting | Many publications suffer from selective reporting by focusing on the most interesting or surprising results |
| Spin | The interpretation and conclusions of study results are often too strong even after peer review, a phenomenon called ‘spin’ |
| Publication bias | Manuscripts that report studies with less appealing or ‘negative’ results are historically less likely to be submitted for publication and accepted by journals than other manuscripts. This is the well-known and long-standing problem of publication bias |
| HARKing (hypothesizing after the results are known) | HARKing means that parts of a publication (such as the introduction and the hypothesis) are written to accommodate the final results |
| Salami-slicing | The data resulting from a study are often presented in multiple publications that are highly similar. The study results are split into ‘minimal publishable units’ beyond what is reasonable. For example, researchers may write several papers by simply changing the outcomes or variables of interest for each paper. |
| Reluctance to take corrective action post hoc | Published papers frequently contain errors, yet journals are not always eager to take corrective action when errors are highlighted |
Examples of initiatives to improve the methodology and reproducibility of research
| Topic / initiative | Description |
|---|---|
| Establishment of reproducibility networks and research centers | Reproducibility networks/centers aim to improve the robustness of scientific research by investigating how research can be improved, and by sharing best practices through training and workshops. Importantly, these networks aim to collaborate with stakeholders (funders, publishers, academic organizations) in order to broadly improve research practices |
| Lancet series on research waste in 2014 | 17 recommendations for researchers, academic institutions, scientific journals, funding agencies and science regulators were provided |
| Hong Kong principles for research assessment | The Hong Kong principles focus on responsible research practices, transparent reporting, open science, valuing a diversity of research, and recognizing all contributions to research and scholarly activity |
| EQUATOR network (Enhancing the QUAlity and Transparency Of health Research) | The EQUATOR Network promotes transparent and complete reporting of health research by developing and disseminating reporting guidelines |
| STRATOS (STRengthening Analytical Thinking for Observational Studies) | The STRATOS initiative unites methodological experts to prepare guidance documents regarding the design and analysis of observational studies |
| Center for Open Science (COS) | COS is a center whose mission is to ‘increase openness, integrity, and reproducibility’ of research (cos.io) |
| Study registries | Study registries make study information publicly available at the start of the study, to improve transparency and completeness and to allow comparison with resulting publications (e.g., clinicaltrials.gov, crd.york.ac.uk/prospero). Registration is widely established for interventional studies and is slowly gaining attention for observational studies. Recently, initiatives for the registration of animal studies have been launched |
| Registered reports | COS has introduced the registered reports system, in which study protocols are peer reviewed and accepted in principle before the results are known |
| Transparency and Openness Promotion (TOP) Committee | TOP, also under the umbrella of COS, provides guidelines to support journals’ policies for the publication of papers |
| Findability, Accessibility, Interoperability, and Reusability (FAIR) principles | FAIR provides guiding principles for data sharing, which is important for transparency and utility of research projects |
| Methodological/statistical reviewing | Several medical journals recognize the importance of methodological review (e.g., statisticians, information specialists/librarians), although the implementation varies widely. Some journals decide on an ad hoc basis when statistical input is required, although this decision may itself require statistical input. Some journals include statisticians on the editorial board, whilst some journals hire a team of statisticians and methodologists. |
| Reviewer recognition (e.g. Publons) | Initiatives such as Publons allow reviewers to track, verify, and receive recognition for their peer review activity |
| Replication grants | The Dutch Research Council has established grants that specifically fund replication studies |
All mentioned URLs were accessed on May 23rd 2021.