Alicia M. Grubb, Steve M. Easterbrook.
Abstract
This study set out to explore the views and motivations of those involved in a number of recent and current advocacy efforts (such as open science, computational provenance, and reproducible research) aimed at making science and scientific artifacts accessible to a wider audience. Using an exploratory approach, the study tested whether a consensus exists among advocates of these initiatives about the key concepts, exploring the meanings that scientists attach to the various mechanisms for sharing their work, and the social context in which this takes place. The study used a purposive sampling strategy to target scientists who have been active participants in these advocacy efforts, and an open-ended questionnaire to collect detailed opinions on the topics of reproducibility, credibility, scooping, data sharing, results sharing, and the effectiveness of the peer review process. We found evidence of a lack of agreement on the meaning of key terminology, and a lack of consensus on some of the broader goals of these advocacy efforts. These results can be explained through a closer examination of the divergent goals and approaches adopted by different advocacy efforts. We suggest that the scientific community could benefit from a broader discussion of what it means to make scientific research more accessible and how this might best be achieved.
Year: 2011 PMID: 21858110 PMCID: PMC3157385 DOI: 10.1371/journal.pone.0023420
Source DB: PubMed Journal: PLoS One ISSN: 1932-6203 Impact factor: 3.240
Respondents by Research Field.
| Research Field | Count |
| Applied Science | 0 |
| Astronomy | 0 |
| Biology | 7 |
| Chemistry | 3 |
| Earth Science | 0 |
| Environmental Science | 0 |
| Life Science | 4 |
| Medicine | 1 |
| Physics | 2 |
| Psychology | 1 |
| Other | 1 |
The complete list of respondents' representative research fields.
Figure 1. Plot for Q-3.
Major emergent axes (knowledge and support) from Q-3 “How do you think the general public (non-scientists) views your particular field's research efforts?”.
Q-4 Axes.
| Axis | Positive | Negative |
| | awe, useful | caution, uninformed passion, unaware, boring, abstract, social misfits |
| | (none) | unaware, uninformed, can't understand details |
| | positive, useful, and important | bad, dangerous, suspicious motives, mistrust |
| | medicine, cure, space, human impact | boring research, fruit flies |
Major emergent axes (knowledge, support, subject variability and engagement) from Q-4 “How do you think the general public (non-scientists) views the efforts of the scientific community as a whole?”.
Percentage of Experiments Conducted That Are Published.
| Percentages | Number of Respondents |
| 0–19 | 6 |
| 20–39 | 3 |
| 40–59 | 2 |
| 60–79 | 1 |
| 80–100 | 2 |
Distribution of responses to question Q-6 “To your best approximation, what percentage of the experiments you conduct end up in published work?”.
Percentage of Experiments Replicated and Should be Replicated.
| Percentages | % of expts ARE replicated (respondents) | % of expts SHOULD BE replicated (respondents) |
| 0–19 | 9 | 4 |
| 20–39 | 3 | 1 |
| 40–59 | 2 | 1 |
| 60–79 | 0 | 0 |
| 80–100 | 0 | 2 |
Distribution of responses to question Q-7 “To your best approximation, what percentage of the experiments conducted in your field are replicated?” and Q-8 “In your opinion, what percentage of the experiments conducted in your field should be replicated?”.
Error Rates in Peer Reviewed Journals (Number of Responses).
| Percentage | (A) Incorrect Citations | (B) Incorrect Figures | (C) Incorrect Facts | (D) Flawed Results |
| 0 | 0 | 0 | 0 | 0 |
| 1–4 | 3 | 3 | 3 | 2 |
| 5–9 | 5 | 8 | 5 | 5 |
| 10–19 | 2 | 1 | 1 | 1 |
| 20–39 | 2 | 0 | 4 | 1 |
| 40–59 | 1 | 0 | 0 | 3 |
| 60–79 | 1 | 2 | 0 | 0 |
| 80–100 | 0 | 0 | 1 | 2 |
Distribution of responses to question Q-9 “To your best approximation, what percentage of papers published in a typical peer reviewed journal in your field (in the past year) contains: (A) incorrect citations (B) incorrect figures (C) incorrect facts (D) flawed results.”
Benefits of Peer Review.
| Paper Quality | Journal Standard |
| find errors, improve language elements, constructive criticism, second eyes, identify related work, fact checking, comments, and suggestions | weed out unimportant work, minimize bad research, maintain scientific merit, reduce frequency of publication, and quality filter |
Major emergent axes (paper quality and journal standard) as benefits of the peer review process as discussed by the respondents (Q-11).
Time frame for Data and Results Availability.
| Time | Count |
| As Soon As Possible | 4 |
| After Review | 1 |
| After Publication | 8 |
| Within Reason | 1 |
Distribution of responses to question Q-17 “What point in time should your data and results be available?”.