The case for openness in engineering research.

Devin R. Berg, Kyle E. Niemeyer

Abstract

In this article, we describe our views on the benefits, and possible downsides, of openness in engineering research. We attempt to examine the issue from multiple perspectives, including reasons and motivations for introducing open practices into an engineering researcher's workflow and the challenges faced by scholars looking to do so. Further, we present our thoughts and reflections on the role that open engineering research can play in defining the purpose and activities of the university. We have made some specific recommendations on how the public university can recommit to and push the boundaries of its role as the creator and promoter of public knowledge. In doing so, the university will further demonstrate its vital role in the continued economic, social, and technological development of society. We have also included some thoughts on how this applies specifically to the field of engineering and how a culture of openness and sharing within the engineering community can help drive societal development.

Keywords:  engineering; open access; open engineering; open science; research dissemination

Year:  2018        PMID: 31543946      PMCID: PMC6733375          DOI: 10.12688/f1000research.14593.2

Source DB:  PubMed          Journal:  F1000Res        ISSN: 2046-1402


Introduction

Working openly should be the default mode of science—after all, how can we advance knowledge “by standing on the shoulders of giants” [a] if we cannot access or see those shoulders? As there is no clear consensus on how to define open science [1], this paper operates on the following definition, first laid out by Niemeyer [2] and synthesized from other available definitions [b]: Open science, or more broadly open research, describes the activity of performing scientific research in a manner that makes the products and findings accessible to anyone. This includes sharing data openly (open data), publicly releasing the source code for research software under a permissive license (free and open-source software), publicly releasing the designs of research hardware under a permissive license (free and open-source hardware), and making the written products of research openly accessible (open access).

The field of engineering provides an interesting case study for examining the impacts of open practices, since engineering touches every aspect of human life. Engineering research is inherent to the development of goods and products such as medical devices and pharmaceuticals, so issues around the protection of intellectual property and innovation draw stark contrasts, for some, with the tenets of open science. On the other hand, work being done in the free and open-source software (FOSS) and free and open-source hardware (FOSH) movements can enable us to engineer the tools of modern scientific discovery, greatly reducing the costs of scientific research [3]. These two movements derive from complementary principles: that software source code and hardware designs, respectively, should be openly released and licensed for reuse and modification [4, 5].
In a time of constrained university budgets, which are not expected to improve as long as most public universities rely heavily on public funding from state and federal sources, many universities are being forced to evaluate their institutional priorities [6]. For some, particularly state universities subject to the whims of state legislation, this could mean abandoning the pursuit of fundamental or basic knowledge generation in favor of marketable vocational training models that cater more directly to industry needs. While this model is in line with the Morrill Act of 1862, which created land-grant colleges and underpins the missions of many US institutions, the university has evolved since that time to encompass a much greater proportion of the economic development of the country [6]. Despite the challenges faced by institutions today, our opinion is that it is critical for the university to continue to position itself as a center of societal development—economically, technologically, and socially. Additionally, under our interpretation of the mission of the land-grant university program, the university should push this model further towards positioning itself as the main driver of social and technological innovation in its geographic region. To achieve this, it is necessary to organize and market the business of the university, as clearly as possible, as a service provider to many relevant stakeholders. This can be best accomplished by disseminating and distributing the products of university activities as widely as possible through open-access publishing, open research, and open innovation, and further demonstrating the impact that these products have on local, state, national, and international populations. As stated by Ashley Farley of the Gates Foundation, “Open research should be the norm. Knowledge should be a public good” [7]. 
This article seeks to motivate the importance of open science, particularly for engineering research, and synthesizes our earlier white papers [2, 8] separately discussing this topic. We discuss the importance of open science, describe how practicing open science increases the societal impact of research, provide recommendations for researchers to practice open science, summarize challenges to working openly, and conclude with some recommendations for university leaders to promote open (engineering) research. While many of the issues discussed herein apply more broadly beyond the field of engineering, we have elected to focus our discussion on engineering rather than speak for other disciplines.

Importance of open science

Transforming research communities from traditional, closed environments to open ones is important for a number of reasons, including (but not necessarily limited to) the following seven. McKiernan et al. [9] further discuss these and additional benefits for researchers working openly. Tennant et al. [10] review in detail the benefits of open-access publications to academics and society.

Value and credit: Conducting open research entails recognizing the value of all of the products of research. This includes the final, polished products of that work—papers, software, and data—as well as failures and null results [11, 12]. Disseminating these artifacts may on occasion be an act of humility, but it ultimately recognizes that each item is a piece of the research process, and that even failures have value in the lessons learned, which can be passed on to others. Researchers often contribute significant time and effort to developing products other than journal articles; if this work is not properly credited, career progress can be stifled as a result. One example is the development of scientific software, which can support the work of other researchers around the world [13–16]. If this activity is not supported in terms of career progression, the entire research community suffers as a result.

Accessibility: Openness in research ensures that research products, particularly written output, remain accessible to all, including the research community, funders, policy makers, and the general public. Accessibility of research products is particularly important for publicly funded research: since the public paid for the research, the public should have access to it and be able to benefit from it. This does not prevent innovators or other parties from developing commercial intellectual property based on the findings, but ensures that the original discovery, when funded by the public, remains accessible to all.
While nearly 50% of all published work is available freely through open-access sources, institutional archives, or online social networks, this percentage is notably lower in engineering: approximately 35% [17]. However, Piwowar et al. [18] found that this percentage drops below 20% when excluding articles self-archived on author websites, which lack assurance of long-term availability.

Reproducibility: Releasing products of research, including software and data, helps enable reproducibility. This is particularly true for computational science, where a written description of methods can never describe an approach as completely as the source code [19]. In general, access to the research software used to perform a computational study, or the data from an experimental study, should enable others to reproduce the findings of the original researchers. However, open science is a necessary but not sufficient condition for reproducibility, as it can be challenging to reproduce or replicate results even with available research software and data [20, 21].

Recognition: As a selfish motivation, performing research openly helps increase the recognition a work receives. Studies have shown that open-access papers are cited more in most research fields; in engineering, open-access papers are cited around 1.5 times more often than non-open-access papers [17, 22]. Similarly, papers with associated open data were cited 9–50% more than those without [9, 23]. Vandewalle [24] showed that papers in the image-processing field receive up to three times the number of citations when source code is made available. We must note, however, that recognition should not be regarded solely through the measure of citations; the true societal impact of the work is likely more important, but also more difficult to quantify [25]. This point is discussed further in the following section.
Establish priority: Some researchers hesitate to embrace open science out of a fear of being “scooped”: that competitors will use findings, software tools, or data made available and then publish first. Contrary to this belief, practicing open science can actually prevent being scooped: releasing preprints can establish priority of discoveries or techniques prior to the publication of a traditional peer-reviewed journal article [26, 27]. The peer-review and editorial process can take many months or years, but journal articles are still necessary for research findings to be considered valid (and for researchers to receive credit). Publishing a preprint of an article publicly time-stamps the work, even as it undergoes peer review and possible revision.

Encourages trust: Embracing openness in scientific research can help encourage other researchers to trust published results, by giving them the ability to inspect data or software. Soergel [28] estimated that 5–100% of computational results given by software may be incorrect or inaccurate. While simply releasing source code openly will not solve this problem, it is a necessary step towards verification and reproducibility.

It’s nice: In addition to the above benefits, sharing products of research openly is kind to colleagues and the greater research community, as it prevents people from wasting time by unnecessarily repeating work. For example, many graduate students begin their dissertation research by attempting to reimplement another group’s methods and reproduce some of their published results. Lacking access to software source code or datasets can hinder this work; significant time can be wasted guessing about minor implementation details or inputs not discussed in the corresponding published papers. This can be avoided by sharing the source code and data, allowing these junior researchers to move on to new work more quickly.
Graduate students and other researchers constantly face similar challenges that could be avoided by greater openness in research.

Openness increases societal impact of research

Many published journal articles go unread, even in their topical domains. One study of citation rates found that 27% of papers published in the natural sciences and engineering go uncited [29] [c]. Those who do read most papers likely come from research institutions similar to those of the authors, even if the findings could be impactful beyond these confines, for example by leading to policy changes or technological solutions for humanitarian purposes. In part, this is due to the challenging technical content, jargon, and niche topics—but it is also due to a lack of access to the journals where most research findings reside. Making the content of these papers actually understandable or digestible by most potential readers is another challenge. Considering the high and ever-increasing cost of scholarly journal subscriptions [30], research results should not be limited to those with the means to purchase access. By self-archiving (i.e., green open access) or publishing articles in open-access journals, researchers can ensure access for all members of society, including policymakers, funders, members of the media, entrepreneurs, and the general public—as well as scientists and engineers in the Global South. Furthermore, being more open with all outputs of research (e.g., papers, software, data) could help improve the general public’s perception of and trust in scientific research. Simply making research products available will not solve all of these problems—for one, it will not sway those who strongly believe ideas contrary to fact. However, ensuring everyone has access to the data researchers generate and analyze, and the software tools on which we rely, could eliminate one major barrier to trust in our findings [31]. Looking specifically at the field of engineering, we can also find examples of the positive effects of open knowledge dissemination.
According to Chris Ategeka, founder of Health Access Corps, “Patenting a social-impact product hinders scale, ultimately obstructing the maximum impact that particular product would have in the world if it was open source” [32]. Thus, the clear benefit of using open research and development practices is achieving greater impact with your research products. Indeed, from an ethical standpoint, engineers working on technologies targeted at the world’s most vulnerable communities should prioritize the open release of these technologies. The counterargument is that, through patenting, the entrepreneur can more easily market and sell their product in developed markets, which could then increase their ability to effect change by subsidizing their efforts in developing nations. This situation may hold true for products with broad appeal, and therefore it is necessary for the inventor to assess which path will produce the greatest impact. (This assumes, also, that we encourage and reward impact.) It could be argued that in the majority of scenarios, open dissemination will yield greater impact through simplified adoption and adaptation by others, especially if the front-end development activities are incentivized in other ways. Indeed, we can begin to measure this impact through metrics such as download counts for open-source software and hardware projects, which demonstrate market penetration. From another perspective, there is some evidence that the pursuit of patents for university research slows innovation [33], while the return on investment for publicly and privately funded research is high [34, 35]. Similarly, the use of FOSS and FOSH may actually increase the return on investment when used to their full potential [36].

Open science and the scholar’s research agenda

For the new researcher looking to build their profile and develop their research agenda, we present a vision and plan for performing research openly, synthesized from practices and advice in the literature. The ideas presented here are heavily inspired by examples from others active in this area, such as Lorena Barba’s Reproducibility PI Manifesto [37] and the Peer Reviewers’ Openness Initiative [38]. While these exemplars provide useful case studies, each individual must define a workflow that works for them; sometimes it is enough to simply be more open than the current norms in their field. Many fields within engineering lack reputable open-access journals; indeed, only 17% of published manuscripts in engineering can be legally accessed by the public for free, and some sub-disciplines fare worse, such as chemical engineering at 9% [18]. Thus, the engineering researcher looking to publish in open-access venues can quickly become discouraged. Early career researchers looking to make their work available while operating within this research environment can take simple steps such as submitting preprints of any publications to engrXiv [d], and depositing (otherwise non-accessible) conference papers or slide decks on Figshare [e] or Zenodo [f]. For the researcher looking to develop their open workflow further, we recommend the following steps:

1. Make all written research products openly accessible, either through green or gold open-access avenues. For fields that lack recognized, fully open-access journals, this objective can be met by submitting preprints to services such as arXiv, engrXiv, PeerJ Preprints, Figshare, or Zenodo, depending on the topic. Conference papers, when not submitted to an open venue, can also be made openly available. Where possible, release all preprints under the Creative Commons Attribution (CC BY) license [g]. If funds are available through research grants or library sources designated to support open-access publishing, a researcher may choose to follow the hybrid gold open-access model by paying a non-fully open journal to make a paper accessible. Note, however, that the fees associated with hybrid open-access journals are detrimental to researchers at smaller institutions [39], and thus this model should not be viewed as a solution in keeping with the goals of the open-access movement.

2. Develop any new research software openly (e.g., on GitHub), release it publicly under a permissive license, such as the BSD 3-clause license, and cite it appropriately in any publications that rely on it [16]. The Git version-control system (or equivalent) should be used to track the history of software projects, and software releases associated with publications or data should be archived (with DOIs) using Zenodo. In addition, implementation details should be described as thoroughly as necessary to reproduce the work.

3. Similarly, publicly release the design of any research hardware that is developed under a permissive license.

4. When making use of existing research software and/or hardware, use FOSS/FOSH whenever possible to permit the greatest reproducibility.

5. Publicly archive all data generated through research, when it serves as the basis for a publication, and cite it appropriately in manuscripts or other documents [h]. This data may also include figures and the plotting scripts that produce them, which can be shared under a CC BY license and cited where appropriate.

As a means of supporting these efforts, researchers should take care to implement these policy statements by incorporating them into funding proposals, for example in Data Management Plans. Note that policies where data and code are made “available upon request” are generally not sufficient for reproducibility [40].
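The software-release practice described above (version control with Git, a permissive license, citation metadata, and a tagged release that an archive such as Zenodo can assign a DOI to) can be sketched as a minimal shell session. The repository name, file contents, and version number below are illustrative placeholders, and the DOI minting itself happens on the archive’s side once a tagged snapshot is deposited:

```shell
# Sketch of preparing a research-software release for open archiving.
# Assumes git is installed; all names and versions are placeholders.
mkdir -p my-research-code && cd my-research-code
git init -q
git config user.name "Example Author"        # placeholder identity
git config user.email "author@example.edu"   # placeholder identity

# A permissive license (e.g., BSD 3-clause) makes reuse terms explicit.
printf 'BSD 3-Clause License\n(full license text goes here)\n' > LICENSE

# A CITATION.cff file tells readers how to cite the software.
printf 'cff-version: 1.2.0\ntitle: my-research-code\nversion: 1.0.0\n' > CITATION.cff

git add LICENSE CITATION.cff
git commit -q -m "Add license and citation metadata"

# Tag the exact version used in the paper; archiving this tagged release
# on a service such as Zenodo yields a citable DOI for that snapshot.
git tag v1.0.0
git tag --list
```

Linking the repository to an archiving service (or uploading the tagged snapshot manually) then produces a DOI that can be cited in the manuscript alongside the software version.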
Several community efforts have developed in recent years with the goal of defining and supporting open science practices. Some examples include the Workshop on Sustainable Software for Science: Practice and Experiences (WSSSPE) series [41] and the FORCE11 Software Citation Working Group, which developed the Software Citation Principles [16] with the goal of standardizing software citation to help ensure authors/developers receive academic credit for their work in releasing open research software. On the publishing side, community-driven research journals have been built to promote open publishing practices; examples include the Journal of Open Source Software [42] [i] and the Journal of Open Engineering [43] [j]. Similarly, engrXiv, an open archive for engineering publications, has been developed to serve the engineering community, inspired by the success of arXiv.

Challenges to performing open science

The primary challenges facing individuals interested in conducting open research generally involve incentives (or the lack thereof) and restrictive policies maintained by traditional publishers, in addition to the lack of a culture of sharing within the researcher’s disciplinary field. First, researchers are often pressured to carefully consider the venue in which they publish their work and to select only those that are “well established” and “high impact.” If these venues are not amenable to open research activities, such as the posting of preprints, those activities are disincentivized. To remedy this, the research community must continue to pressure publishers to modify their copyright transfer policies. Some progress has already been made in this effort through policies from funding sources such as the National Institutes of Health [k], the National Science Foundation [l], the Bill & Melinda Gates Foundation [m], and the Wellcome Trust [n], as well as from research institutions that require deposition in a repository. More information on these policies can be found in the Registry of Open Access Repository Mandates and Policies [o]. Additionally, authors themselves can in some cases work with publishers to modify the standard copyright transfer agreements, allowing the author to retain more rights [p].

Furthermore, promotion and tenure requirements typically focus exclusively on the final published manuscript and associated metrics, neglecting other research outputs such as code, data, and solid models, along with their associated impacts. Some institutions actively discourage making these alternative research products available due to idealistic dreams of future income generation from licensing revenues. In reality, however, the majority of universities lose money through their technology transfer offices, since translation of university intellectual property to commercial success is generally poorly realized [44, 45].
Instead, institutions may pursue alternatives that promote universal knowledge dissemination as a mechanism to create impact from university research outcomes, as opposed to purely monetary aims. Ultimately, societal pressure is likely necessary to push more institutions to participate in such initiatives. For that to happen, the public first needs to be aware of the possible benefits of broad knowledge dissemination and needs to experience those benefits firsthand. Researchers may even see benefits in terms of their scholarly productivity, as Frankenhuis and Nettle [46] argue that open-science practices may actually increase creativity and researcher output.

The challenges impeding greater adoption of open-science practices are mainly institutional and cultural, rather than technical. General venues for sharing and developing the products of research openly abound these days, with the availability of services like arXiv, engrXiv, and PeerJ Preprints for ensuring open access to publications; repositories like GitHub for developing (and version-controlling) research software openly; and data and software archives like Zenodo and Figshare, which have practically no file-size limitations [q]. Of course, some technical problems remain: How do we make the results of computational science, particularly when it involves demanding high-performance computing resources, truly reproducible? How can we cite software and data consistently, when the version might change regularly? How can open practices be integrated into a researcher’s workflow without further straining the researcher’s already overburdened time? Still, as cultural inertia and lack of institutional recognition and rewards pose significant challenges to increased openness in science, the biggest barrier to greater openness in research may be apathy in many research communities. Many academic researchers either disagree on or are unaware of the importance (and benefits) of working openly.
Since they were not trained in doing this, e.g., during graduate school or postdoctoral training, they may also simply be unaware of how to do research openly, or of the resources available to do so. Furthermore, since most of their colleagues, collaborators, and competitors do not practice open science, no pressure comes from the research community to change. In addition, some communities do not support, or actively oppose, activities such as submitting preprints. These challenges seem to be particularly prevalent in engineering, especially when compared with some sub-disciplines of physics where accessibility of scholarship is markedly higher [18]. It could be that the industry connections and strongly applied nature of engineering have hindered adoption of open practices. This lack of pressure is related to the other major issue: lack of institutional recognition and reward for open practices. Many academic researchers will focus on what gets them credit for promotion and tenure; anything beyond that requires strong intrinsic motivation or external motivators from the research community. At most institutions, promotion and tenure review includes some judgment (whether explicit or implicit) of where faculty publish their work, but many “high-impact” traditional publication venues—particularly domain journals—may not support, e.g., the posting of preprints. Along with pushing publishers to support the posting of preprints, progress may be made by reminding researchers about the citation advantages of open-access publishing and open dissemination of data, software, and hardware [9, 17, 22–24].

Recommendations for university leaders

As already discussed, there are real career advantages to open-access publishing and open dissemination of data, code, and other research products, and therefore, for some, the incentives to conduct open research may already be in place. However, for many, citation metrics alone are not enough to ensure success in promotion and tenure, and they must instead play to the norms of their field, department, and institution. Therefore, the institution (and the department) should look to institute policy that redefines how we measure success in academic engineering research. Some suggestions include focusing less on journal-level metrics and lending greater credibility to article-level metrics. For article-level metrics, go beyond the citation count and look for other evidence of research impact, such as alternative metrics (tweets, blog posts, media coverage) and replication by others. Lastly, look for evidence of broader implications such as economic development, student development, or even lives saved. Encourage your researchers to aim for those broader impacts and value them more highly than the publishing of one more paper. These broader impacts have real benefits in terms of institutional reputation, particularly among the general public, where perception of the institution’s value can be tied to this increased societal impact.

Thinking about what institutions can do to promote open engineering research and create support structures around open dissemination, we provide the following recommendations:

1. Require research products to be made openly available, and support this requirement by maintaining a high-quality institutional repository, supporting other open repositories, and lobbying publishers to modify their copyright policies to permit the publishing of preprints and other products prior to journal submission, as well as the archiving of final-version manuscripts.

2. Convert technology commercialization offices into research impact offices. Use these offices as a mechanism for helping researchers broaden their impact through open research best practices, for funding social entrepreneurship, and for advocating these institutional activities at the state, national, and international levels.

3. Empower and fund university libraries to help with open knowledge dissemination. Others have described ways in which research outputs can be made public in real time with the support of the library [47]; institutions should promote and support these efforts.

4. Educate undergraduate and graduate students on the importance of open knowledge dissemination and the practices that support it. Create and sponsor workshops that train participants in open-source software development, open research dissemination, and global development. Many institutions embrace service learning as a mechanism for greater civic engagement [48]; broaden this approach in a thoughtful and impactful manner, taking care to ensure that students are learning the right lessons and that partnering communities are not unduly burdened [49]. These approaches can help ensure that young engineers remain passionate about the field and hold onto the core societal mission of engineering [50].

Thinking specifically about the perspective of the researcher within an institution, the following recommendations for departments are mostly targeted at changing criteria for promotion and tenure, and performance reviews, to encourage faculty to practice more open science:

1. Consider accessibility/openness of research products along with quantity and “quality” in promotion and tenure review.

2. Mandate self-archiving of publications (i.e., green open access).

3. Recognize research products such as software and data, and their associated impacts (e.g., citations), as equal to traditional publications in scholarly impact.

4. Reduce the importance of publishing in traditional venues for promotion and tenure, recognizing that these may be barriers to openness.

5. Provide educational opportunities that train faculty and other researchers in open science skills, and in those necessary to work with software and data.

Research communities that impede openness cannot be forced to change from the outside. Instead, changes to institutional reward systems will encourage researchers to improve their open practices, and thus evolve communities from the inside.

Conclusions

In this paper we have reviewed the existing state of knowledge on the benefits and challenges of practicing openness in engineering research. We have further briefly outlined our thoughts on how open research practices in the sciences, engineering, and other fields can and should be employed by public universities to position themselves as centers for the creation and broad dissemination of knowledge as a public resource. Resistance to this proposal is prevalent through reluctance to change and, in some cases, apathy on the part of researchers. Additionally, many researchers operate in an environment that devalues an educated populace and with systemic practices and policies that exclusively reward the monetization of any form of intellectual property. Change likely needs to be driven with grassroots initiatives that demonstrate the possible benefits and make it clear that tax dollars could fund these efforts if distributed properly and with accountability. However, many of the recommendations provided here would require little or no additional funding as the mechanisms that would enable them, such as recognition of diverse research products in hiring and tenure and promotion criteria, already exist as part of normal academic routines. I read "The case for openness in engineering research" and found it to provide a good overview of the state of openness in general. My main remark for the authors is that I felt like I was familiar with the contents of the opinion piece, despite having zero knowledge about engineering research. Maybe the case for openness in engineering research is simply the case for openness, give a few field-specific details? I'd expected more differentiation, given the statement that it would be focused on engineering. All in all, I found little to nothing to actively disagree with in respect to factual claims. One particular aspect that might be specific to engineering research are patents. 
Public disclosure can preclude an invention from being patented if the patent is not filed within 12 months of the initial public disclosure of the invention (a grace period specific to some jurisdictions, such as the US; many others offer none). However, patents are relatively rarely discussed in the open space compared to licensing, so I don't know whether this would fit in or not. It might also be relevant to refer the authors to the "patent trolling" scenario: https://www.eff.org/reclaim-invention/pledge

Some minor aspects: In the "Accessibility:" paragraph, licensing isn't mentioned. For something to remain accessible, permissive licensing is necessary (otherwise it may disappear at any point). "Reproducibility:" helps one understand, not enable, reproducibility in my perspective, but I'm nit-picking (this is a good sign, because there's hardly anything else to comment on). "Establish priority:" is a good section. I usually refer to this as the Scooping Paradox, which helps people remember. I never wrote it down, so if you like it, it's yours. Regretfully, PeerJ Preprints will stop accepting new preprints September 30th, which may (or may not) affect the authors' willingness to include it as a recommendation: https://twitter.com/jasonHoyt/status/1168809204062269442

I confirm that I have read this submission and believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard.

Reviewer Report

The authors make a clear case for more openness in engineering research. They give a well-balanced set of arguments for researchers to perform open research, and argue that it increases the societal impact of the research. They also discuss potential obstacles that researchers may encounter when making their work more open. Finally, they give a set of recommendations for university leaders to improve and support openness. I have some minor questions/comments, which I hope can help in further strengthening the points made in the article.
As already mentioned above, these comments are mostly my views about some aspects of the present opinion article. In general, I think it is already a well-written article, and I would like to thank the authors for their efforts in putting this opinion article together to further promote open science.

The funding paragraph in the introduction is currently very much focused on US public universities, which of course does not cover the entire readership. The arguments provided in this article mostly hold for publicly funded academic research in engineering. A lot of engineering research is also performed in industry, or at universities (partially) funded by industry. In such projects, contracts often do not allow full openness of publication, code, and data. It could be a useful addition to mention this (more) clearly in the paper.

Regarding my work on citation analysis of image processing papers with and without code, I recently presented an update of this analysis at the WIC Symposium (see Vandewalle, 2019 [1]). I am just providing this information for your reference.

I do not fully agree with the argument of establishing priority by releasing preprints to avoid being "scooped". This is true, but when researchers fear being "scooped", I believe they generally fear that others will be faster in their further exploration of a data set or a piece of code and hence will not allow the original authors to take full advantage of their initial efforts. Meanwhile, I believe the risk of being "scooped" is generally overestimated and does (in most cases) not pose a real threat.

In the section "It's nice", the authors argue that it prevents unnecessary repetition of work. I fully agree with this argument, and see this improved efficiency as one of the strong benefits of more openness. Meanwhile, there are also some advantages to such an independent repetition of an experiment, as it reduces the risk of repeating or overlooking the same bugs.
It could be useful to add a comment in that sense to the article to keep the arguments well-balanced. Finally, in the introduction to "Open science and the scholar's research agenda", the authors mention that "sometimes it is enough to simply be more open than the current norms in their field." While this makes sense for researchers wanting to get some benefits of being open by doing somewhat better than their peers, I believe we should set sufficiently high standards for open research, and at least aim for full openness where possible.

I confirm that I have read this submission and believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard.

Reviewer Report

The authors have revised the article to address most of my prior comments/concerns and those of Dr. Pearce, the other official reviewer. In particular, the addition of references, the addition of text (e.g., a new "value and credit" subsection), and wording revisions have strengthened the work. The article is an important contribution to the literature on open science and should promote additional dialogue on this important topic.

I confirm that I have read this submission and believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard.

Reviewer Report

This is an important paper, and sadly it still needs to be written in 2018, as open science should clearly be the default for the maximum rate of progress in any field. It is clear that we are headed this way, but the rate of change could certainly be faster. In addition to the comments from the other reviewers, I offer the following constructive points: Although this is an opinion piece, I would encourage the authors to avoid all unsubstantiated claims. Ideally, every fact not derived from the authors' own work should have a citation.
Although I agree with the first line, the article would be stronger if the footnote offered a substantial list of referenced arguments for it, or waited until the end to make the claim. Instead of simply saying open source software, it would be better to use FOSS (and explain the difference). To be correct in the definition of open science, the use of free and open source hardware should be included in addition to FOSS.

I am not sure that open science is benefited by "the half-baked ideas, the napkin sketches, the first drafts, and the failures." For example, engineers working on water purification technology should not publish their tech until they are sure it works, unless it has warning notices all over it.

It would be interesting to speculate on why engineers lag so far behind, say, physicists in making their work open access. Is it because the various engineering societies have more restrictive publishing agreements than the major publishers?

There are some examples of methods to quantify the impact of research on society. There is a rich literature showing high ROIs for industry-funded research in the business world. In addition, using the concept of downloaded substitution value, one can calculate the value to society of open source scientific hardware designs as well as software.

Increasing societal impact of research: you make a good point about the scientists in the Global South, but you should consider going one step further and encouraging engineers working on technologies that can solve the problems of the world's poorest people to make sure they are released as open source appropriate technologies.

Impact: you can make a stronger case. Patenting slows innovation (and there are a ton of studies showing this) and increases costs for consumers. One of the clearest recent examples is the staggering decrease in costs and increase in performance of 3D printers following the open source release of the RepRap project.
Engineering journals: most of them do offer some sort of open access policy, either to post preprints or to pay for open access. An open workflow should encourage the use of FOSS and FOSH whenever possible. Page 5 seems overly pessimistic and also does not cite proof; I have found that most researchers are sympathetic to open science and that many forward-thinking institutions are pushing open access pretty hard. Even just going to article-level metrics, where OA has an advantage in citations, should be useful and already has a built-in incentive. Something should be said about open source business models; patents are not the only way to go, i.e. RedHat is a multi-billion-per-year open source company. Finally, many of your suggestions are good but could be strengthened if you home in on university leaders' self-interest to encourage them to actually implement them.

I confirm that I have read this submission and believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard; however, I have significant reservations, as outlined above.

Author Response

Dr. Pearce,

Thank you for your thorough review. We agree that open science practices should indeed be the default for engineering research and progress. We would like to address your individual points below.

1) Although this is an opinion piece I would encourage the authors to avoid all unsubstantiated claims. Ideally every fact not derived from the authors' own work should have a citation.

We have gone through the paper again and attempted to address any issues with uncited statements. Several new citations and references have been added.

2) Although I agree with the first line, the article would be stronger if the footnote offered either a substantial list of referenced arguments for it, or waited until the end to make the claim.

The first line is stated as an opinion, which we intend to use as the basis for the rest of the paper.
This statement is fleshed out more fully, with references to other related works, in the following paragraphs.

3) Instead of simply saying open source software, it would be better to use FOSS (and explain the difference).

We have added language to make this distinction.

4) To be correct in the definition of open science, the use of free and open source hardware should be included in addition to FOSS.

Agreed. We have added hardware to this definition.

5) I am not sure that open science is benefited by "the half-baked ideas, the napkin sketches, the first drafts, and the failures." For example, engineers working on water purification technology should not publish their tech until they are sure it works unless it has warning notices all over it.

We agree with the first sentence here and have modified the language to word this differently. For the second sentence, we agree that there needs to be more education on what early work is and its use cases. However, we don't feel that this is limited to any one sub-discipline such as "water purification technology"; it is actually an issue that applies to virtually all of engineering.

6) It would be interesting to speculate on why engineers lag so far behind, say, physicists in making their work open access. Is it because the various engineering societies have more restrictive publishing agreements than the major publishers?

This is a difficult question to answer. Perhaps there are cultural limitations due to the applied or industry-connected nature of engineering. We've added some discussion to the "Challenges" section to try to address this better.

7) There are some examples of methods to quantify the impact of research on society. There is a rich literature showing high ROIs for industry-funded research in the business world. In addition, using the concept of downloaded substitution value, one can calculate the value to society of open source scientific hardware designs as well as software.
We have added some discussion and relevant citations to the societal impact section of the paper.

8) Increasing societal impact of research: you make a good point about the scientists in the Global South, but you should consider going one step further and encouraging engineers working on technologies that can solve the problems of the world's poorest people to make sure they are released as open source appropriate technologies.

We agree, and we have added some additional discussion to this section of the paper.

9) Impact: you can make a stronger case. Patenting slows innovation (and there are a ton of studies showing this) and increases costs for consumers. One of the clearest recent examples is the staggering decrease in costs and increase in performance of 3D printers following the open source release of the RepRap project.

We have added some discussion and references to support this.

10) Engineering journals: most of them do offer some sort of open access policy, either to post preprints or to pay for open access.

We have to generally disagree with this point. Several of the larger engineering societies are lagging in explicit support of preprinting. Additionally, we don't view hybrid OA journals as a solution to this problem, as they largely shift the fees and associated publisher profits from one hand to the other. We view hybrid OA journals as a temporary solution on the path towards full OA journals with reasonable APCs operated by societies or other non-profit academic institutions.

11) An open workflow should encourage the use of FOSS and FOSH whenever possible.

We have edited the language to match.

12) Page 5 seems overly pessimistic and also does not cite proof. I have found that most researchers are sympathetic to open science and that many forward-thinking institutions are pushing open access pretty hard.

We have edited the language here. Also, it seems that these forward-thinking institutions are pushing OA publishing but not open science.
13) Even just going to article-level metrics, where OA has an advantage in citations, should be useful and already has a built-in incentive.

True; we have reiterated this point.

14) Something should be said about open source business models. Patents are not the only way to go, i.e. RedHat is a multi-billion-per-year open source company.

This is true, though we've tried to focus this paper more on research than industry/innovation.

15) Finally, many of your suggestions are good but could be strengthened if you home in on university leaders' self-interest to encourage them to actually implement them.

We have edited the language to reiterate the benefits to their self-interest.

Reviewer Report

This is a well-written opinion article that presents a well-articulated argument for the science community to increase the practice of "openness" in research. After clearly defining "open" research (caveat below), key components of the article include a discussion of the importance of open research to science, some recommendations on how researchers can conduct their research "openly" and how universities can support such work, and a discussion of challenges to open science. The article is a valuable contribution to this growing movement. Although the article is well written and the argument is clearly articulated with appropriate references to the literature, I believe that there are a few points that the authors should consider and respond to prior to this work being approved for indexing.

The authors reference the work as a "review", but in the parlance of F1000Research it is an opinion article, and it is written as such since it is an argument for conducting more open research. Thus, this reviewer suggests not referring to the work as a "review". The article title and a few areas within the text are written specifically toward engineering research, but this topic transcends disciplines and much of the content can be applied broadly.
As such, the authors should consider whether the title and respective text should be less restricted to engineering. Perhaps the authors could present the topic as one that is more broadly applicable, but then more clearly state that they are using engineering as a case study/example to illustrate their points.

The authors begin the main text with a clear definition of open science, but it is unclear whether this is the authors' definition or whether it is taken from the literature. This should be clarified.

The authors should consider adding a bit more discussion of the challenges associated with open research. For example, more could be said about how the culture of academia doesn't particularly value open research currently. Today's academic culture is still too focused on where a journal is ranked based on impact factor, and this plays into aspects of faculty life such as faculty hiring, promotion/tenure, faculty performance evaluations, post-tenure review, etc. While the authors touch on some of these points, a greater discussion of and recommendation for a culture change could be a valuable, thought-provoking addition to the article. More could also be said regarding some of the minutiae of how open research needs to be implemented, especially related to the infrastructure needed to support it and the financial costs associated with that infrastructure. Who should pay for hosting, pay publication costs, etc.?

In summary, this is a well-articulated article on an important, timely topic. The few points above slightly dampen this reviewer's enthusiasm at this time related to giving full approval. As such, I look forward to reviewing a revised version of the article.

I confirm that I have read this submission and believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard; however, I have significant reservations, as outlined above.

Author Response

Dr. Vanderford,

Thank you for your thoughtful review of our paper and for your specific suggestions for improvement. We've attempted to address your comments in this revision as outlined below.

1) The authors reference the work as a "review" but in the parlance of F1000Research, it is an opinion article and it is written as such since it is an argument for conducting more open research. Thus, this reviewer suggests not referring to the work as a "review."

We've revised the language to make it clearer that this is an opinion article rather than a review article.

2) The article title and a few areas within the text are written specifically toward engineering research, but this topic transcends disciplines and much of the content can be applied broadly. As such, the authors should consider whether the title and respective text should be less restricted to engineering. Perhaps the authors could present the topic as one that is more broadly applicable but then more clearly state that they are using engineering as a case study/example to illustrate their points.

We agree that many of the issues we've addressed are not unique to engineering and do indeed apply more broadly. However, our goal with this work was to write a targeted paper for the engineering community that specifically addresses the issues from that frame. There are other works in the literature that take a more general approach, and we've tried to reference them appropriately to bring them to the reader's attention.

3) The authors begin the main text with a clear definition of open science, but it is unclear if this is the authors' definition or if it is taken from the literature. This should be clarified.

The definition of open science included in our paper is a synthesis of other available definitions found elsewhere. We've edited to make this clearer and inserted some references to support this.
4) The authors should consider adding a bit more discussion on the challenges associated with open research. For example, more could be said about how the culture of academia doesn't particularly value open research currently. Today's academic culture is still too focused on where a journal is ranked based on impact factor and this plays into aspects of faculty life such as faculty hiring, promotion/tenure, faculty performance evaluations, post-tenure review, etc. While the authors touch on some of these points, a greater discussion and recommendation for a culture change could be a valuable, thought-provoking addition to the article. More could also be said regarding some of the minutia of how open research needs to be implemented especially related to the infrastructure needed to support it and the financial costs associated with the infrastructure. Who should pay for hosting, pay publications costs, etc.?

We have addressed issues around a lack of valuation for open science practices and the need for change in promotion/tenure evaluation criteria, and we've tried to emphasize this more in the paper. We would also point out that many of our recommendations do not require financial support to be feasible and indeed would be possible under existing funding models.
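The "downloaded substitution value" raised in point 7 of the review exchange above can be illustrated with a simple calculation: each download of an open hardware design that substitutes for a commercial purchase saves roughly the difference between the commercial price and the cost of self-replication. The sketch below is our illustrative interpretation of that valuation idea; the function name and all numbers are hypothetical placeholders, not figures from the article or its reviews.

```python
# Hedged sketch of a "downloaded substitution value" estimate for an
# open source hardware design. All inputs are hypothetical examples.

def substitution_value(downloads, commercial_cost, replication_cost,
                       substitution_rate=1.0):
    """Estimate savings to society from an openly released design.

    Each download that actually substitutes for a commercial purchase
    saves (commercial_cost - replication_cost); substitution_rate is
    the assumed fraction of downloads that do substitute a purchase.
    """
    savings_per_unit = commercial_cost - replication_cost
    return downloads * substitution_rate * savings_per_unit

# Hypothetical example: an open lab-instrument design downloaded 1000
# times, replacing a $500 commercial unit that costs $50 to self-build,
# assuming only 30% of downloads substitute an actual purchase.
value = substitution_value(1000, 500.0, 50.0, substitution_rate=0.3)
print(f"Estimated substitution value: ${value:,.0f}")  # → $135,000
```

The substitution rate is the key assumption: treating every download as a forgone purchase (the default of 1.0) gives an upper bound, so sensitivity analysis over that rate is advisable in any real estimate.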
References  (17 in total)

1.  Journals should publish all "null" results and should sparingly publish "positive" results.

Authors:  John P A Ioannidis
Journal:  Cancer Epidemiol Biomarkers Prev       Date:  2006-01       Impact factor: 4.254

2.  SCIENTIFIC COMMUNITY. Preprints for the life sciences.

Authors:  Jeremy M Berg; Needhi Bhalla; Philip E Bourne; Martin Chalfie; David G Drubin; James S Fraser; Carol W Greider; Michael Hendricks; Chonnettia Jones; Robert Kiley; Susan King; Marc W Kirschner; Harlan M Krumholz; Ruth Lehmann; Maria Leptin; Bernd Pulverer; Brooke Rosenzweig; John E Spiro; Michael Stebbins; Carly Strasser; Sowmya Swaminathan; Paul Turner; Ronald D Vale; K VijayRaghavan; Cynthia Wolberger
Journal:  Science       Date:  2016-05-20       Impact factor: 47.728

3.  An empirical analysis of journal policy effectiveness for computational reproducibility.

Authors:  Victoria Stodden; Jennifer Seiler; Zhaokun Ma
Journal:  Proc Natl Acad Sci U S A       Date:  2018-03-12       Impact factor: 11.205

4.  The Oligopoly of Academic Publishers in the Digital Era.

Authors:  Vincent Larivière; Stefanie Haustein; Philippe Mongeon
Journal:  PLoS One       Date:  2015-06-10       Impact factor: 3.240

5.  How open science helps researchers succeed.

Authors:  Erin C McKiernan; Philip E Bourne; C Titus Brown; Stuart Buck; Amye Kenall; Jennifer Lin; Damon McDougall; Brian A Nosek; Karthik Ram; Courtney K Soderberg; Jeffrey R Spies; Kaitlin Thaney; Andrew Updegrove; Kara H Woo; Tal Yarkoni
Journal:  Elife       Date:  2016-07-07       Impact factor: 8.140

6.  The state of OA: a large-scale analysis of the prevalence and impact of Open Access articles.

Authors:  Heather Piwowar; Jason Priem; Vincent Larivière; Juan Pablo Alperin; Lisa Matthias; Bree Norlander; Ashley Farley; Jevin West; Stefanie Haustein
Journal:  PeerJ       Date:  2018-02-13       Impact factor: 2.984

7.  Imagining tomorrow's university in an era of open science.

Authors:  Adina Howe; Michael Howe; Amy L Kaleita; D Raj Raman
Journal:  F1000Res       Date:  2017-03-31

8.  Imagining the "open" university: Sharing scholarship to improve research and education.

Authors:  Erin C McKiernan
Journal:  PLoS Biol       Date:  2017-10-24       Impact factor: 8.029

9.  Rampant software errors may undermine scientific results.

Authors:  David A W Soergel
Journal:  F1000Res       Date:  2014-12-11

10.  Open Science Is Liberating and Can Foster Creativity.

Authors:  Willem E Frankenhuis; Daniel Nettle
Journal:  Perspect Psychol Sci       Date:  2018-07
