
Verifiability in computer-aided research: the role of digital scientific notations at the human-computer interface.

Konrad Hinsen

Abstract

Most of today's scientific research relies on computers and software for processing scientific information. Examples of such computer-aided research are the analysis of experimental data or the simulation of phenomena based on theoretical models. With the rapid increase of computational power, scientific software has integrated more and more complex scientific knowledge in a black-box fashion. As a consequence, its users do not know, and do not even have a chance of finding out, which assumptions and approximations their computations are based on. This black-box nature of scientific software has made the verification of much computer-aided research close to impossible. The present work starts with an analysis of this situation from the point of view of human-computer interaction in scientific research. It identifies the key role of digital scientific notations at the human-computer interface, reviews the most popular ones in use today, and describes a proof-of-concept implementation of Leibniz, a language designed as a verifiable digital scientific notation for models formulated as mathematical equations. ©2018 Hinsen.

Keywords:  Computational documents; Computational science; Digital scientific notations; Human-computer interaction; Validation; Verification

Year:  2018        PMID: 33816811      PMCID: PMC7924627          DOI: 10.7717/peerj-cs.158

Source DB:  PubMed          Journal:  PeerJ Comput Sci        ISSN: 2376-5992


References: 12 in total

1.  Scientific publishing. A scientist's nightmare: software problem leads to five retractions.

Authors:  Greg Miller
Journal:  Science       Date:  2006-12-22       Impact factor: 47.728

2.  Ten simple rules for making research software more robust.

Authors:  Morgan Taschuk; Greg Wilson
Journal:  PLoS Comput Biol       Date:  2017-04-13       Impact factor: 4.475

3.  Five retracted structure reports: inverted or incorrect?

Authors:  Brian W Matthews
Journal:  Protein Sci       Date:  2007-05-01       Impact factor: 6.725

4.  Real Cost of Speed: The Effect of a Time-Saving Multiple-Time-Stepping Algorithm on the Accuracy of Molecular Dynamics Simulations.

Authors:  Sabine Reißer; David Poger; Martin Stroet; Alan E Mark
Journal:  J Chem Theory Comput       Date:  2017-05-26       Impact factor: 6.006

5.  Cluster failure: Why fMRI inferences for spatial extent have inflated false-positive rates.

Authors:  Anders Eklund; Thomas E Nichols; Hans Knutsson
Journal:  Proc Natl Acad Sci U S A       Date:  2016-06-28       Impact factor: 11.205

6.  Enhancing reproducibility for computational methods.

Authors:  Victoria Stodden; Marcia McNutt; David H Bailey; Ewa Deelman; Yolanda Gil; Brooks Hanson; Michael A Heroux; John P A Ioannidis; Michela Taufer
Journal:  Science       Date:  2016-12-09       Impact factor: 47.728

7.  Does your code stand up to scrutiny?

Authors: 
Journal:  Nature       Date:  2018-03-08       Impact factor: 49.962

8.  Lessons learned from comparing molecular dynamics engines on the SAMPL5 dataset.

Authors:  Michael R Shirts; Christoph Klein; Jason M Swails; Jian Yin; Michael K Gilson; David L Mobley; David A Case; Ellen D Zhong
Journal:  J Comput Aided Mol Des       Date:  2016-10-27       Impact factor: 3.686

9.  Computational science: shifting the focus from tools to models.

Authors:  Konrad Hinsen
Journal:  F1000Res       Date:  2014-05-07

10.  Rampant software errors may undermine scientific results.

Authors:  David A W Soergel
Journal:  F1000Res       Date:  2014-12-11
Reviews: 1 in total

Review 1.  Documenting research software in engineering science.

Authors:  Sibylle Hermann; Jörg Fehr
Journal:  Sci Rep       Date:  2022-04-21       Impact factor: 4.996
