Bin Yu, Karl Kumbier.
Abstract
Building and expanding on principles of statistics, machine learning, and scientific inquiry, we propose the predictability, computability, and stability (PCS) framework for veridical data science. Our framework, composed of both a workflow and documentation, aims to provide responsible, reliable, reproducible, and transparent results across the data science life cycle. The PCS workflow uses predictability as a reality check and considers the importance of computation in data collection/storage and algorithm design. It augments predictability and computability with an overarching stability principle. Stability expands on statistical uncertainty considerations to assess how human judgment calls impact data results through data and model/algorithm perturbations. As part of the PCS workflow, we develop PCS inference procedures, namely PCS perturbation intervals and PCS hypothesis testing, to investigate the stability of data results relative to problem formulation, data cleaning, modeling decisions, and interpretations. We illustrate PCS inference through neuroscience and genomics projects of our own and others. Moreover, we demonstrate its favorable performance over existing methods in terms of receiver operating characteristic (ROC) curves in high-dimensional, sparse linear model simulations, including a wide range of misspecified models. Finally, we propose PCS documentation based on R Markdown or Jupyter Notebook, with publicly available, reproducible codes and narratives to back up human choices made throughout an analysis. The PCS workflow and documentation are demonstrated in a genomics case study available on Zenodo.
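The perturbation-interval idea described above can be illustrated with a minimal sketch: refit a target quantity under both data perturbations (bootstrap resamples) and model/algorithm perturbations (here, OLS vs. ridge, chosen by us as a simple stand-in), then summarize the spread of results. This is an illustrative toy under our own assumptions, not the authors' implementation; the simulated data, the choice of perturbations, and the 10th/90th-percentile summary are ours.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy simulated data: y = X @ beta + noise (stand-in for a real dataset).
n, p = 200, 5
X = rng.normal(size=(n, p))
beta = np.array([2.0, 0.0, -1.0, 0.0, 0.5])
y = X @ beta + rng.normal(scale=0.5, size=n)

def fit_ols(Xb, yb):
    # Ordinary least squares via pseudo-inverse.
    return np.linalg.pinv(Xb) @ yb

def fit_ridge(Xb, yb, lam=1.0):
    # Ridge regression: one simple "model/algorithm perturbation".
    k = Xb.shape[1]
    return np.linalg.solve(Xb.T @ Xb + lam * np.eye(k), Xb.T @ yb)

estimates = []
for _ in range(200):                      # data perturbations: bootstrap
    idx = rng.integers(0, n, size=n)
    Xb, yb = X[idx], y[idx]
    for fit in (fit_ols, fit_ridge):      # model perturbations
        estimates.append(fit(Xb, yb))

estimates = np.array(estimates)
# Perturbation "interval" per coefficient: 10th-90th percentile of results.
lo, hi = np.percentile(estimates, [10, 90], axis=0)
for j in range(p):
    print(f"beta[{j}]: [{lo[j]:+.3f}, {hi[j]:+.3f}]")
```

A coefficient whose interval stays far from zero across both data and model perturbations is a stable data result in this toy sense; one whose interval straddles zero is not.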
Keywords: computation; data science; prediction; stability
Year: 2020 PMID: 32054788 PMCID: PMC7049126 DOI: 10.1073/pnas.1901326117
Source DB: PubMed Journal: Proc Natl Acad Sci U S A ISSN: 0027-8424 Impact factor: 11.205
Fig. 1. The data science life cycle. The PCS workflow considers predictability, computability, and stability at every step, with a strong emphasis on stability.
Fig. 2. ROC curves showing true positive rate (TPR) and false positive rate (FPR) for feature selection in a linear model setting with observations. Each plot corresponds to a different generative model. Color corresponds to method of inference: red, PCS; blue, selective inference; green, linear model asymptotic normality. Error bars give the 10th and 90th percentiles over replicates.
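For a feature-selection simulation like the one in Fig. 2, TPR and FPR are computed by comparing the selected feature set against the known true support. A minimal sketch follows; the function name and the example sets are ours, not from the paper.

```python
def feature_selection_rates(selected, true_support, p):
    """TPR/FPR of a selected feature set against the known true support,
    where p is the total number of candidate features."""
    selected, true_support = set(selected), set(true_support)
    tp = len(selected & true_support)       # correctly selected features
    fp = len(selected - true_support)       # null features selected
    tpr = tp / len(true_support)
    fpr = fp / (p - len(true_support))
    return tpr, fpr

# Hypothetical example: 10 features, true support {0, 2, 4}, method picks {0, 2, 3}.
tpr, fpr = feature_selection_rates([0, 2, 3], [0, 2, 4], p=10)
print(tpr, fpr)
```

Sweeping a selection threshold and plotting (FPR, TPR) pairs traces out one ROC curve per method.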
Fig. 3. Assumptions made throughout the DSLC allow researchers to use models such as decision trees, neural networks, or probability distributions as an approximation of reality, which may include physical, chemical, or biological laws. Narratives provided in PCS documentation can help justify assumptions to connect these two worlds.