
IPAD: Stable Interpretable Forecasting with Knockoffs Inference.

Yingying Fan, Jinchi Lv, Mahrad Sharifvaghefi, Yoshimasa Uematsu.

Abstract

Interpretability and stability are two important features desired in many contemporary big data applications arising in statistics, economics, and finance. While many existing forecasting approaches enjoy the former to some extent, the latter, in the sense of controlling the fraction of wrongly discovered features, which can greatly enhance interpretability, remains largely underdeveloped. To this end, in this paper we exploit the general framework of model-X knockoffs introduced recently in Candès, Fan, Janson, and Lv (2018), which is unconventional for reproducible large-scale inference in that it is completely free of the use of p-values for significance testing, and suggest a new method of intertwined probabilistic factors decoupling (IPAD) for stable interpretable forecasting with knockoffs inference in high-dimensional models. The recipe of the method is to construct the knockoff variables by assuming a latent factor model, widely exploited in economics and finance, for the association structure of the covariates. Our method and work are distinct from the existing literature in several respects: we estimate the covariate distribution from data instead of assuming it is known when constructing the knockoff variables; our procedure does not require any sample splitting; we provide theoretical justification for asymptotic false discovery rate control; and we also establish the theory for the power analysis. Several simulation examples and a real data analysis further demonstrate that the newly suggested method has appealing finite-sample performance with the desired interpretability and stability compared with some popularly used forecasting methods.
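The abstract's recipe can be illustrated with a minimal sketch: estimate the common factor component of the covariates by PCA, build knockoffs that share this component but carry freshly drawn idiosyncratic errors, and then apply the standard knockoffs+ selection rule with Lasso coefficient-difference statistics. This is an assumption-laden toy version, not the authors' implementation: the number of factors `r` is taken as known, the idiosyncratic errors are treated as independent Gaussians, and all sizes and names are illustrative.

```python
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)

# --- Simulate covariates from a latent factor model (illustrative sizes) ---
n, p, r, s = 200, 50, 3, 5            # samples, covariates, factors, true signals
F = rng.normal(size=(n, r))            # latent factors
Lam = rng.normal(size=(p, r))          # factor loadings
E = rng.normal(scale=0.5, size=(n, p)) # idiosyncratic errors
X = F @ Lam.T + E
beta = np.zeros(p); beta[:s] = 2.0
y = X @ beta + rng.normal(size=n)

# --- Step 1: estimate the common component by a rank-r SVD (PCA) ---
U, d, Vt = np.linalg.svd(X, full_matrices=False)
C_hat = U[:, :r] * d[:r] @ Vt[:r]      # estimated factor component
E_hat = X - C_hat                      # estimated idiosyncratic part

# --- Step 2: knockoffs = same common part + resampled errors ---
# Gaussian assumption on the errors; scale estimated per covariate.
sigma_hat = E_hat.std(axis=0)
X_ko = C_hat + rng.normal(size=(n, p)) * sigma_hat

# --- Step 3: Lasso on the augmented design, coefficient-difference stats ---
XX = np.hstack([X, X_ko])
coef = LassoCV(cv=5).fit(XX, y).coef_
W = np.abs(coef[:p]) - np.abs(coef[p:])

# --- Step 4: knockoffs+ threshold at target FDR level q ---
q = 0.2
T = np.inf
for t in np.sort(np.abs(W[W != 0])):
    fdp_hat = (1 + np.sum(W <= -t)) / max(1, np.sum(W >= t))
    if fdp_hat <= q:
        T = t
        break
selected = np.flatnonzero(W >= T)      # indices of selected covariates
```

The factor structure is what makes the knockoff construction feasible here: rather than knowing the full joint distribution of the covariates, one only needs consistent estimates of the common component and of the error distribution.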

Keywords:  Large-scale inference and FDR; Latent factors; Model-X knockoffs; Power; Reproducibility; Stability

Year:  2019        PMID: 33716359      PMCID: PMC7954402          DOI: 10.1080/01621459.2019.1654878

Source DB:  PubMed          Journal:  J Am Stat Assoc        ISSN: 0162-1459            Impact factor:   5.033


  4 in total

1.  High Dimensional Classification Using Features Annealed Independence Rules.

Authors:  Jianqing Fan; Yingying Fan
Journal:  Ann Stat       Date:  2008       Impact factor: 4.028

2.  Estimating False Discovery Proportion Under Arbitrary Covariance Dependence.

Authors:  Jianqing Fan; Xu Han; Weijie Gu
Journal:  J Am Stat Assoc       Date:  2012       Impact factor: 5.033

3.  Nonuniformity of P-values Can Occur Early in Diverging Dimensions.

Authors:  Yingying Fan; Emre Demirkaya; Jinchi Lv
Journal:  J Mach Learn Res       Date:  2019       Impact factor: 5.177

  3 in total

1.  DeepLINK: Deep learning inference using knockoffs with applications to genomics.

Authors:  Zifan Zhu; Yingying Fan; Yinfei Kong; Jinchi Lv; Fengzhu Sun
Journal:  Proc Natl Acad Sci U S A       Date:  2021-09-07       Impact factor: 11.205

2.  Asymptotic Theory of Eigenvectors for Random Matrices with Diverging Spikes.

Authors:  Jianqing Fan; Yingying Fan; Xiao Han; Jinchi Lv
Journal:  J Am Stat Assoc       Date:  2020-12-08       Impact factor: 4.369

3.  Null-free False Discovery Rate Control Using Decoy Permutations.

Authors:  Kun He; Meng-Jie Li; Yan Fu; Fu-Zhou Gong; Xiao-Ming Sun
Journal:  Acta Math Appl Sin       Date:  2022-04-09       Impact factor: 1.102

