
Statistical guarantees for regularized neural networks.

Mahsa Taheri, Fang Xie, Johannes Lederer.

Abstract

Neural networks have become standard tools in the analysis of data, but they lack comprehensive mathematical theories. For example, there are very few statistical guarantees for learning neural networks from data, especially for classes of estimators that are used in practice or that at least resemble those. In this paper, we develop a general statistical guarantee for estimators that consist of a least-squares term and a regularizer. We then exemplify this guarantee with ℓ1-regularization, showing that the corresponding prediction error increases at most logarithmically in the total number of parameters and can even decrease in the number of layers. Our results establish a mathematical basis for regularized estimation of neural networks, and they deepen our mathematical understanding of neural networks and deep learning more generally.
Copyright © 2021 Elsevier Ltd. All rights reserved.
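
To make the estimator class described in the abstract concrete, the following is a minimal sketch in assumed notation (the symbols 𝒢 for the network class, λ for the tuning parameter, and r for the regularizer are illustrative choices, not necessarily the paper's own):

\[
  \widehat{g} \in \operatorname*{arg\,min}_{g \in \mathcal{G}}
    \Bigl\{ \sum_{i=1}^{n} \bigl(y_i - g(x_i)\bigr)^2 + \lambda\, r(g) \Bigr\},
  \qquad
  r(g) = \lVert \text{weights of } g \rVert_1 \ \ (\ell_1\text{-regularization example}).
\]

The PyTorch sketch below shows how such an ℓ1-regularized least-squares objective could be optimized in practice; the architecture, the tuning parameter lam, and the helper l1_penalty are assumptions made for illustration, not the authors' implementation.

import torch
import torch.nn as nn

# Illustrative sketch only: least-squares loss plus an l1 penalty on all weights.
# The network architecture and the tuning parameter `lam` are assumptions.
def l1_penalty(model: nn.Module) -> torch.Tensor:
    return sum(p.abs().sum() for p in model.parameters())

def train_step(model, optimizer, x, y, lam=1e-3):
    optimizer.zero_grad()
    residuals = y - model(x)                                   # least-squares term
    loss = residuals.pow(2).sum() + lam * l1_penalty(model)    # plus l1 regularizer
    loss.backward()
    optimizer.step()
    return loss.item()

# Example usage with a small feed-forward network and random data.
model = nn.Sequential(nn.Linear(10, 64), nn.ReLU(), nn.Linear(64, 1))
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)
x, y = torch.randn(32, 10), torch.randn(32, 1)
train_step(model, optimizer, x, y)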

Keywords:  Deep learning; Neural networks; Prediction guarantees; Regularization

Year:  2021        PMID: 34000562     DOI: 10.1016/j.neunet.2021.04.034

Source DB:  PubMed          Journal:  Neural Netw        ISSN: 0893-6080


  1 in total

1.  Analytic Function Approximation by Path-Norm-Regularized Deep Neural Networks.

Authors:  Aleksandr Beknazaryan
Journal:  Entropy (Basel)       Date:  2022-08-16       Impact factor: 2.738

