
Smooth Function Approximation by Deep Neural Networks with General Activation Functions.

Ilsang Ohn, Yongdai Kim.

Abstract

There has been growing interest in the expressivity of deep neural networks. However, most existing work on this topic focuses only on specific activation functions such as ReLU or the sigmoid. In this paper, we investigate the approximation ability of deep neural networks with a broad class of activation functions that includes most frequently used ones. We derive the depth, width and sparsity required for a deep neural network to approximate any Hölder smooth function up to a given approximation error, for this large class of activation functions. Based on our approximation error analysis, we derive the minimax optimality of deep neural network estimators with general activation functions in both regression and classification problems.
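A minimal sketch of the kind of result the abstract describes (this is an illustration, not the paper's construction): a one-hidden-layer ReLU network can exactly realize the piecewise-linear interpolant of a smooth target on a uniform grid, so its sup-norm approximation error shrinks as the width grows. The helper name `relu_interpolant` and the target `sin` are choices made here for the example.

```python
# Illustration (not from the paper): a width-n ReLU network realizes the
# piecewise-linear interpolant of a smooth function on [0, 1], and the
# sup-norm error decreases as the width n grows.
import math

def relu(x):
    return max(x, 0.0)

def relu_interpolant(f, n):
    """Width-n ReLU network equal to the linear interpolant of f on [0, 1]."""
    xs = [i / n for i in range(n + 1)]          # knots
    ys = [f(x) for x in xs]                     # target values at the knots
    slopes = [(ys[i + 1] - ys[i]) * n for i in range(n)]
    # f_hat(x) = ys[0] + sum_k c_k * relu(x - xs[k]); each c_k is the
    # change of slope at knot xs[k], so f_hat matches the interpolant.
    coeffs = [slopes[0]] + [slopes[i] - slopes[i - 1] for i in range(1, n)]
    def f_hat(x):
        return ys[0] + sum(c * relu(x - xs[k]) for k, c in enumerate(coeffs))
    return f_hat

grid = [i / 1000 for i in range(1001)]
errors = []
for n in (4, 16, 64):
    f_hat = relu_interpolant(math.sin, n)
    errors.append(max(abs(math.sin(x) - f_hat(x)) for x in grid))
# For this smooth (C^2) target the sup error decays roughly like n^(-2);
# the paper's results give analogous depth/width/error trade-offs for
# general Hölder smooth targets and general activation functions.
```

The same interpolation idea underlies many ReLU approximation bounds (e.g. Yarotsky 2017 in the reference list below); the paper extends such rates beyond ReLU to a broad class of activations.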

Keywords:  Hölder continuity; activation functions; convergence rates; deep neural networks; function approximation

Year:  2019        PMID: 33267341      PMCID: PMC7515121          DOI: 10.3390/e21070627

Source DB:  PubMed          Journal:  Entropy (Basel)        ISSN: 1099-4300            Impact factor:   2.524


References:  3 in total

1.  Deep learning. (Review)

Authors:  Yann LeCun; Yoshua Bengio; Geoffrey Hinton
Journal:  Nature       Date:  2015-05-28       Impact factor: 49.962

2.  Error bounds for approximations with deep ReLU networks.

Authors:  Dmitry Yarotsky
Journal:  Neural Netw       Date:  2017-07-13

3.  Optimal approximation of piecewise smooth functions using deep ReLU neural networks.

Authors:  Philipp Petersen; Felix Voigtlaender
Journal:  Neural Netw       Date:  2018-09-07
Cited by:  1 in total

1.  Development of an IoT Architecture Based on a Deep Neural Network against Cyber Attacks for Automated Guided Vehicles.

Authors:  Mahmoud Elsisi; Minh-Quang Tran
Journal:  Sensors (Basel)       Date:  2021-12-18       Impact factor: 3.576


Beijing Coyote Bioscience Co., Ltd. © 2022-2023.