Probabilistic lower bounds for approximation by shallow perceptron networks.

Věra Kůrková, Marcello Sanguineti.

Abstract

Limitations of the approximation capabilities of shallow perceptron networks are investigated. Lower bounds on approximation errors are derived for binary-valued functions on finite domains. It is proven that unless the number of network units is sufficiently large (larger than any polynomial of the logarithm of the size of the domain), a good approximation cannot be achieved for almost any uniformly randomly chosen function on the domain. The results are obtained by combining probabilistic Chernoff-Hoeffding bounds with estimates of the sizes of sets of functions exactly computable by shallow networks with increasing numbers of units.
Copyright © 2017 Elsevier Ltd. All rights reserved.
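The probabilistic argument summarized in the abstract can be sketched numerically. This is a minimal, hedged illustration (function names and parameters are illustrative, not from the paper): the normalized Hamming distance of a uniformly random binary-valued function on an n-point domain from any fixed function concentrates near 1/2 by a Chernoff-Hoeffding bound, and a union bound over the set of functions a network can compute then shows that a small network approximates almost no random function well.

```python
# Illustrative sketch of the union-bound + Chernoff-Hoeffding argument.
# All names and parameter choices are hypothetical, for exposition only.
import math

def hoeffding_tail(n, r):
    """Upper bound on P[random binary f is within Hamming distance r of a
    fixed binary g on an n-point domain].

    Each point disagrees with g independently with probability 1/2, so the
    disagreement count is Binomial(n, 1/2); Hoeffding's inequality bounds
    the probability of falling r or more below the mean n/2.
    """
    t = 0.5 - r / n  # deviation below the mean fraction of disagreements
    if t <= 0:
        return 1.0   # distance r at or above the mean: no nontrivial bound
    return math.exp(-2 * n * t * t)

def approx_probability_bound(n, num_computable, r):
    """Union bound over a set G of functions computable by the network:
    P[some g in G is within distance r of a random f] <= |G| * tail."""
    return min(1.0, num_computable * hoeffding_tail(n, r))
```

For example, on a domain of size n = 2^20, even a class of 2^1000 computable functions has vanishing probability of approximating a random function to within a quarter of the points, since the Hoeffding tail decays exponentially in n while the class size is only exponential in the number of units.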

Keywords:  Chernoff–Hoeffding bounds; Lower bounds on approximation rates; Model complexity; Perceptrons; Shallow networks

Year:  2017        PMID: 28482227     DOI: 10.1016/j.neunet.2017.04.003

Source DB:  PubMed          Journal:  Neural Netw        ISSN: 0893-6080


  3 in total

1. (Review) Blessing of dimensionality: mathematical foundations of the statistical physics of data.

Authors:  A N Gorban; I Y Tyukin
Journal:  Philos Trans A Math Phys Eng Sci       Date:  2018-04-28       Impact factor: 4.226

2.  Universal approximation with quadratic deep networks.

Authors:  Fenglei Fan; Jinjun Xiong; Ge Wang
Journal:  Neural Netw       Date:  2020-01-18

3.  The use of back propagation neural networks and 18F-Florbetapir PET for early detection of Alzheimer's disease using Alzheimer's Disease Neuroimaging Initiative database.

Authors:  Ilker Ozsahin; Boran Sekeroglu; Greta S P Mok
Journal:  PLoS One       Date:  2019-12-26       Impact factor: 3.240

