| Literature DB >> 28482227 |
Věra Kůrková, Marcello Sanguineti.
Abstract
Limitations of approximation capabilities of shallow perceptron networks are investigated. Lower bounds on approximation errors are derived for binary-valued functions on finite domains. It is proven that unless the number of network units is sufficiently large (larger than any polynomial of the logarithm of the size of the domain), a good approximation cannot be achieved for almost any uniformly randomly chosen function on a given domain. The results are obtained by combining probabilistic Chernoff–Hoeffding bounds with estimates of the sizes of sets of functions exactly computable by shallow networks with increasing numbers of units.
Keywords: Chernoff–Hoeffding bounds; Lower bounds on approximation rates; Model complexity; Perceptrons; Shallow networks
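The concentration step behind such results can be illustrated numerically: a uniformly random binary-valued function on a domain of size m has Hamming distance close to m/2 from any fixed function, and Hoeffding's inequality bounds the probability of a large deviation by exp(-2mε²). The sketch below is illustrative only and not from the paper; the domain size, deviation ε, and the choice of the fixed reference function are hypothetical parameters.

```python
import math
import random

def hamming_dist(f, g):
    """Hamming distance between two binary-valued functions given as lists."""
    return sum(1 for a, b in zip(f, g) if a != b)

def deviation_prob(m, eps, trials, seed=0):
    """Empirical probability that a uniformly random f: D -> {0,1}, |D| = m,
    lies within Hamming distance (1/2 - eps) * m of a fixed function g
    (here g = 0, an arbitrary illustrative choice)."""
    rng = random.Random(seed)
    g = [0] * m
    hits = 0
    for _ in range(trials):
        f = [rng.randint(0, 1) for _ in range(m)]
        if hamming_dist(f, g) <= (0.5 - eps) * m:
            hits += 1
    return hits / trials

m, eps = 1000, 0.1
empirical = deviation_prob(m, eps, trials=2000)
# One-sided Hoeffding bound: P(dist(f, g) <= (1/2 - eps) * m) <= exp(-2 * m * eps**2)
hoeffding = math.exp(-2 * m * eps ** 2)
print(empirical, "<=", hoeffding)
```

A union bound over the (comparatively small) set of functions computable by a shallow network then shows that, with high probability, a random function is far from all of them, which is the shape of the lower-bound argument the abstract describes.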
Year: 2017 PMID: 28482227 DOI: 10.1016/j.neunet.2017.04.003
Source DB: PubMed Journal: Neural Netw ISSN: 0893-6080