
Activation function dependence of the storage capacity of treelike neural networks.

Jacob A Zavatone-Veth, Cengiz Pehlevan

Abstract

The expressive power of artificial neural networks crucially depends on the nonlinearity of their activation functions. Though a wide variety of nonlinear activation functions have been proposed for use in artificial neural networks, a detailed understanding of their role in determining the expressive power of a network has not emerged. Here, we study how activation functions affect the storage capacity of treelike two-layer networks. We relate the boundedness or divergence of the capacity in the infinite-width limit to the smoothness of the activation function, elucidating the relationship between previously studied special cases. Our results show that nonlinearity can both increase capacity and decrease the robustness of classification, and provide simple estimates for the capacity of networks with several commonly used activation functions. Furthermore, they generate a hypothesis for the functional benefit of dendritic spikes in branched neurons.
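The abstract's central quantity, storage capacity, is the maximal number of random input-label pairs per synapse that a network can classify correctly. As a minimal illustration of the concept (not the paper's treelike-network calculation), the sketch below numerically estimates the classical single-perceptron capacity alpha_c = 2 by checking, via perceptron learning, what fraction of random dichotomies is linearly separable at different loads alpha = P/N. All function names are hypothetical and chosen for this example.

```python
import numpy as np


def is_separable(X, y, max_epochs=1000):
    """Run the perceptron learning rule; return True if a separating
    weight vector is found within max_epochs passes over the data."""
    P, N = X.shape
    w = np.zeros(N)
    for _ in range(max_epochs):
        errors = 0
        for mu in range(P):
            if y[mu] * (X[mu] @ w) <= 0:  # misclassified (or on the boundary)
                w += y[mu] * X[mu]        # classic perceptron update
                errors += 1
        if errors == 0:
            return True
    return False


def separable_fraction(N, alpha, trials=20, seed=0):
    """Estimate the probability that P = alpha * N random Gaussian patterns
    with random +/-1 labels are linearly separable."""
    rng = np.random.default_rng(seed)
    P = int(alpha * N)
    hits = 0
    for _ in range(trials):
        X = rng.standard_normal((P, N))
        y = rng.choice([-1.0, 1.0], size=P)
        if is_separable(X, y):
            hits += 1
    return hits / trials


if __name__ == "__main__":
    N = 20
    # Below capacity (alpha = 1 < 2): almost every dichotomy is separable.
    # Above capacity (alpha = 3 > 2): almost none are.
    print("alpha=1:", separable_fraction(N, 1.0))
    print("alpha=3:", separable_fraction(N, 3.0))
```

The sharp drop in the separable fraction around alpha = 2 is the capacity transition; the paper studies how this threshold changes when the single unit is replaced by a treelike two-layer network with a nonlinear activation function.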

Year:  2021        PMID: 33736039     DOI: 10.1103/PhysRevE.103.L020301

Source DB:  PubMed          Journal:  Phys Rev E        ISSN: 2470-0045            Impact factor:   2.529


  1 in total

1.  Nonideality-Aware Training for Accurate and Robust Low-Power Memristive Neural Networks.

Authors:  Dovydas Joksas; Erwei Wang; Nikolaos Barmpatsalos; Wing H Ng; Anthony J Kenyon; George A Constantinides; Adnan Mehonic
Journal:  Adv Sci (Weinh)       Date:  2022-05-04       Impact factor: 17.521

