
On the complexity of computing and learning with multiplicative neural networks.

Michael Schmitt.

Abstract

In a great variety of neuron models, neural inputs are combined using the summing operation. We introduce the concept of multiplicative neural networks that contain units that multiply their inputs instead of summing them and thus allow inputs to interact nonlinearly. The class of multiplicative neural networks comprises such widely known and well-studied network types as higher-order networks and product unit networks. We investigate the complexity of computing and learning for multiplicative neural networks. In particular, we derive upper and lower bounds on the Vapnik-Chervonenkis (VC) dimension and the pseudo-dimension for various types of networks with multiplicative units. As the most general case, we consider feedforward networks consisting of product and sigmoidal units, showing that their pseudo-dimension is bounded from above by a polynomial with the same order of magnitude as the currently best-known bound for purely sigmoidal networks. Moreover, we show that this bound holds even when the unit type, product or sigmoidal, may be learned. Crucial for these results are bounds on the number of solution set components for new network classes. As to lower bounds, we construct product unit networks of fixed depth with superlinear VC dimension. For sigmoidal networks of higher order, we establish polynomial bounds that, in contrast to previous results, do not involve any restriction on the network order. We further consider various classes of higher-order units, also known as sigma-pi units, that are characterized by connectivity constraints. In terms of these, we derive some asymptotically tight bounds. Multiplication plays an important role both in neural modeling of biological behavior and in computing and learning with artificial neural networks. We briefly survey research in biology and in applications where multiplication is considered an essential computational element.
The results we present here provide new tools for assessing the impact of multiplication on the computational power and the learning capabilities of neural networks.
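To make the two unit types discussed in the abstract concrete, here is a minimal sketch of a product unit (which computes a product of weighted powers of its inputs) and a higher-order sigma-pi unit (a weighted sum of input products). The function names, the monomial encoding, and the specific weights are illustrative assumptions, not taken from the paper.

```python
def product_unit(x, w):
    """Product unit (illustrative): instead of a weighted sum,
    it computes the product of inputs raised to their weights,
    output = x_1**w_1 * x_2**w_2 * ... * x_n**w_n."""
    out = 1.0
    for xi, wi in zip(x, w):
        out *= xi ** wi
    return out

def sigma_pi_unit(x, monomials, coeffs):
    """Higher-order (sigma-pi) unit (illustrative): a weighted sum
    of products over selected input indices, e.g. with monomials
    [(0, 1), (2,)] and coeffs [c0, c1] it computes
    c0 * x[0] * x[1] + c1 * x[2]."""
    total = 0.0
    for c, idxs in zip(coeffs, monomials):
        prod = 1.0
        for i in idxs:
            prod *= x[i]
        total += c * prod
    return total

# Example: a product unit with weights (1, 2) on inputs (2, 3)
# computes 2**1 * 3**2 = 18, a nonlinear interaction of its inputs.
print(product_unit([2.0, 3.0], [1.0, 2.0]))   # 18.0
print(sigma_pi_unit([1.0, 2.0, 3.0], [(0, 1), (2,)], [1.0, 2.0]))  # 1*2 + 2*3 = 8.0
```

The connectivity constraints mentioned in the abstract correspond, in this sketch, to restrictions on which index tuples may appear in `monomials` (for instance, bounding the degree of each monomial or the fan-in of the unit).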


Year:  2002        PMID: 11802913     DOI: 10.1162/08997660252741121

Source DB:  PubMed          Journal:  Neural Comput        ISSN: 0899-7667            Impact factor:   2.026


  4 in total

1.  Heaviness perception. IV. Weight × aperture⁻¹ as a heaviness model in finger-grasp perception.

Authors:  Satoru Kawai
Journal:  Exp Brain Res       Date:  2003-09-12       Impact factor: 1.972

2.  Single Neuron for Solving XOR like Nonlinear Problems.

Authors:  Ashutosh Mishra; Jaekwang Cha; Shiho Kim
Journal:  Comput Intell Neurosci       Date:  2022-04-28

3.  An Evolutionary Field Theorem: Evolutionary Field Optimization in Training of Power-Weighted Multiplicative Neurons for Nitrogen Oxides-Sensitive Electronic Nose Applications.

Authors:  Baris Baykant Alagoz; Ozlem Imik Simsek; Davut Ari; Aleksei Tepljakov; Eduard Petlenkov; Hossein Alimohammadi
Journal:  Sensors (Basel)       Date:  2022-05-18       Impact factor: 3.847

4.  Using machine learning methods to determine a typology of patients with HIV-HCV infection to be treated with antivirals.

Authors:  Antonio Rivero-Juárez; David Guijo-Rubio; Francisco Tellez; Rosario Palacios; Dolores Merino; Juan Macías; Juan Carlos Fernández; Pedro Antonio Gutiérrez; Antonio Rivero; César Hervás-Martínez
Journal:  PLoS One       Date:  2020-01-10       Impact factor: 3.240

