Sparse neural networks with large learning diversity.

Vincent Gripon, Claude Berrou.

Abstract

Coded recurrent neural networks with three levels of sparsity are introduced. The first level is related to the size of the messages, which is much smaller than the number of available neurons. The second is provided by a particular coding rule that acts as a local constraint on the neural activity. The third is a characteristic of the network's low final connection density after the learning phase. Though the proposed network is very simple, being based on binary neurons and binary connections, it is able to learn a large number of messages and recall them, even in the presence of strong erasures. The performance of the network is assessed both as a classifier and as an associative memory.
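For intuition, the mechanism the abstract describes can be sketched as a clique-based binary associative memory: a message selects one neuron per cluster, learning fully interconnects the selected neurons, and an erased symbol is recovered by a winner-take-all vote inside its cluster. The sketch below is illustrative only; the cluster count C, cluster size L, the function names, and the single-pass retrieval rule are assumptions for demonstration, not details taken verbatim from the paper (which uses iterative decoding).

    import numpy as np

    # Illustrative parameters (assumed, not from the paper): C clusters of
    # L neurons each; a message is C symbols, one neuron active per cluster.
    C = 4
    L = 16
    N = C * L

    # Binary connection matrix: sparsity level three in the abstract refers
    # to this matrix staying sparsely populated after learning.
    W = np.zeros((N, N), dtype=bool)

    def neuron(cluster, symbol):
        # Index of the neuron representing `symbol` in `cluster`.
        return cluster * L + symbol

    def store(message):
        # Learn a message by fully interconnecting its C neurons (a clique).
        idx = [neuron(c, s) for c, s in enumerate(message)]
        for i in idx:
            for j in idx:
                if i != j:
                    W[i, j] = True

    def retrieve(partial):
        # `partial` has erased positions set to None. Each erased symbol is
        # recovered by scoring every candidate neuron in its cluster by the
        # number of connections from the known neurons, then taking the
        # winner (single-pass simplification of the paper's iterative rule).
        known = [neuron(c, s) for c, s in enumerate(partial) if s is not None]
        recovered = list(partial)
        for c, s in enumerate(partial):
            if s is None:
                scores = [W[known, neuron(c, cand)].sum() for cand in range(L)]
                recovered[c] = int(np.argmax(scores))
        return recovered

    store([3, 7, 1, 12])
    print(retrieve([3, None, 1, None]))  # -> [3, 7, 1, 12]

Storing [3, 7, 1, 12] and querying [3, None, 1, None] recovers the full message, illustrating recall under erasures; the network in the paper additionally iterates this decoding and stores many messages at once, which is where the three levels of sparsity matter.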

Year:  2011        PMID: 21652285     DOI: 10.1109/TNN.2011.2146789

Source DB:  PubMed          Journal:  IEEE Trans Neural Netw        ISSN: 1045-9227


Related articles: 2 in total

1.  Technical note: an R package for fitting sparse neural networks with application in animal breeding.

Authors:  Yangfan Wang; Xue Mi; Guilherme J M Rosa; Zhihui Chen; Ping Lin; Shi Wang; Zhenmin Bao
Journal:  J Anim Sci       Date:  2018-05-04       Impact factor: 3.159

2.  Robust Exponential Memory in Hopfield Networks.

Authors:  Christopher J Hillar; Ngoc M Tran
Journal:  J Math Neurosci       Date:  2018-01-16       Impact factor: 1.300

