
From Winner-Takes-All to Winners-Share-All: Exploiting the Information Capacity in Temporal Codes.

Melika Payvand, Luke Theogarajan.

Abstract

In this letter, we implement and compare two neural coding algorithms in networks of spiking neurons: winner-takes-all (WTA) and winners-share-all (WSA). WSA exploits the code space provided by the temporal code by training a different combination of k out of n neurons to fire together in response to different patterns, whereas WTA uses one-hot coding, with a single neuron responding to each distinct pattern. For WSA, the value of k that maximizes the information capacity of n output neurons was theoretically determined and used. As a small proof-of-concept classification problem, a spiking neural network was trained with both algorithms to classify 14 letters of the English alphabet at an image size of 15 × 15 pixels. In both schemes, a modified spike-timing-dependent plasticity (STDP) learning rule trains the spiking neurons in an unsupervised fashion. We compare the classification performance and the number of output neurons required by the two algorithms. We show that by tolerating a small drop in accuracy (84% for WSA versus 91% for WTA), the number of output neurons can be reduced by more than a factor of two, and that this reduction grows as the number of patterns increases. Fewer output neurons proportionally reduce the number of training parameters, which requires less memory and hence speeds up the computation, and, in a neuromorphic silicon implementation, takes up much less area.
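The neuron-count argument in the abstract can be made concrete with a short sketch (not from the paper's code; the symbol names k and n, and the helper function names, are assumptions): a one-hot WTA code with n neurons distinguishes only n patterns, while a k-of-n WSA code distinguishes C(n, k) patterns, which is maximized near k = n/2.

```python
# Hedged sketch: code-space sizes of WTA (one-hot) vs WSA (k-of-n) output codes,
# following the abstract's argument. Function names are illustrative assumptions.
from math import comb

def wta_capacity(n):
    """One-hot coding: n output neurons distinguish exactly n patterns."""
    return n

def wsa_capacity(n, k):
    """k-of-n coding: number of distinct k-subsets of n output neurons."""
    return comb(n, k)

def min_wsa_neurons(num_patterns):
    """Smallest n whose best k-of-n code (k = n // 2, where C(n, k) peaks)
    covers num_patterns distinct patterns."""
    n = 1
    while comb(n, n // 2) < num_patterns:
        n += 1
    return n

# Worked example matching the abstract's 14 letter classes:
patterns = 14
n_wta = patterns                    # WTA: one neuron per pattern -> 14 neurons
n_wsa = min_wsa_neurons(patterns)   # C(6, 3) = 20 >= 14 -> 6 neurons
print(n_wta, n_wsa)                 # WSA cuts neurons by more than a factor of two
```

With 14 patterns this gives 14 WTA neurons versus 6 WSA neurons (since C(6, 3) = 20 ≥ 14), consistent with the abstract's claimed reduction by more than a factor of two; the gap widens as the pattern count grows, because C(n, n/2) grows exponentially in n.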

Year:  2017        PMID: 29220307     DOI: 10.1162/neco_a_01047

Source DB:  PubMed          Journal:  Neural Comput        ISSN: 0899-7667            Impact factor:   2.026


Citations: 1 in total

1.  Self-organization of an inhomogeneous memristive hardware for sequence learning.

Authors:  Melika Payvand; Filippo Moro; Kumiko Nomura; Thomas Dalgaty; Elisa Vianello; Yoshifumi Nishi; Giacomo Indiveri
Journal:  Nat Commun       Date:  2022-10-02       Impact factor: 17.694
