
An Online Structural Plasticity Rule for Generating Better Reservoirs.

Subhrajit Roy, Arindam Basu.

Abstract

In this letter, we propose a novel neuro-inspired low-resolution online unsupervised learning rule to train the reservoir or liquid of liquid state machines. The liquid is a sparsely interconnected, large recurrent network of spiking neurons. The proposed learning rule is inspired by structural plasticity and trains the liquid by forming and eliminating synaptic connections. Hence, the learning involves rewiring of the reservoir connections, similar to the structural plasticity observed in biological neural networks. The network connections can be stored as a connection matrix and updated in memory by using address event representation (AER) protocols, which are generally employed in neuromorphic systems. On investigating the pairwise separation property, we find that trained liquids provide 1.36 ± 0.18 times more interclass separation while retaining similar intraclass separation as compared to random liquids. Moreover, analysis of the linear separation property reveals that trained liquids are 2.05 ± 0.27 times better than random liquids. Furthermore, we show that our liquids are able to retain the generalization ability and generality of random liquids. A memory analysis shows that trained liquids have 83.67 ± 5.79 ms longer fading memory than random liquids, which have shown 92.8 ± 5.03 ms fading memory for a particular type of spike train inputs. We also shed some light on the dynamics of the evolution of recurrent connections within the liquid. Moreover, compared to separation-driven synaptic modification, a recently proposed algorithm for iteratively refining reservoirs, our learning rule provides 9.30%, 15.21%, and 12.52% more liquid separations and 2.8%, 9.1%, and 7.9% better classification accuracies for 4, 8, and 12 class pattern recognition tasks, respectively.
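The core idea described above, training a reservoir by forming and eliminating synaptic connections in a stored connection matrix, can be sketched in a few lines. The following is an illustrative sketch only, not the rule from the paper: the activity-threshold conditions, the `form_rate`/`prune_rate` parameters, and the function name `rewire_step` are all hypothetical placeholders standing in for whatever criterion the actual learning rule uses to decide which synapses to add or remove.

```python
import random

def rewire_step(conn, activity, form_rate=0.01, prune_rate=0.01, seed=None):
    """One hypothetical structural-plasticity update on a binary reservoir
    connection matrix ``conn`` (conn[i][j] = 1 means neuron j projects to
    neuron i). ``activity`` holds a normalized firing measure per neuron.
    Synapses from inactive neurons may be pruned; synapses from highly
    active neurons may be formed. Thresholds and rates are illustrative."""
    rng = random.Random(seed)
    n = len(conn)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue  # no self-connections
            if conn[i][j] and activity[j] < 0.2 and rng.random() < prune_rate:
                conn[i][j] = 0  # eliminate a synapse with an inactive source
            elif not conn[i][j] and activity[j] > 0.8 and rng.random() < form_rate:
                conn[i][j] = 1  # form a synapse from a highly active source
    return conn
```

Because only connection existence (not analog weights) changes, each update is a low-resolution, memory-friendly edit to the connection matrix, which is what makes such a rule compatible with AER-based neuromorphic hardware as the abstract notes.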

Year:  2016        PMID: 27626967     DOI: 10.1162/NECO_a_00886

Source DB:  PubMed          Journal:  Neural Comput        ISSN: 0899-7667            Impact factor:   2.026


  4 in total

1.  SpiLinC: Spiking Liquid-Ensemble Computing for Unsupervised Speech and Image Recognition.

Authors:  Gopalakrishnan Srinivasan; Priyadarshini Panda; Kaushik Roy
Journal:  Front Neurosci       Date:  2018-08-23       Impact factor: 4.677

2.  Analysis of Liquid Ensembles for Enhancing the Performance and Accuracy of Liquid State Machines.

Authors:  Parami Wijesinghe; Gopalakrishnan Srinivasan; Priyadarshini Panda; Kaushik Roy
Journal:  Front Neurosci       Date:  2019-05-28       Impact factor: 4.677

3.  Learning to Recognize Actions From Limited Training Examples Using a Recurrent Spiking Neural Model.

Authors:  Priyadarshini Panda; Narayan Srinivasa
Journal:  Front Neurosci       Date:  2018-03-02       Impact factor: 4.677

4.  Bio-Inspired Evolutionary Model of Spiking Neural Networks in Ionic Liquid Space.

Authors:  Ensieh Iranmehr; Saeed Bagheri Shouraki; Mohammad Mahdi Faraji; Nasim Bagheri; Bernabe Linares-Barranco
Journal:  Front Neurosci       Date:  2019-11-08       Impact factor: 4.677

