
Collective behavior of a small-world recurrent neural system with scale-free distribution.

Zhidong Deng, Yi Zhang.

Abstract

This paper proposes a scale-free highly clustered echo state network (SHESN). We designed the SHESN to include a naturally evolving state reservoir built according to incremental growth rules that account for the following features: (1) short characteristic path length, (2) high clustering coefficient, (3) scale-free distribution, and (4) hierarchical and distributed architecture. This new state reservoir contains a large number of internal neurons that are sparsely interconnected in the form of domains. Each domain comprises one backbone neuron and a number of local neurons around that backbone. Such a natural and efficient recurrent neural system essentially interpolates between the completely regular Elman network and the completely random echo state network (ESN) proposed by Jaeger et al. We investigated the collective characteristics of the proposed complex network model and successfully applied it to challenging problems such as the Mackey-Glass (MG) dynamic system and laser time-series prediction. Our experimental results show that, compared to the ESN, the SHESN model has a significantly enhanced echo state property and better performance in approximating highly complex nonlinear dynamics. In short, this large-scale dynamic complex network reflects natural characteristics of biological neural systems in many aspects, such as the power law, small-world property, and hierarchical architecture, and should offer strong computing power, fast signal propagation, and coherent synchronization.
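The domain structure described in the abstract (one backbone neuron per domain, local neurons clustered around it, long-range links among backbones, and a spectral-radius rescaling to support the echo state property) can be sketched in a few lines of Python. The connection probabilities and growth scheme below are illustrative assumptions, not the authors' published incremental growth algorithm:

```python
import numpy as np

def build_shesn_reservoir(n_domains=10, neurons_per_domain=20,
                          p_local=0.2, p_backbone=0.5,
                          spectral_radius=0.9, seed=0):
    """Sketch of a domain-structured reservoir weight matrix.

    Each domain has one backbone neuron densely linked to its local
    neurons (high clustering); backbones are interconnected with
    long-range shortcuts (short path length). Hypothetical growth
    rule for illustration only.
    """
    rng = np.random.default_rng(seed)
    n = n_domains * neurons_per_domain
    W = np.zeros((n, n))
    backbones = [d * neurons_per_domain for d in range(n_domains)]
    for d in range(n_domains):
        start = d * neurons_per_domain
        bb = backbones[d]
        for i in range(start, start + neurons_per_domain):
            if i == bb:
                continue
            # backbone <-> local connections inside the domain
            W[bb, i] = rng.uniform(-1, 1)
            W[i, bb] = rng.uniform(-1, 1)
            # sparse random links among neurons of the same domain
            for j in range(start, start + neurons_per_domain):
                if j != i and rng.random() < p_local:
                    W[i, j] = rng.uniform(-1, 1)
    # long-range shortcuts between backbone neurons across domains
    for a in backbones:
        for b in backbones:
            if a != b and rng.random() < p_backbone:
                W[a, b] = rng.uniform(-1, 1)
    # rescale so the largest eigenvalue magnitude equals spectral_radius,
    # the usual sufficient-in-practice condition for the echo state property
    W *= spectral_radius / np.abs(np.linalg.eigvals(W)).max()
    return W
```

Note that this sketch yields the intra-domain clustering and backbone shortcuts but not a full scale-free degree distribution, which in the paper emerges from preferential-attachment-style incremental growth.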

Year:  2007        PMID: 18220186     DOI: 10.1109/tnn.2007.894082

Source DB:  PubMed          Journal:  IEEE Trans Neural Netw        ISSN: 1045-9227


  7 in total

1.  Nonlinear system modeling with random matrices: echo state networks revisited.

Authors:  Bai Zhang; David J Miller; Yue Wang
Journal:  IEEE Trans Neural Netw Learn Syst       Date:  2012-01       Impact factor: 10.451

2.  Extending stability through hierarchical clusters in echo state networks.

Authors:  Sarah Jarvis; Stefan Rotter; Ulrich Egert
Journal:  Front Neuroinform       Date:  2010-07-07       Impact factor: 4.081

3.  A priori data-driven multi-clustered reservoir generation algorithm for echo state network.

Authors:  Xiumin Li; Ling Zhong; Fangzheng Xue; Anguo Zhang
Journal:  PLoS One       Date:  2015-04-13       Impact factor: 3.240

4.  The combination of circle topology and leaky integrator neurons remarkably improves the performance of echo state network on time series prediction.

Authors:  Fangzheng Xue; Qian Li; Xiumin Li
Journal:  PLoS One       Date:  2017-07-31       Impact factor: 3.240

5.  Heave compensation prediction based on echo state network with correntropy induced loss function.

Authors:  Xiaogang Huang; Dongge Lei; Lulu Cai; Tianhao Tang; Zhibin Wang
Journal:  PLoS One       Date:  2019-06-13       Impact factor: 3.240

6.  Guiding principle of reservoir computing based on "small-world" network.

Authors:  Ken-Ichi Kitayama
Journal:  Sci Rep       Date:  2022-10-06       Impact factor: 4.996

7.  Long-range temporal correlations in scale-free neuromorphic networks.

Authors:  Shota Shirai; Susant Kumar Acharya; Saurabh Kumar Bose; Joshua Brian Mallinson; Edoardo Galli; Matthew D Pike; Matthew D Arnold; Simon Anthony Brown
Journal:  Netw Neurosci       Date:  2020-04-01
