
Re-visiting the echo state property.

Izzet B Yildiz, Herbert Jaeger, Stefan J Kiebel.

Abstract

An echo state network (ESN) consists of a large, randomly connected neural network, the reservoir, which is driven by an input signal and projects to output units. During training, only the connections from the reservoir to these output units are learned. A key requisite for output-only training is the echo state property (ESP), which means that the effect of initial conditions should vanish as time passes. In this paper, we use analytical examples to show that a widely used criterion for the ESP, the spectral radius of the weight matrix being smaller than unity, is not sufficient to satisfy the echo state property. We obtain these examples by investigating local bifurcation properties of the standard ESNs. Moreover, we provide new sufficient conditions for the echo state property of standard sigmoid and leaky integrator ESNs. We furthermore suggest an improved technical definition of the echo state property, and discuss what practitioners should (and should not) observe when they optimize their reservoirs for specific tasks.
Copyright © 2012 Elsevier Ltd. All rights reserved.
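The echo state property described in the abstract can be demonstrated numerically. The sketch below is an illustration, not the paper's code: it uses a classical sufficient condition for tanh reservoirs (largest singular value of the weight matrix below 1, here scaled to 0.9), drives two different initial reservoir states with the same input sequence, and checks that their difference washes out. The reservoir size (50) and input length (200) are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50  # reservoir size (arbitrary for this demo)

# Random reservoir weights, rescaled so the largest singular value is 0.9 < 1.
# This is a sufficient (not necessary) condition for the echo state property.
W = rng.standard_normal((n, n))
W *= 0.9 / np.linalg.norm(W, 2)
w_in = rng.standard_normal(n)  # input weights

def run(x0, inputs):
    """Drive a tanh reservoir from initial state x0 with a scalar input sequence."""
    x = x0.copy()
    for u in inputs:
        x = np.tanh(W @ x + w_in * u)
    return x

inputs = rng.standard_normal(200)
x_a = run(rng.standard_normal(n), inputs)  # two different random initial states,
x_b = run(rng.standard_normal(n), inputs)  # same input sequence

residual = np.linalg.norm(x_a - x_b)
print(residual)  # near zero: the effect of initial conditions has vanished
```

Note that scaling the *spectral radius* (largest eigenvalue magnitude) to 0.9 instead would not, by the paper's counterexamples, guarantee this convergence for every input; the singular-value bound used above does.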


Year:  2012        PMID: 22885243     DOI: 10.1016/j.neunet.2012.07.005

Source DB:  PubMed          Journal:  Neural Netw        ISSN: 0893-6080


Related articles: 11 in total

1.  Characterization of the non-stationary nature of steady-state visual evoked potentials using echo state networks.

Authors:  David Ibáñez-Soria; Aureli Soria-Frisch; Jordi Garcia-Ojalvo; Giulio Ruffini
Journal:  PLoS One       Date:  2019-07-05       Impact factor: 3.240

2.  Prediction of chaotic time series using recurrent neural networks and reservoir computing techniques: A comparative study.

Authors:  Shahrokh Shahi; Flavio H Fenton; Elizabeth M Cherry
Journal:  Mach Learn Appl       Date:  2022-04-09

3.  Learning Universal Computations with Spikes.

Authors:  Dominik Thalmeier; Marvin Uhlmann; Hilbert J Kappen; Raoul-Martin Memmesheimer
Journal:  PLoS Comput Biol       Date:  2016-06-16       Impact factor: 4.475

4.  Computing with networks of nonlinear mechanical oscillators.

Authors:  Jean C Coulombe; Mark C A York; Julien Sylvestre
Journal:  PLoS One       Date:  2017-06-02       Impact factor: 3.240

5.  Echo State Networks with Self-Normalizing Activations on the Hyper-Sphere.

Authors:  Pietro Verzelli; Cesare Alippi; Lorenzo Livi
Journal:  Sci Rep       Date:  2019-09-25       Impact factor: 4.379

6.  Tailoring Echo State Networks for Optimal Learning.

Authors:  Pau Vilimelis Aceituno; Gang Yan; Yang-Yu Liu
Journal:  iScience       Date:  2020-08-06

7.  Model-size reduction for reservoir computing by concatenating internal states through time.

Authors:  Yusuke Sakemi; Kai Morino; Timothée Leleu; Kazuyuki Aihara
Journal:  Sci Rep       Date:  2020-12-11       Impact factor: 4.379

8.  Neuronal Sequence Models for Bayesian Online Inference. (Review)

Authors:  Sascha Frölich; Dimitrije Marković; Stefan J Kiebel
Journal:  Front Artif Intell       Date:  2021-05-21

9.  Real-time parallel processing of grammatical structure in the fronto-striatal system: a recurrent network simulation study using reservoir computing.

Authors:  Xavier Hinaut; Peter Ford Dominey
Journal:  PLoS One       Date:  2013-02-01       Impact factor: 3.240

10.  Multiplex visibility graphs to investigate recurrent neural network dynamics.

Authors:  Filippo Maria Bianchi; Lorenzo Livi; Cesare Alippi; Robert Jenssen
Journal:  Sci Rep       Date:  2017-03-10       Impact factor: 4.379
