
Optimization and applications of echo state networks with leaky-integrator neurons.

Herbert Jaeger, Mantas Lukoševičius, Dan Popovici, Udo Siewert.

Abstract

Standard echo state networks (ESNs) are built from simple additive units with a sigmoid activation function. Here we investigate ESNs whose reservoir units are leaky integrator units. Units of this type have individual state dynamics, which can be exploited in various ways to accommodate the network to the temporal characteristics of a learning task. We present stability conditions, introduce and investigate a stochastic gradient descent method for the optimization of the global learning parameters (input and output feedback scalings, leaking rate, spectral radius) and demonstrate the usefulness of leaky-integrator ESNs for (i) learning very slow dynamic systems and replaying the learnt system at different speeds, (ii) classifying relatively slow and noisy time series (the Japanese Vowels dataset, where we obtain a zero test error rate), and (iii) recognizing strongly time-warped dynamic patterns.
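The reservoir dynamics described in the abstract can be illustrated with a minimal sketch. This assumes the commonly used simplified discrete-time leaky-integrator update x(n+1) = (1 − α)·x(n) + α·tanh(W_in·u(n) + W·x(n)), where α is the leaking rate and W is rescaled to a chosen spectral radius; all numeric values below are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions and global parameters (the quantities the
# paper optimizes: input scaling, leaking rate, spectral radius)
n_inputs, n_reservoir = 1, 100
leaking_rate = 0.3      # alpha: how fast each unit's state leaks
spectral_radius = 0.9   # target spectral radius of the recurrent matrix
input_scaling = 1.0

# Random input and recurrent weights
W_in = (rng.random((n_reservoir, n_inputs)) - 0.5) * 2 * input_scaling
W = rng.random((n_reservoir, n_reservoir)) - 0.5
# Rescale W so its largest eigenvalue magnitude equals spectral_radius
W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))

def update(x, u):
    """One leaky-integrator step: x <- (1 - a) x + a tanh(W_in u + W x)."""
    return (1 - leaking_rate) * x + leaking_rate * np.tanh(W_in @ u + W @ x)

# Drive the reservoir with a toy slow sinusoid
x = np.zeros(n_reservoir)
for t in range(200):
    u = np.array([np.sin(t / 10.0)])
    x = update(x, u)
```

Because each unit blends its previous state with the new activation, a small leaking rate slows the reservoir's effective timescale, which is what lets these networks match very slow or time-warped input dynamics.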


Year:  2007        PMID: 17517495     DOI: 10.1016/j.neunet.2007.04.016

Source DB:  PubMed          Journal:  Neural Netw        ISSN: 0893-6080


Citing articles (28 in total)

1.  [Review] Evolutionary aspects of reservoir computing.

Authors:  Luís F Seoane
Journal:  Philos Trans R Soc Lond B Biol Sci       Date:  2019-06-10       Impact factor: 6.237

2.  A machine-learning approach for long-term prediction of experimental cardiac action potential time series using an autoencoder and echo state networks.

Authors:  Shahrokh Shahi; Flavio H Fenton; Elizabeth M Cherry
Journal:  Chaos       Date:  2022-06       Impact factor: 3.741

3.  Prediction of chaotic time series using recurrent neural networks and reservoir computing techniques: A comparative study.

Authors:  Shahrokh Shahi; Flavio H Fenton; Elizabeth M Cherry
Journal:  Mach Learn Appl       Date:  2022-04-09

4.  Extending stability through hierarchical clusters in echo state networks.

Authors:  Sarah Jarvis; Stefan Rotter; Ulrich Egert
Journal:  Front Neuroinform       Date:  2010-07-07       Impact factor: 4.081

5.  Functional identification of biological neural networks using reservoir adaptation for point processes.

Authors:  Tayfun Gürel; Stefan Rotter; Ulrich Egert
Journal:  J Comput Neurosci       Date:  2009-07-29       Impact factor: 1.621

6.  A reservoir of time constants for memory traces in cortical neurons.

Authors:  Alberto Bernacchia; Hyojung Seo; Daeyeol Lee; Xiao-Jing Wang
Journal:  Nat Neurosci       Date:  2011-02-13       Impact factor: 24.884

7.  Optoelectronic reservoir computing.

Authors:  Y Paquot; F Duport; A Smerieri; J Dambre; B Schrauwen; M Haelterman; S Massar
Journal:  Sci Rep       Date:  2012-02-27       Impact factor: 4.379

8.  Optimal nonlinear information processing capacity in delay-based reservoir computers.

Authors:  Lyudmila Grigoryeva; Julie Henriques; Laurent Larger; Juan-Pablo Ortega
Journal:  Sci Rep       Date:  2015-09-11       Impact factor: 4.379

9.  Multiscale model of an inhibitory network shows optimal properties near bifurcation.

Authors:  Christopher L Buckley; Thomas Nowotny
Journal:  Phys Rev Lett       Date:  2011-06-10       Impact factor: 9.161

10.  Real-time parallel processing of grammatical structure in the fronto-striatal system: a recurrent network simulation study using reservoir computing.

Authors:  Xavier Hinaut; Peter Ford Dominey
Journal:  PLoS One       Date:  2013-02-01       Impact factor: 3.240

