
Robustness analysis of global exponential stability of recurrent neural networks in the presence of time delays and random disturbances.

Yi Shen, Jun Wang.   

Abstract

In recent years, the global stability of recurrent neural networks (RNNs) has been investigated extensively. It is well known that time delays and external disturbances can derail the stability of RNNs. In this paper, we analyze the robustness of the global stability of RNNs subject to time delays and random disturbances. Given a globally exponentially stable RNN, the problem addressed here is how much time delay and noise the network can withstand while remaining globally exponentially stable. The upper bounds on the time delay and noise intensity under which the RNNs sustain global exponential stability are characterized by transcendental equations. Moreover, we prove theoretically that, for any globally exponentially stable RNN, if the additive noises and time delays are smaller than the derived upper bounds, then the perturbed RNN is guaranteed to remain globally exponentially stable. Three numerical examples are provided to substantiate the theoretical results.
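The abstract states that the admissible delay and noise bounds are characterized as roots of transcendental equations. As a purely hypothetical illustration (the function `F` below is a stand-in placeholder, not the paper's actual equation), such a bound can be located numerically by bisection once the equation is known to have a single sign change on an interval:

```python
import math

def delay_bound(F, lo=0.0, hi=10.0, tol=1e-10):
    """Locate the root of a transcendental equation F(tau) = 0 on [lo, hi]
    by bisection. F is assumed to change sign exactly once on the interval."""
    if F(lo) * F(hi) > 0:
        raise ValueError("F must change sign on [lo, hi]")
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if F(lo) * F(mid) <= 0:
            hi = mid  # root lies in the left half
        else:
            lo = mid  # root lies in the right half
    return 0.5 * (lo + hi)

# Stand-in transcendental equation: tau * exp(tau) - 1 = 0
# (its root is the omega constant, approximately 0.5671)
tau_bar = delay_bound(lambda t: t * math.exp(t) - 1.0)
```

In the paper's setting, `F` would be replaced by the derived transcendental expression in the delay (or noise intensity), and `tau_bar` would then be the upper bound below which global exponential stability is preserved.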


Year:  2012        PMID: 24808458     DOI: 10.1109/TNNLS.2011.2178326

Source DB:  PubMed          Journal:  IEEE Trans Neural Netw Learn Syst        ISSN: 2162-237X            Impact factor:   10.451


  2 in total

1.  Complex Dynamics of Noise-Perturbed Excitatory-Inhibitory Neural Networks With Intra-Correlative and Inter-Independent Connections.

Authors:  Xiaoxiao Peng; Wei Lin
Journal:  Front Physiol       Date:  2022-06-24       Impact factor: 4.755

2.  A Non-spiking Neuron Model With Dynamic Leak to Avoid Instability in Recurrent Networks.

Authors:  Udaya B Rongala; Jonas M D Enander; Matthias Kohler; Gerald E Loeb; Henrik Jörntell
Journal:  Front Comput Neurosci       Date:  2021-05-20       Impact factor: 2.380

