
Convergence analysis of three classes of split-complex gradient algorithms for complex-valued recurrent neural networks.

Dongpo Xu, Huisheng Zhang, Lijun Liu.

Abstract

This letter presents a unified convergence analysis of the split-complex nonlinear gradient descent (SCNGD) learning algorithms for complex-valued recurrent neural networks, covering three classes of SCNGD algorithms: standard SCNGD, normalized SCNGD, and adaptive normalized SCNGD. We prove that, if the activation functions are of the split-complex type and certain conditions are satisfied, the error function decreases monotonically during the training iteration process, and the gradients of the error function with respect to the real and imaginary parts of the weights converge to zero. A strong convergence result is also obtained under the assumption that the error function has only a finite number of stationary points. Simulation results are given to support the theoretical analysis.
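The three update rules discussed in the abstract can be sketched in a toy single-neuron setting. The snippet below is an illustrative approximation, not the paper's exact algorithms: the split-complex activation f(z) = tanh(Re z) + i·tanh(Im z), the data, and the particular normalization/adaptation schemes are all assumptions made for demonstration. It treats the real and imaginary weight parts as independent real parameters, which is the defining idea of the split-complex approach.

```python
import numpy as np

# Toy data: a single complex-valued neuron y = f(X @ w) with a
# split-complex activation (real tanh applied to Re and Im separately).
# All names and values here are illustrative, not from the paper.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 3)) + 1j * rng.standard_normal((20, 3))
w_true = np.array([0.5 - 0.2j, -0.3 + 0.4j, 0.1 + 0.1j])

def act(z):
    # split-complex activation: f(z) = g(Re z) + i*g(Im z), g = tanh
    return np.tanh(z.real) + 1j * np.tanh(z.imag)

y = act(X @ w_true)

def error_and_grads(w):
    """Mean-square error and its gradients w.r.t. Re(w) and Im(w)."""
    z = X @ w
    e = act(z) - y
    E = 0.5 * np.mean(np.abs(e) ** 2)
    # Real and imaginary channels decouple through the split activation:
    dr = e.real * (1 - np.tanh(z.real) ** 2)   # dE/d(Re z) per sample
    di = e.imag * (1 - np.tanh(z.imag) ** 2)   # dE/d(Im z) per sample
    # Re z = Xr@wr - Xi@wi,  Im z = Xr@wi + Xi@wr  =>
    gR = (X.real.T @ dr + X.imag.T @ di) / len(X)    # dE/d(Re w)
    gI = (-X.imag.T @ dr + X.real.T @ di) / len(X)   # dE/d(Im w)
    return E, gR, gI

def train(mode, eta=0.05, iters=300):
    """Run one of three (illustrative) SCNGD-style update rules."""
    w = np.zeros(3, dtype=complex)
    errs = []
    for _ in range(iters):
        E, gR, gI = error_and_grads(w)
        errs.append(E)
        g2 = np.sum(gR ** 2 + gI ** 2)
        if mode == "standard":       # plain gradient step
            step = eta
        elif mode == "normalized":   # step scaled by the gradient norm
            step = eta / (np.sqrt(g2) + 1e-12)
        else:                        # "adaptive": a crude adaptive scaling
            step = eta / (1.0 + g2)
        w = w - step * (gR + 1j * gI)
    return errs

for mode in ("standard", "normalized", "adaptive"):
    errs = train(mode)
    print(f"{mode}: initial E = {errs[0]:.4f}, final E = {errs[-1]:.4f}")
```

In each run the error decreases from its initial value, consistent with the monotonicity result the letter proves (under its stated conditions) for the actual SCNGD family.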

Year:  2010        PMID: 20608871     DOI: 10.1162/NECO_a_00021

Source DB:  PubMed          Journal:  Neural Comput        ISSN: 0899-7667            Impact factor:   2.026


  1 in total

1.  Convergence analysis of fully complex backpropagation algorithm based on Wirtinger calculus.

Authors:  Huisheng Zhang; Xiaodong Liu; Dongpo Xu; Ying Zhang
Journal:  Cogn Neurodyn       Date:  2014-01-03       Impact factor: 5.082

