Dongpo Xu, Huisheng Zhang, Lijun Liu.
Abstract
This letter presents a unified convergence analysis of the split-complex nonlinear gradient descent (SCNGD) learning algorithms for complex-valued recurrent neural networks, covering three classes of SCNGD algorithms: standard SCNGD, normalized SCNGD, and adaptive normalized SCNGD. We prove that if the activation functions are of split-complex type and some conditions are satisfied, the error function is monotonically decreasing during the training iteration process, and the gradients of the error function with respect to the real and imaginary parts of the weights converge to zero. A strong convergence result is also obtained under the assumption that the error function has only a finite number of stationary points. Simulation results are given to support the theoretical analysis.
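The abstract's key notions can be illustrated with a minimal sketch. A split-complex activation applies a real activation (here tanh) separately to the real and imaginary parts of its input, and the standard SCNGD update treats the real and imaginary parts of each weight as independent real variables. The function names, the single-neuron setting, and the learning rate below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def split_tanh(z):
    # Split-complex activation: a real activation (tanh) applied
    # separately to the real and imaginary parts of z.
    return np.tanh(z.real) + 1j * np.tanh(z.imag)

def scngd_step(w, x, d, eta=0.1):
    # One standard SCNGD step on E = 0.5*|e|^2 with e = d - f(w^T x),
    # for a single complex-valued neuron (a hypothetical simplification
    # of the recurrent setting in the paper).
    u = np.dot(w, x)                      # complex net input
    e = d - split_tanh(u)                 # complex output error
    # Derivatives of the split activation w.r.t. Re(u) and Im(u)
    dfr = 1.0 - np.tanh(u.real) ** 2
    dfi = 1.0 - np.tanh(u.imag) ** 2
    # Gradient of E w.r.t. u, packed as a complex number
    # (real part: dE/dRe(u), imaginary part: dE/dIm(u))
    g = -(e.real * dfr + 1j * e.imag * dfi)
    # Chain rule over the real and imaginary weight parts gives
    # dE/dw_k = g * conj(x_k), so the descent update is:
    return w - eta * g * np.conj(x)
```

Under the paper's conditions (suitable learning rate, bounded split-complex activations), iterating such a step makes the error monotonically non-increasing and drives the real and imaginary gradient components to zero.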
Year: 2010 PMID: 20608871 DOI: 10.1162/NECO_a_00021
Source DB: PubMed Journal: Neural Comput ISSN: 0899-7667 Impact factor: 2.026