| Literature DB >> 24808934 |
Huisheng Zhang, Xiaodong Liu, Dongpo Xu, Ying Zhang.
Abstract
This paper considers the fully complex backpropagation algorithm (FCBPA) for training fully complex-valued neural networks. We prove both the weak convergence and the strong convergence of FCBPA under mild conditions, and we show that the error function decreases monotonically during training. The derivation and analysis of the algorithm are carried out within the framework of Wirtinger calculus, which greatly reduces the complexity of the description. The theoretical results are substantiated by a simulation example.
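The Wirtinger-calculus formulation means the gradient step for a real-valued error with complex weights follows the conjugate Wirtinger derivative, w ← w − η ∂E/∂w̄. A minimal sketch of this idea for a single complex-valued linear neuron with squared error (an illustrative toy, not the paper's network or proof setting; all names and the learning rate are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
true_w = rng.standard_normal(n) + 1j * rng.standard_normal(n)
w = np.zeros(n, dtype=complex)   # complex weight vector to be trained
eta = 0.05                       # learning rate (assumed small for stability)

# Synthetic complex-valued training data: targets d = x . true_w
X = rng.standard_normal((200, n)) + 1j * rng.standard_normal((200, n))
d = X @ true_w

def mse(w):
    """Real-valued error E(w) = mean |d - X w|^2 over the data set."""
    return np.mean(np.abs(d - X @ w) ** 2)

losses = [mse(w)]
for epoch in range(50):
    for x, t in zip(X, d):
        e = t - np.dot(w, x)
        # Wirtinger gradient of |e|^2 w.r.t. conj(w) is -e * conj(x),
        # so the descent update is w <- w + eta * e * conj(x)
        w = w + eta * e * np.conj(x)
    losses.append(mse(w))

print(losses[0], losses[-1])
```

With a sufficiently small step size the epoch-level error decreases toward zero on this noiseless linear problem, mirroring (in a much simpler setting) the monotone error decrease the paper establishes for FCBPA.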
Keywords: Complex-valued neural networks; Convergence; Fully complex backpropagation algorithm; Wirtinger calculus
Year: 2014 PMID: 24808934 PMCID: PMC4012068 DOI: 10.1007/s11571-013-9276-7
Source DB: PubMed Journal: Cogn Neurodyn ISSN: 1871-4080 Impact factor: 5.082