| Literature DB >> 15940984 |
Wei Wu, Guorui Feng, Zhengxue Li, Yuesheng Xu.
Abstract
Online gradient methods are widely used for training feedforward neural networks. In this paper we prove a convergence theorem for an online gradient method with variable step size applied to backpropagation (BP) neural networks with one hidden layer. Unlike most existing convergence results, which are probabilistic and nonmonotone in nature, the convergence result established here is deterministic and monotone.
Year: 2005 PMID: 15940984 DOI: 10.1109/TNN.2005.844903
Source DB: PubMed Journal: IEEE Trans Neural Netw ISSN: 1045-9227
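The abstract concerns an online (per-sample) gradient method with a variable step size for a one-hidden-layer network. Below is a minimal, hedged sketch of that setup in pure Python: a tanh hidden layer trained sample-by-sample with a diminishing step size eta_t = eta0 / (1 + k*t). The hidden-layer width, step-size schedule, decay constant, and training data are illustrative assumptions, not the schedule or conditions analyzed in the paper.

```python
import math
import random

random.seed(0)

def train_online(samples, hidden=4, eta0=0.5, epochs=50):
    """Online gradient descent for a 1-hidden-layer tanh network.

    Illustrative sketch only: the step-size schedule eta0/(1+0.01*t)
    is an assumption, not the paper's exact variable-step-size rule.
    """
    # Weights: input->hidden (w, b) and hidden->output (v, c).
    w = [random.uniform(-1, 1) for _ in range(hidden)]
    b = [0.0] * hidden
    v = [random.uniform(-1, 1) for _ in range(hidden)]
    c = 0.0
    t = 0
    for _ in range(epochs):
        for x, y in samples:
            eta = eta0 / (1.0 + 0.01 * t)  # variable (diminishing) step size
            t += 1
            # Forward pass.
            h = [math.tanh(w[i] * x + b[i]) for i in range(hidden)]
            out = sum(v[i] * h[i] for i in range(hidden)) + c
            err = out - y  # gradient of 0.5*(out - y)^2 w.r.t. out
            # Backward pass (update per sample, i.e. "online").
            for i in range(hidden):
                dh = err * v[i] * (1.0 - h[i] ** 2)  # backprop through tanh
                v[i] -= eta * err * h[i]
                w[i] -= eta * dh * x
                b[i] -= eta * dh
            c -= eta * err
    return w, b, v, c

def mse(params, samples):
    """Mean of the 0.5*squared-error loss over the sample set."""
    w, b, v, c = params
    total = 0.0
    for x, y in samples:
        h = [math.tanh(w[i] * x + b[i]) for i in range(len(w))]
        out = sum(v[i] * h[i] for i in range(len(w))) + c
        total += 0.5 * (out - y) ** 2
    return total / len(samples)

# Toy regression target (assumed data, for illustration only).
samples = [(x / 10.0, math.sin(x / 10.0)) for x in range(-10, 11)]
params = train_online(samples)
```

The per-sample weight update (rather than a full-batch gradient) is what makes the method "online"; the theorem described in the abstract concerns conditions under which such updates with a variable step size converge deterministically and monotonically.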