Abstract
Simulations indicate that the deterministic Boltzmann machine, unlike the stochastic Boltzmann machine from which it is derived, exhibits unstable behavior during contrastive Hebbian learning of nonlinear problems, including oscillation in the learning algorithm and extreme sensitivity to small weight perturbations. Although careful choice of the initial weight magnitudes, the learning rate, and the annealing schedule will produce convergence in most cases, the stability of the resulting solution depends on these parameters in a complex and generally indiscernible way. We show that this unstable behavior results from over-parameterization (excessive freedom in the weights), which leads to continuous rather than isolated optimal weight solution sets. The weights can therefore drift without correction by the learning algorithm until the free-energy landscape changes such that the settling procedure finds a different minimum of the free-energy function than it did previously, producing a gross output error. Because all the weight sets in a continuous optimal solution set produce exactly the same network outputs, we define reliability, a measure of the robustness of the network, as a new performance criterion.
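The mechanism the abstract describes can be illustrated with a minimal sketch (a hypothetical toy model, not the paper's actual network): when the output depends only on a combination of weights, such as their sum, every point on the line w1 + w2 = c is an equally optimal solution. Learning corrects only the sum, so the difference w1 - w2 is a "null" direction along which the weights can drift freely without any change in output.

```python
import numpy as np

# Toy over-parameterized model (assumption for illustration): the output
# depends only on w1 + w2, so optimal solutions form a continuous line
# rather than an isolated point.
def output(w, x):
    return np.tanh((w[0] + w[1]) * x)

def train_step(w, x, target, lr=0.1):
    # Gradient of squared error: both components receive the same update,
    # so the difference w1 - w2 is never corrected by learning.
    y = output(w, x)
    g = (y - target) * (1.0 - y**2) * x
    return w - lr * np.array([g, g])

w = np.array([0.3, 0.2])
for _ in range(200):
    w = train_step(w, x=1.0, target=0.5)

# A large perturbation along the null direction (w1 - w2) leaves the
# output exactly unchanged -- the learning algorithm cannot detect it.
w_drifted = w + np.array([5.0, -5.0])
assert np.isclose(output(w, 1.0), output(w_drifted, 1.0))
```

Although `w` and `w_drifted` behave identically on this task, they are very different points in weight space; in the deterministic Boltzmann machine such uncorrected drift can eventually move the settling procedure into a different free-energy minimum, which is the failure mode the paper analyzes.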
Year: 2000 PMID: 11052418 DOI: 10.1142/S0129065700000284
Source DB: PubMed Journal: Int J Neural Syst ISSN: 0129-0657 Impact factor: 5.866