M W Goudreau, C L Giles, S T Chakradhar, D Chen.
Abstract
We examine the representational capabilities of first-order and second-order single-layer recurrent neural networks (SLRNN's) with hard-limiting neurons. We show that a second-order SLRNN is strictly more powerful than a first-order SLRNN. However, if the first-order SLRNN is augmented with output layers of feedforward neurons, it can implement any finite-state recognizer, but only if state-splitting is employed. When a state is split, it is divided into two equivalent states. The judicious use of state-splitting allows for efficient implementation of finite-state recognizers using augmented first-order SLRNN's.
Year: 1994 PMID: 18267822 DOI: 10.1109/72.286928
Source DB: PubMed Journal: IEEE Trans Neural Netw ISSN: 1045-9227