First-order versus second-order single-layer recurrent neural networks.

M W Goudreau, C L Giles, S T Chakradhar, D Chen.

Abstract

We examine the representational capabilities of first-order and second-order single-layer recurrent neural networks (SLRNN's) with hard-limiting neurons. We show that a second-order SLRNN is strictly more powerful than a first-order SLRNN. However, if the first-order SLRNN is augmented with output layers of feedforward neurons, it can implement any finite-state recognizer, but only if state-splitting is employed. When a state is split, it is divided into two equivalent states. The judicious use of state-splitting allows for efficient implementation of finite-state recognizers using augmented first-order SLRNN's.
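To make the architectural comparison concrete, here is a minimal Python sketch (an illustration, not code from the paper) of a second-order SLRNN with hard-limiting neurons. It encodes the transition table of a small finite-state recognizer, even parity of 1s, directly in the second-order weights W[i][j][k]; the two-state parity machine and all names here are assumptions made for this example.

    import numpy as np

    # Second-order SLRNN update with hard-limiting neurons:
    #   s_i(t+1) = H( sum_{j,k} W[i, j, k] * s_j(t) * x_k(t) )
    # State s and input x are one-hot vectors; H is a hard limiter
    # that fires iff the net input exceeds 0.5.

    N_STATES, N_SYMBOLS = 2, 2            # q0 = even parity, q1 = odd parity

    # Each transition delta(q_j, symbol k) = q_i becomes W[i, j, k] = 1.
    W = np.zeros((N_STATES, N_STATES, N_SYMBOLS))
    W[0, 0, 0] = 1.0   # delta(q0, '0') = q0
    W[1, 0, 1] = 1.0   # delta(q0, '1') = q1
    W[1, 1, 0] = 1.0   # delta(q1, '0') = q1
    W[0, 1, 1] = 1.0   # delta(q1, '1') = q0

    def hard_limit(net):
        # Hard-limiting (step) activation at threshold 0.5.
        return (net > 0.5).astype(float)

    def recognize(bits):
        # Run the network over a binary string; accept iff it ends in q0.
        s = np.array([1.0, 0.0])                  # start state q0, one-hot
        for ch in bits:
            x = np.eye(N_SYMBOLS)[int(ch)]        # one-hot input symbol
            s = hard_limit(np.einsum('ijk,j,k->i', W, s, x))
        return bool(s[0])

    assert recognize("0110")       # two 1s: even parity, accepted
    assert not recognize("010")    # one 1: odd parity, rejected

With one-hot encodings, each transition occupies exactly one second-order weight, so a second-order SLRNN realizes a finite-state recognizer directly; a first-order SLRNN lacks product terms between state and input, which is why the paper augments it with feedforward output layers and resorts to state-splitting.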

Year:  1994        PMID: 18267822     DOI: 10.1109/72.286928

Source DB:  PubMed          Journal:  IEEE Trans Neural Netw        ISSN: 1045-9227


Related articles: 2 in total

1.  An Entropy Metric for Regular Grammar Classification and Learning with Recurrent Neural Networks.

Authors:  Kaixuan Zhang; Qinglong Wang; C Lee Giles
Journal:  Entropy (Basel)       Date:  2021-01-19       Impact factor: 2.524

2.  Initialization of latent space coordinates via random linear projections for learning robotic sensory-motor sequences.

Authors:  Vsevolod Nikulin; Jun Tani
Journal:  Front Neurorobot       Date:  2022-09-14       Impact factor: 3.493

