
Incremental learning of complex temporal patterns.

D L Wang, B Yuwono.

Abstract

A neural model for temporal pattern generation is trained and analyzed with multiple complex sequences presented sequentially. The network exhibits some degree of interference when new sequences are acquired. It is proven that the model is capable of incrementally learning a finite number of complex sequences. The model is then evaluated with a large set of highly correlated sequences. While the number of intact sequences increases linearly with the number of previously acquired sequences, the amount of retraining due to interference appears to be independent of the size of existing memory. The model is extended with a chunking network that detects repeated subsequences both within and between sequences. The chunking mechanism substantially reduces the amount of retraining during sequential training. The network investigated here thus constitutes an effective sequential memory. Various aspects of such a memory are discussed.
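The abstract describes a chunking mechanism that detects repeated subsequences within and between sequences. The paper implements this with a neural chunking network; as a purely illustrative, non-neural sketch of the underlying idea, one can enumerate contiguous subsequences (n-grams) and keep those that recur. The function name and approach below are assumptions for illustration, not the authors' method.

```python
from collections import Counter

def repeated_chunks(sequences, min_len=2):
    """Toy illustration of chunk detection: count contiguous subsequences
    (n-grams) of length >= min_len and return those occurring more than
    once, within a single sequence or across different sequences."""
    counts = Counter()
    for seq in sequences:
        for n in range(min_len, len(seq) + 1):
            for i in range(len(seq) - n + 1):
                counts[tuple(seq[i:i + n])] += 1
    # A chunk is any subsequence seen at least twice overall.
    return {chunk: c for chunk, c in counts.items() if c > 1}

# Example: "AB" recurs twice within "ABCAB" and once more in "XABY".
chunks = repeated_chunks(["ABCAB", "XABY"])
```

In the model itself, detecting such shared chunks lets already-learned subsequences be reused rather than relearned, which is why chunking reduces the retraining cost of sequential acquisition.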

Year:  1996        PMID: 18263540     DOI: 10.1109/72.548174

Source DB:  PubMed          Journal:  IEEE Trans Neural Netw        ISSN: 1045-9227


  1 in total

1.  A Self-Organizing Incremental Spatiotemporal Associative Memory Networks Model for Problems with Hidden State.

Authors:  Zuo-Wei Wang
Journal:  Comput Intell Neurosci       Date:  2016-11-03
