
A Hybrid Recursive Implementation of Broad Learning With Incremental Features.

Di Liu, Simone Baldi, Wenwu Yu, C L Philip Chen.   

Abstract

The broad learning system (BLS) paradigm has recently emerged as a computationally efficient approach to supervised learning. Its efficiency arises from a learning mechanism based on the method of least squares. However, the need to store and invert large matrices can put the efficiency of such a mechanism at risk in big-data scenarios. In this work, we propose a new implementation of BLS in which the need for storing and inverting large matrices is avoided. The distinguishing features of the designed learning mechanism are as follows: 1) the training process can balance between efficient memory usage and the required number of iterations (hybrid recursive learning) and 2) retraining is avoided when the network is expanded (incremental learning). It is shown that, while the proposed framework is equivalent to the standard BLS in terms of trained network weights, much larger networks than the standard BLS allows can be smoothly trained by the proposed solution, projecting BLS toward the big-data frontier.

Year:  2022        PMID: 33351769     DOI: 10.1109/TNNLS.2020.3043110

Source DB:  PubMed          Journal:  IEEE Trans Neural Netw Learn Syst        ISSN: 2162-237X            Impact factor:   10.451


  1 in total

1.  An Animation Model Generation Method Based on Gaussian Mutation Genetic Algorithm to Optimize Neural Network.

Authors:  Jing Liu; Qixing Chen; Yihua Zhang; Xiaoying Tian
Journal:  Comput Intell Neurosci       Date:  2022-06-03
