
Optimal Randomness for Stochastic Configuration Network (SCN) with Heavy-Tailed Distributions.

Haoyu Niu, Jiamin Wei, YangQuan Chen.

Abstract

The Stochastic Configuration Network (SCN) has a powerful capability for regression and classification analysis. Traditionally, it is quite challenging to correctly determine an appropriate architecture for a neural network so that the trained model achieves excellent performance for both learning and generalization. Compared with known randomized learning algorithms for single-hidden-layer feed-forward neural networks, such as Randomized Radial Basis Function (RBF) Networks and the Random Vector Functional-Link (RVFL) network, the SCN randomly assigns the input weights and biases of the hidden nodes under a supervisory mechanism. Since the parameters in the hidden layers are randomly generated from a uniform distribution, hypothetically, there exists an optimal randomness. Heavy-tailed distributions have shown optimal randomness for finding targets in an unknown environment. Therefore, in this research, the authors used heavy-tailed distributions to randomly initialize the weights and biases, to see whether the new SCN models can achieve better performance than the original SCN. Heavy-tailed distributions such as the Lévy distribution, the Cauchy distribution, and the Weibull distribution were used. Since some mixed distributions show heavy-tailed properties, a mixture of Gaussian and Laplace distributions was also studied in this work. Experimental results showed improved performance for SCN with heavy-tailed distributions. For the regression model, SCN-Lévy, SCN-Mixture, SCN-Cauchy, and SCN-Weibull used fewer hidden nodes to achieve performance similar to the original SCN. For the classification model, SCN-Mixture, SCN-Lévy, and SCN-Cauchy achieved higher test accuracies of 91.5%, 91.7%, and 92.4%, respectively, all higher than the test accuracy of the original SCN.
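
As a rough illustration of the idea described in the abstract, the sketch below (Python, using NumPy and SciPy) shows how candidate hidden-node weights and biases might be drawn from the uniform, Lévy, Cauchy, Weibull, and Gaussian-Laplace mixture distributions before being screened by SCN's supervisory mechanism. The function name `sample_candidate`, the scale and shape parameters, the 50/50 mixture weight, and the random-sign handling for one-sided distributions are illustrative assumptions, not details taken from the paper.

```python
# A minimal sketch, not the authors' released code: drawing candidate hidden-node
# weights/biases from the heavy-tailed distributions named in the abstract, as an
# alternative to the uniform initialization of the original SCN. All parameter
# values (scales, Weibull shape, mixture weight, sign handling) are assumptions.
import numpy as np
from scipy.stats import levy

rng = np.random.default_rng(0)

def sample_candidate(dist, shape, scale=1.0):
    """Return an array of candidate weights/biases drawn from one distribution."""
    if dist == "uniform":            # original SCN: uniform on a symmetric interval
        return rng.uniform(-scale, scale, size=shape)
    if dist == "cauchy":             # SCN-Cauchy
        return scale * rng.standard_cauchy(size=shape)
    if dist == "levy":               # SCN-Levy (one-sided; random sign assumed here)
        signs = rng.choice([-1.0, 1.0], size=shape)
        return scale * signs * levy.rvs(size=shape, random_state=rng)
    if dist == "weibull":            # SCN-Weibull (shape k = 1.5 assumed; random sign)
        signs = rng.choice([-1.0, 1.0], size=shape)
        return scale * signs * rng.weibull(1.5, size=shape)
    if dist == "mixture":            # SCN-Mixture: 50/50 Gaussian-Laplace mixture assumed
        gauss = rng.normal(0.0, scale, size=shape)
        lap = rng.laplace(0.0, scale, size=shape)
        return np.where(rng.random(shape) < 0.5, gauss, lap)
    raise ValueError(f"unknown distribution: {dist}")

# Example: one candidate (w, b) pair for a node with 10 inputs; in SCN, such
# candidates are screened by the supervisory inequality before being added.
w = sample_candidate("levy", (10,))
b = sample_candidate("levy", (1,))
```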

Keywords:  Cauchy; Lévy; SCN; Weibull; heavy-tailed distribution; optimal randomness

Year:  2020        PMID: 33396383      PMCID: PMC7823536          DOI: 10.3390/e23010056

Source DB:  PubMed          Journal:  Entropy (Basel)        ISSN: 1099-4300            Impact factor:   2.524


  3 in total

1.  2-D Stochastic Configuration Networks for Image Data Analytics.

Authors:  Ming Li; Dianhui Wang
Journal:  IEEE Trans Cybern       Date:  2020-12-22       Impact factor: 11.448

2.  Stochastic Configuration Networks: Fundamentals and Algorithms.

Authors:  Dianhui Wang; Ming Li
Journal:  IEEE Trans Cybern       Date:  2017-08-21       Impact factor: 11.448

3.  Ensemble Stochastic Configuration Networks for Estimating Prediction Intervals: A Simultaneous Robust Training Algorithm and Its Application.

Authors:  Jun Lu; Jinliang Ding; Xuewu Dai; Tianyou Chai
Journal:  IEEE Trans Neural Netw Learn Syst       Date:  2020-02-13       Impact factor: 10.451

  1 in total

1.  Whether the Support Region of Three-Bit Uniform Quantizer Has a Strong Impact on Post-Training Quantization for MNIST Dataset?

Authors:  Jelena Nikolić; Zoran Perić; Danijela Aleksić; Stefan Tomić; Aleksandra Jovanović
Journal:  Entropy (Basel)       Date:  2021-12-20       Impact factor: 2.524

