Zhilei Chai, Wei Song, Qinxin Bao, Feng Ding, Fei Liu.
Abstract
The growing and pruning radial basis function (GAP-RBF) network is a promising sequential learning algorithm for prediction analysis, but parameter selection for such a network is usually a non-convex problem that is difficult to handle. In this paper, a hybrid bioinspired intelligent algorithm is proposed to optimize GAP-RBF. Specifically, the strong local convergence of particle swarm optimization (PSO) and the broad search ability of the genetic algorithm (GA) are combined to optimize the weights and bias term of GAP-RBF. Meanwhile, a competitive mechanism is proposed that lets the hybrid algorithm choose appropriate individuals for effective search, further improving its optimization ability. Moreover, a decoupled extended Kalman filter (DEKF) method is introduced to reduce the size of the error covariance matrix and decrease the computational cost of performing real-time predictions. In the experiments, three classic forecasting problems, namely abalone age, Boston house price and auto MPG, are adopted for extensive testing. The experimental results show that our method outperforms the two single bioinspired optimization algorithms, PSO and GA. What is more, our method with DEKF achieves better results than state-of-the-art sequential learning algorithms such as GAP-RBF, the minimal resource allocation network (MRAN), the resource allocation network using an extended Kalman filter (RANEKF) and the resource allocation network (RAN).
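The competitive hybrid optimizer summarized in the abstract can be sketched roughly as follows: each generation, the population is sorted by fitness, the better half is refined with a PSO velocity update (local convergence), and the worse half is regenerated by GA crossover and mutation (broad exploration). All names, population size, and PSO/GA coefficients below are illustrative assumptions, not values from the paper, and a toy sphere objective stands in for the GAP-RBF training RMS error.

```python
import random

random.seed(0)
DIM = 3   # dimensionality of the candidate weight/bias vector (illustrative)
POP = 20  # population size (illustrative)

def fitness(x):
    # Toy sphere objective; a stand-in for the GAP-RBF training RMS error.
    return sum(v * v for v in x)

# Initialize population, PSO velocities and personal bests.
pop = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(POP)]
vel = [[0.0] * DIM for _ in range(POP)]
pbest = [list(p) for p in pop]

def step():
    global pop
    # Competitive mechanism: rank individuals, then assign each one the
    # update rule suited to its rank.
    order = sorted(range(POP), key=lambda i: fitness(pop[i]))
    gbest = pop[order[0]]
    new_pop = [None] * POP
    for rank, i in enumerate(order):
        if rank < POP // 2:
            # Winners: standard PSO update (exploits good regions).
            for d in range(DIM):
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * random.random() * (pbest[i][d] - pop[i][d])
                             + 1.5 * random.random() * (gbest[d] - pop[i][d]))
            new_pop[i] = [pop[i][d] + vel[i][d] for d in range(DIM)]
        else:
            # Losers: GA crossover with a randomly chosen winner,
            # plus occasional Gaussian mutation (explores new regions).
            mate = pop[order[random.randrange(POP // 2)]]
            child = [pop[i][d] if random.random() < 0.5 else mate[d]
                     for d in range(DIM)]
            if random.random() < 0.2:
                child[random.randrange(DIM)] += random.gauss(0, 1)
            new_pop[i] = child
    pop = new_pop
    # Personal bests only ever improve.
    for i in range(POP):
        if fitness(pop[i]) < fitness(pbest[i]):
            pbest[i] = list(pop[i])

for _ in range(50):
    step()

best = min(pop + pbest, key=fitness)
print(fitness(best))
```

Because personal bests are monotone, the returned best never degrades; the split update is one simple way to realize "choose the appropriate individuals for effective search", not necessarily the paper's exact scheme.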
Keywords: GAP-RBF; bioinspired intelligent algorithm; competitive mechanism; prediction analysis
Year: 2018 PMID: 30839667 PMCID: PMC6170552 DOI: 10.1098/rsos.180529
Source DB: PubMed Journal: R Soc Open Sci ISSN: 2054-5703 Impact factor: 2.963
Figure 1. Population sorting for HBIACM.
The details of abalone, Boston house price and auto MPG datasets.
| datasets | number of training samples | number of testing samples |
|---|---|---|
| abalone | 3000 | 1177 |
| Boston house price | 481 | 25 |
| auto MPG | 320 | 78 |
Figure 2. The comparison of training RMS error for three optimization algorithms (abalone dataset).
Figure 3. The comparison of training RMS error for three optimization algorithms (Boston house price dataset).
Figure 4. The comparison of training RMS error for three optimization algorithms (auto MPG dataset).
The parameters for each algorithm.
| datasets | algorithms | parameters |
|---|---|---|
| abalone | RAN | |
| abalone | RANEKF | |
| abalone | MRAN | |
| abalone | GAP-RBF | |
| Boston house price | RAN | |
| Boston house price | RANEKF | |
| Boston house price | MRAN | |
| Boston house price | GAP-RBF | |
| auto MPG | RAN | |
| auto MPG | RANEKF | |
| auto MPG | MRAN | |
| auto MPG | GAP-RBF | |
The performance comparison for abalone dataset.
| algorithms | CPU time mean (s) | CPU time s.d. | training RMS mean | training RMS s.d. | testing RMS mean | testing RMS s.d. | hidden neurons mean | hidden neurons s.d. |
|---|---|---|---|---|---|---|---|---|
| GAP-RBF+DEKF+HBIACM | | | | | | | | |
| GAP-RBF+HBIACM | 99.769 | 76.5411 | 0.0824 | 0.0021 | 0.0828 | 0.0032 | 19.72 | 3.3505 |
| GAP-RBF | 83.784 | 73.401 | 0.0963 | 0.0061 | 0.0966 | 0.0068 | 23.62 | 9.5081 |
| MRAN | 1500.4 | 134.08 | 0.0836 | 0.0039 | 0.0837 | 0.0042 | 87.57 | 7.1147 |
| RANEKF | 90 806 | 18 193 | 0.0738 | 0.0042 | 0.0794 | 0.0053 | 409.00 | 22.485 |
| RAN | 105.17 | 6.1714 | 0.0931 | 0.0091 | 0.0978 | 0.0092 | 345.58 | 12.578 |
The performance comparison for auto MPG dataset.
| algorithms | CPU time mean (s) | CPU time s.d. | training RMS mean | training RMS s.d. | testing RMS mean | testing RMS s.d. | hidden neurons mean | hidden neurons s.d. |
|---|---|---|---|---|---|---|---|---|
| GAP-RBF+DEKF+HBIACM | | | | | | | | |
| GAP-RBF+HBIACM | 1.4308 | 0.1409 | 0.1097 | 0.0103 | 0.1177 | 0.0158 | 3.94 | 0.7398 |
| GAP-RBF | 0.4520 | 0.0786 | 0.1144 | 0.0132 | 0.1404 | 0.0270 | 3.12 | 0.7462 |
| MRAN | 1.4644 | 0.2453 | 0.1086 | 0.0100 | 0.1376 | 0.0226 | 4.46 | 0.7343 |
| RANEKF | 1.0103 | 0.1694 | 0.1088 | 0.0117 | 0.1387 | 0.0289 | 5.14 | 0.9037 |
| RAN | 0.8042 | 0.1417 | 0.2923 | 0.0808 | 0.3080 | 0.0915 | 4.44 | 0.8369 |
The performance comparison for Boston house price dataset.
| algorithms | CPU time mean (s) | CPU time s.d. | training RMS mean | training RMS s.d. | testing RMS mean | testing RMS s.d. | hidden neurons mean | hidden neurons s.d. |
|---|---|---|---|---|---|---|---|---|
| GAP-RBF+DEKF+HBIACM | | | | | | | | |
| GAP-RBF+HBIACM | 2.5291 | 0.4252 | 0.1379 | 0.0126 | 0.1285 | 0.0350 | 4.54 | 0.6131 |
| GAP-RBF | 1.2399 | 0.2812 | 0.1507 | 0.0128 | 0.1418 | 0.0466 | 3.50 | 0.6468 |
| MRAN | 12.731 | 2.2585 | 0.1440 | 0.0108 | 0.1356 | 0.0411 | 13.58 | 1.8962 |
| RANEKF | 22.572 | 6.4159 | 0.1328 | 0.0086 | 0.1437 | 0.0464 | 19.98 | 1.8349 |
| RAN | 4.2664 | 0.4846 | 0.3449 | 0.0620 | 0.3432 | 0.0770 | 18.80 | 1.6413 |