| Literature DB >> 33265289 |
Zhen Zhang1, Yibing Li1, Shanshan Jin1, Zhaoyue Zhang2, Hui Wang1, Lin Qi1, Ruolin Zhou3.
Abstract
In this paper, signal recognition theory and algorithms based on information entropy and ensemble learning are proposed. We extract 16 entropy features from 9 types of modulated signals. The entropy features span many definitions, including the Rényi entropy and the energy entropy based on the S Transform and the Generalized S Transform. Three feature selection algorithms, sequential forward selection (SFS), sequential forward floating selection (SFFS) and RELIEF-F, are used to select the optimal feature subset from the 16 entropy features. Five classifiers, k-nearest neighbor (KNN), support vector machine (SVM), Adaboost, Gradient Boosting Decision Tree (GBDT) and eXtreme Gradient Boosting (XGBoost), are applied to both the original feature set and the feature subsets selected by the different feature selection algorithms. The simulation results show that the feature subsets selected by the SFS and SFFS algorithms are the best, raising the recognition rate over the original feature set by 48% with the KNN classifier and by 34% with the SVM classifier. For the other three classifiers, the original feature set achieves the best recognition performance. The XGBoost classifier performs best overall, with an overall recognition rate of 97.74%, and its recognition rate reaches 82% at a signal-to-noise ratio (SNR) of -10 dB.
Keywords: ensemble learning; entropy feature; feature selection; radar
Year: 2018 PMID: 33265289 PMCID: PMC7512713 DOI: 10.3390/e20030198
Source DB: PubMed Journal: Entropy (Basel) ISSN: 1099-4300 Impact factor: 2.524
Figure 1. The commonly used recognition framework.
Figure 2. The variation curves of the common information entropies with SNR. (a) Power spectrum Shannon entropy; (b) Power spectrum exponential entropy; (c) Singular spectrum Shannon entropy; (d) Singular spectrum exponential entropy; (e) Wavelet energy spectrum entropy; (f) Bispectrum entropy; (g) Approximate entropy; (h) Sample entropy; (i) Fuzzy entropy.
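The first two features in Figure 2 are straightforward once the power spectrum is treated as a probability distribution. A minimal sketch, assuming a precomputed power spectrum and using the common Pal-and-Pal form of exponential entropy (function names are illustrative, not from the paper):

```python
import math

def shannon_entropy(power):
    """Shannon entropy of a power spectrum, treating the normalized
    spectrum as a probability distribution."""
    total = sum(power)
    probs = [p / total for p in power]
    return -sum(p * math.log(p) for p in probs if p > 0)

def exponential_entropy(power):
    """Exponential entropy (Pal-and-Pal style): sum of p * exp(1 - p)
    over the normalized spectrum."""
    total = sum(power)
    probs = [p / total for p in power]
    return sum(p * math.exp(1 - p) for p in probs)

# A flat spectrum maximizes Shannon entropy at log(number of bins);
# a single spectral line gives zero Shannon entropy.
flat = [1.0] * 8
print(round(shannon_entropy(flat), 4))  # 2.0794 (= ln 8)
```

Both measures drop as the spectrum concentrates its energy, which is why they separate signal classes with different spectral shapes.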
Figure 3. The variation curves of the information entropies based on time-frequency analysis with SNR. (a) Rényi entropy of STFT; (b) Rényi entropy of SPWVD; (c) Rényi entropy of Wavelet Transform; (d) Rényi entropy of S Transform; (e) Rényi entropy of Generalized S Transform; (f) Energy entropy of S Transform; (g) Energy entropy of Generalized S Transform.
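The five Rényi-entropy features in Figure 3 reduce to the same computation once the chosen transform (STFT, SPWVD, wavelet, S Transform, ...) has produced a time-frequency distribution: normalize the distribution to sum to one and apply the order-α Rényi formula. A sketch under that assumption, with the transform itself omitted:

```python
import math

def renyi_entropy(tfd, alpha=3.0):
    """Order-alpha Rényi entropy of a time-frequency distribution
    given as a matrix of nonnegative energies:
    H = log(sum p^alpha) / (1 - alpha), with p the normalized cells."""
    total = sum(sum(row) for row in tfd)
    s = sum((v / total) ** alpha for row in tfd for v in row if v > 0)
    return math.log(s) / (1.0 - alpha)

# Energy spread uniformly over 16 cells gives log(16) for any alpha;
# energy concentrated in one cell gives a lower value.
uniform = [[1.0] * 4 for _ in range(4)]
peaked = [[8.0, 1.0, 1.0, 1.0], [1.0] * 4, [1.0] * 4, [1.0] * 4]
print(round(renyi_entropy(uniform), 4))  # 2.7726 (= ln 16)
```

Odd integer orders such as α = 3 are a common choice for time-frequency distributions because they tolerate the small negative cross-terms some transforms produce.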
The simulation time of different entropy features (s).
| Entropy | Time |
|---|---|
| Power spectrum Shannon entropy | 0.199 |
| Power spectrum exponential entropy | 0.210 |
| Singular spectrum Shannon entropy | 0.205 |
| Singular spectrum exponential entropy | 0.204 |
| Wavelet energy spectrum entropy | 0.558 |
| Bispectrum entropy | 2.414 |
| Approximate entropy | 683.003 |
| Sample entropy | 396.102 |
| Fuzzy entropy | 428.461 |
| Rényi entropy of STFT | 162.988 |
| Rényi entropy of SPWVD | 156.508 |
| Rényi entropy of Wavelet Transform | 166.227 |
| Rényi entropy of S Transform | 10.224 |
| Rényi entropy of Generalized S Transform | 9.986 |
| Energy entropy of S Transform | 7.043 |
| Energy entropy of Generalized S Transform | 6.974 |
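The three slowest rows of the table (approximate, sample and fuzzy entropy) share a cause: each compares every length-m template of the signal against every other, an O(N²) pass, while the spectral entropies need only one transform. A minimal sample-entropy sketch showing the all-pairs loop (illustrative, not the paper's implementation):

```python
import math

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy: -log(A/B), where B counts template pairs of
    length m within tolerance r and A counts pairs of length m + 1.
    The all-pairs template comparison is what makes this O(N^2)."""
    def count_matches(length):
        templates = [x[i:i + length] for i in range(len(x) - length + 1)]
        hits = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    hits += 1
        return hits

    b = count_matches(m)
    a = count_matches(m + 1)
    return -math.log(a / b)

# A pure tone is highly regular, so its sample entropy is low.
tone = [math.sin(2 * math.pi * i / 20.0) for i in range(100)]
print(sample_entropy(tone))
```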
The size of the feature subsets obtained by different feature selection algorithms.
| Algorithm | No selection | SFS | SFFS | RELIEF-F |
|---|---|---|---|---|
| Number of features | 16 | 7 | 7 | 6 |
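SFS grows the subset greedily: starting from the empty set, it adds at each step the single feature whose inclusion most improves a wrapper score; SFFS additionally allows conditional removal steps. A sketch of the forward loop with a toy stand-in score (the paper's wrapper scores each candidate subset by retraining a classifier and measuring its recognition rate):

```python
def sfs(n_features, k, score):
    """Sequential forward selection: start from the empty set and
    greedily add the feature that maximizes the wrapper score."""
    selected = []
    while len(selected) < k:
        candidates = [f for f in range(n_features) if f not in selected]
        best = max(candidates, key=lambda f: score(selected + [f]))
        selected.append(best)
    return sorted(selected)

# Toy stand-in score: each feature carries a fixed utility, so the
# greedy loop simply picks the k highest-utility features.
utility = [0.1, 0.9, 0.3, 0.8, 0.2, 0.7]

def score(subset):
    return sum(utility[f] for f in subset)

print(sfs(6, 3, score))  # [1, 3, 5]
```

With a real classifier in the score function, the greedy loop is what drives the large wrapper runtimes reported below.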
The recognition rate of the feature subsets obtained by different feature selection algorithms.
| Classifier | No selection | SFS/SFFS | RELIEF-F |
|---|---|---|---|
| KNN | 47.76% | | 49.53% |
| SVM | 57.93% | | 56.39% |
| Adaboost | 97.19% | | 95.70% |
| GBDT | 97.16% | | 95.70% |
| XGBoost | 97.40% | | 95.91% |
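The simplest of the five classifiers, KNN, labels a test signal by a majority vote among its nearest training points in entropy-feature space. A minimal sketch (the feature vectors and class labels here are toy stand-ins, not the paper's nine signal types):

```python
from collections import Counter

def knn_predict(train, query, k=3):
    """k-nearest-neighbor vote: return the majority label among the
    k training points closest (squared Euclidean) to the query."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = sorted(train, key=lambda item: dist2(item[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Two well-separated clusters of 2-D feature vectors (toy data).
train = [((0.1, 0.2), "BPSK"), ((0.0, 0.3), "BPSK"), ((0.2, 0.1), "BPSK"),
         ((0.9, 0.8), "LFM"), ((1.0, 0.7), "LFM"), ((0.8, 0.9), "LFM")]
print(knn_predict(train, (0.15, 0.25)))  # BPSK
```

Because KNN takes raw distances at face value, it benefits most from discarding noisy features, consistent with the table: feature selection helps KNN and SVM, while the boosted tree ensembles already down-weight weak features internally.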
Figure 4. The recognition rates of the feature subsets obtained by different feature selection algorithms at different SNRs. (a) KNN classifier; (b) SVM classifier; (c) Adaboost classifier; (d) GBDT classifier; (e) XGBoost classifier.
The simulation time of different feature selection algorithms (s).
| Algorithm | Time |
|---|---|
| SFS | 465.909 |
| SFFS | 735.793 |
| RELIEF-F | 3.467 |
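The gap in the table reflects the filter-versus-wrapper split: SFS and SFFS retrain a classifier for every candidate subset, while RELIEF-F scores each feature once from nearest-hit and nearest-miss distances, so it finishes in seconds. A minimal two-class Relief sketch (the full RELIEF-F extends this to multiple classes and k neighbors):

```python
def relief_weights(X, y):
    """Basic two-class Relief: for each sample, reward features that
    differ from the nearest miss (other class) and penalize features
    that differ from the nearest hit (same class)."""
    def dist2(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    n_feat = len(X[0])
    w = [0.0] * n_feat
    for i, (xi, yi) in enumerate(zip(X, y)):
        others = [(dist2(xi, xj), xj, yj)
                  for j, (xj, yj) in enumerate(zip(X, y)) if j != i]
        hit = min(o for o in others if o[2] == yi)[1]
        miss = min(o for o in others if o[2] != yi)[1]
        for f in range(n_feat):
            w[f] += abs(xi[f] - miss[f]) - abs(xi[f] - hit[f])
    return w

# Feature 0 separates the two classes; feature 1 is noise.
X = [(0.0, 0.3), (0.1, 0.9), (1.0, 0.8), (0.9, 0.2)]
y = [0, 0, 1, 1]
w = relief_weights(X, y)
print(w[0] > w[1])  # True: the discriminative feature scores higher
```

Features are then ranked by weight and the top ones kept, which is a single O(N²) pass over the data regardless of how many subsets a wrapper would have had to evaluate.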
The simulation time of the different classifiers on the feature subsets obtained by different feature selection algorithms (s).
| Classifier | No selection | SFS/SFFS | RELIEF-F |
|---|---|---|---|
| KNN | 5.179 | 2.075 | 1.856 |
| SVM | 2352.019 | 124.972 | 2495.665 |
| Adaboost | 11.544 | 5.507 | 4.914 |
| GBDT | 36.644 | 18.049 | 17.659 |
| XGBoost | 13.276 | 7.784 | 6.973 |
Figure 5. The recognition rates of different features at different SNRs. (a) KNN classifier; (b) SVM classifier; (c) Adaboost classifier; (d) GBDT classifier; (e) XGBoost classifier.