Gianmarco Baldini, Jean-Marc Chareau, Fausto Bonavitacola.
Abstract
Spectrum sensing is an important function in radio frequency spectrum management and cognitive radio networks. It is used by one wireless system (e.g., a secondary user) to detect the presence of a wireless service with higher priority (e.g., a primary user) with which it has to coexist in the radio frequency spectrum. If such a signal is detected, the secondary user system releases the given frequency to maintain the principle of non-interference. This paper proposes a machine learning implementation of spectrum sensing using an entropy measure as the feature vector. In the training phase, information about the activity of the higher-priority wireless service is gathered and a model is built. In the classification phase, the wireless system compares the current sensing report to the model to calculate the posterior probability and classifies the report as indicating either the presence or the absence of the higher-priority service. The paper proposes the novel application of the Fluctuation Dispersion Entropy (FDE) measure, recently introduced in the research community, as the feature used to build the model and perform the classification. An improved implementation of FDE (IFDE) is used to enhance robustness to noise, and IFDE is further extended with an adaptive method (AIFDE) that automatically selects the hyper-parameter introduced in IFDE. The paper thus combines the machine learning approach with the entropy measure approach, both recent developments in spectrum sensing research. The approach is compared to similar entropy-based approaches from the literature and to the classical energy detection method on a generated radar signal data set covering different SNR levels and fading conditions.
The results show that the proposed approach consistently outperforms the approaches from the literature based on other entropy measures, as well as the Energy Detector (ED), across the different SNR levels and fading conditions.
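The classical energy detection baseline used for comparison can be sketched in a few lines: the detector averages the sample energy of a sensing window and declares the channel occupied when that average exceeds a threshold. The snippet below is a minimal illustration on synthetic data with an arbitrary threshold, not the paper's implementation.

```python
import numpy as np

def energy_detect(x, threshold):
    """Classical energy detector: declare the channel occupied when the
    average sample energy exceeds a threshold (in practice the threshold
    is set from the noise floor to meet a target false-alarm rate)."""
    return float(np.mean(np.abs(x) ** 2)) > threshold

# Synthetic example: unit-power noise vs. noise plus a sinusoidal burst.
rng = np.random.default_rng(0)
noise = rng.normal(0.0, 1.0, 1000)                           # idle channel
occupied = noise + np.sin(np.linspace(0, 60 * np.pi, 1000))  # busy channel
```

Because the test statistic depends directly on the received power, energy detection degrades quickly at low SNR and under fading, which motivates the entropy-based features studied in this paper.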
Keywords: entropy; machine learning; signal processing; spectrum sensing
Year: 2021 PMID: 34945917 PMCID: PMC8699852 DOI: 10.3390/e23121611
Source DB: PubMed Journal: Entropy (Basel) ISSN: 1099-4300 Impact factor: 2.524
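The FDE feature at the core of the approach can be sketched from its published definition: map the samples into a small number of classes through the normal CDF, build embedding vectors, keep only the first differences of each vector (the "fluctuation" patterns), and take the Shannon entropy of the pattern frequencies. The sketch below follows that standard definition and is illustrative only; it does not reproduce the paper's IFDE/AIFDE modifications, and the parameter names m, c, d (embedding dimension, class count, delay) are the conventional ones.

```python
import numpy as np
from math import erf
from collections import Counter

def fluctuation_dispersion_entropy(x, m=2, c=6, d=1):
    """Sketch of fluctuation dispersion entropy (FDE).

    Steps: normal-CDF class mapping, m-dimensional embedding with
    delay d, first differences of each embedding vector, and Shannon
    entropy of the relative pattern frequencies.
    """
    x = np.asarray(x, dtype=float)
    mu, sigma = x.mean(), x.std()
    # 1) normal-CDF mapping of every sample into (0, 1)
    y = 0.5 * (1.0 + np.vectorize(erf)((x - mu) / (sigma * np.sqrt(2.0))))
    # 2) quantisation into the integer classes 1..c
    z = np.clip(np.round(c * y + 0.5).astype(int), 1, c)
    # 3) embedding vectors and their first differences (fluctuations)
    n = len(z) - (m - 1) * d
    patterns = [tuple(np.diff(z[i:i + m * d:d])) for i in range(n)]
    # 4) Shannon entropy of the pattern frequencies
    counts = np.array(list(Counter(patterns).values()), dtype=float)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log(p)))
```

A regular waveform concentrates its fluctuation patterns in a few values and yields a low FDE, while wideband noise spreads them out and yields a high FDE; this contrast is what makes the measure usable as a presence/absence feature.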
Figure 1: Overall methodology for the spectrum sensing implementation.
Figure 2: Test bed for the generation of the signals and data collection.
Figure 3: Image of the weather radar bursts in fading conditions based on the TDL-A model of [31].
Identification and description of the main hyper-parameters used in this study, including their optimal values and the range over which the optimization was performed.
| Algorithm | Hyper-Parameters |
|---|---|
| SVM | RBF scaling factor |
| KNN | |
| Decision Tree | |
| DE | embedding dimension |
| IDE and AIDE | embedding dimension |
| FDE | embedding dimension |
| IFDE and AIFDE | embedding dimension |
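Once the entropy feature and its hyper-parameters are fixed, the classification phase described in the abstract compares a sensing report to the trained model and computes a posterior probability of presence or absence. The sketch below uses a simple Gaussian class-conditional model of a scalar entropy feature as a stand-in for the classifiers in the table (SVM, KNN, Decision Tree); the feature values are synthetic placeholders, not the paper's data.

```python
import numpy as np

def fit_gaussian(values):
    """Training phase: summarise one class's entropy feature by mean/std."""
    v = np.asarray(values, dtype=float)
    return v.mean(), v.std()

def posterior_present(feature, model_absent, model_present, prior_present=0.5):
    """Classification phase: Bayes posterior that the higher-priority
    service is present, given a single entropy feature value."""
    def likelihood(x, mu, sigma):
        return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    lp = likelihood(feature, *model_present) * prior_present
    la = likelihood(feature, *model_absent) * (1.0 - prior_present)
    return lp / (lp + la)

# Synthetic placeholder features: entropy tends to drop when a structured
# radar burst is present on top of the noise (illustrative values only).
rng = np.random.default_rng(1)
model_absent = fit_gaussian(rng.normal(2.6, 0.2, 300))   # noise-only windows
model_present = fit_gaussian(rng.normal(1.8, 0.2, 300))  # burst windows
```

The decision rule then declares the channel occupied when the posterior exceeds 0.5 (or a stricter threshold chosen to control the false-alarm rate).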
Figure 4: Impact of the m and c values for the different fading models and different SNR values in dB.
Figure 5: Impact of the mapping functions for the different fading models and different SNR values in dB.
Figure 6: Sum of the differences from the optimal value (across all the SNR values considered in the study) for different window sizes using IFDE. The line indicates the sum of the differences for the adaptive method (i.e., AIFDE), for comparison with the different window sizes. Each panel corresponds to a different channel propagation model (i.e., TDL from [31]).
Figure 7: Results with different sizes of the data sets reduced from the initial data set through the instance selection process. AIFDE was used.
Figure 8: Bar graphs of the results with different sizes of the data sets reduced from the initial data set through the instance selection process, at two SNR levels. AIFDE was used.
Figure 9: Results of the comparison of IFDE and IDE against FDE and DE across different TDL models and different SNR levels.
Figure 10: Results for the different machine learning algorithms. AIFDE is used.
Figure 11: Results of the comparison across different TDL models.
Figure 12: Results of the comparison across different TDL models.
Comparison of the approaches: detailed values of the two performance metrics at the two SNR levels considered.
| Approach | | | | |
|---|---|---|---|---|
| ED | 0.0083 | 0.08 | 0.9916 | 0.919 |
| Shannon entropy | 0.00447 | 0.5322 | 0.955 | 0.4677 |
| Renyi entropy | 0.0255 | 0.5162 | 0.975 | 0.50 |
| Renyi entropy | 0.028 | 0.4953 | 0.975 | 0.504 |
| Renyi entropy | 0.0156 | 0.4104 | 0.984 | 0.589 |
| IDE (…) | 0.4229 | 0.7515 | 0.577 | 0.248 |
| IDE (…) | 0.2984 | 0.735 | 0.714 | 0.264 |
| AIDE | 0.2854 | 0.742 | 0.701 | 0.2578 |
| IFDE (…) | 0.4322 | 0.826 | 0.567 | 0.1694 |
| IFDE (…) | 0.365 | 0.823 | 0.634 | 0.1706 |
| AIFDE | 0.3958 | 0.827 | 0.604 | 0.1729 |
| | | | | |
| ED | 0.001 | 0.1125 | 0.9989 | 0.8875 |
| Shannon entropy | 0.0029 | 0.5489 | 0.973 | 0.451 |
| Renyi entropy | 0.021 | 0.516 | 0.98 | 0.4833 |
| Renyi entropy | 0.019 | 0.464 | 0.98 | 0.535 |
| Renyi entropy | 0.012 | 0.401 | 0.998 | 0.598 |
| IDE (…) | 0.294 | 0.752 | 0.705 | 0.247 |
| IDE (…) | 0.302 | 0.723 | 0.6975 | 0.276 |
| AIDE | 0.42 | 0.741 | 0.575 | 0.258 |
| IFDE (…) | 0.38 | 0.801 | 0.619 | 0.1937 |
| IFDE (…) | 0.455 | 0.8 | 0.577 | 0.1963 |
| AIFDE | 0.425 | 0.806 | 0.585 | 0.187 |
| | | | | |
| ED | 0.0078 | 0.1567 | 0.992 | 0.843 |
| Shannon entropy | 0.0023 | 0.4953 | 0.976 | 0.5047 |
| Renyi entropy | 0.0177 | 0.467 | 0.982 | 0.5328 |
| Renyi entropy | 0.0093 | 0.414 | 0.986 | 0.5854 |
| Renyi entropy | 0.005 | 0.344 | 0.9948 | 0.6557 |
| IDE (…) | 0.38 | 0.768 | 0.6135 | 0.2318 |
| IDE (…) | 0.28 | 0.793 | 0.7198 | 0.2068 |
| AIDE | 0.2755 | 0.786 | 0.7245 | 0.2135 |
| IFDE (…) | 0.397 | 0.838 | 0.6026 | 0.1615 |
| IFDE (…) | 0.421 | 0.824 | 0.5781 | 0.1745 |
| AIFDE | 0.4828 | 0.826 | 0.5172 | 0.174 |
| | | | | |
| ED | 0.013 | 0.232 | 0.988 | 0.768 |
| Shannon entropy | 0.010 | 0.366 | 0.990 | 0.634 |
| Renyi entropy | 0.007 | 0.309 | 0.993 | 0.691 |
| Renyi entropy | 0.003 | 0.216 | 0.997 | 0.784 |
| Renyi entropy | 0.006 | 0.207 | 0.994 | 0.793 |
| IDE (…) | 0.180 | 0.535 | 0.820 | 0.465 |
| IDE (…) | 0.226 | 0.494 | 0.774 | 0.506 |
| AIDE | 0.209 | 0.485 | 0.791 | 0.515 |
| IFDE (…) | 0.226 | 0.561 | 0.774 | 0.439 |
| IFDE (…) | 0.197 | 0.513 | 0.803 | 0.488 |
| AIFDE | 0.221 | 0.528 | 0.779 | 0.472 |
| | | | | |
| ED | 0.010 | 0.105 | 0.990 | 0.895 |
| Shannon entropy | 0.021 | 0.488 | 0.979 | 0.513 |
| Renyi entropy | 0.019 | 0.481 | 0.981 | 0.519 |
| Renyi entropy | 0.004 | 0.412 | 0.996 | 0.588 |
| Renyi entropy | 0.005 | 0.419 | 0.995 | 0.581 |
| IDE (…) | 0.336 | 0.667 | 0.664 | 0.333 |
| IDE (…) | 0.262 | 0.694 | 0.738 | 0.306 |
| AIDE | 0.233 | 0.689 | 0.767 | 0.311 |
| IFDE (…) | 0.392 | 0.682 | 0.608 | 0.318 |
| IFDE (…) | 0.357 | 0.731 | 0.643 | 0.269 |
| AIFDE | 0.368 | 0.737 | 0.632 | 0.263 |
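Metrics of the kind tabulated above, the probability of detection and the probability of false alarm commonly used in spectrum sensing evaluations, can be computed from binary sensing decisions as in this generic sketch (not the paper's evaluation code):

```python
import numpy as np

def sensing_metrics(truth, decision):
    """Detection and false-alarm probabilities from binary decisions.
    truth: 1 when the higher-priority signal is actually present.
    decision: 1 when the detector declares the channel occupied."""
    truth = np.asarray(truth, dtype=bool)
    decision = np.asarray(decision, dtype=bool)
    p_d = float(np.mean(decision[truth]))    # detections among occupied slots
    p_fa = float(np.mean(decision[~truth]))  # alarms among idle slots
    return p_d, p_fa

# Toy example: 3 of the 4 occupied slots are detected and 1 of the
# 4 idle slots raises a false alarm.
p_d, p_fa = sensing_metrics([1, 1, 1, 0, 0, 0, 1, 0],
                            [1, 1, 0, 0, 1, 0, 1, 0])
```

A good detector drives the detection probability towards 1 while keeping the false-alarm probability near 0; the trade-off between the two is what the threshold (or classifier decision rule) controls.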