Abstract
Text mining is an important research direction that involves several fields, such as information retrieval, information extraction, and text categorization. In this paper, we propose an efficient multiple classifier approach to text categorization based on swarm-optimized topic modelling. Latent Dirichlet allocation (LDA) can overcome the high-dimensionality problem of the vector space model, but identifying appropriate parameter values is critical to the performance of LDA. The swarm-optimized approach estimates the parameters of LDA, including the number of topics and all the other parameters involved in LDA. The hybrid ensemble pruning approach based on combined diversity measures and clustering aims to obtain a multiple classifier system with high predictive performance and better diversity. In this scheme, four different diversity measures (namely, the disagreement measure, the Q-statistic, the correlation coefficient, and the double-fault measure) among the classifiers of the ensemble are combined. Based on the combined diversity matrix, a swarm intelligence based clustering algorithm is employed to partition the classifiers into a number of disjoint groups, and one classifier (with the highest predictive performance) from each cluster is selected to build the final multiple classifier system. Experiments were conducted on five biomedical text benchmarks. In the swarm-optimized LDA, different metaheuristic algorithms (genetic algorithms, particle swarm optimization, the firefly algorithm, the cuckoo search algorithm, and the bat algorithm) are considered. In the ensemble pruning, five metaheuristic clustering algorithms are evaluated. The experimental results on the biomedical text benchmarks indicate that swarm-optimized LDA yields better predictive performance than conventional LDA. In addition, the proposed multiple classifier system outperforms conventional classification algorithms, ensemble learning, and ensemble pruning methods.
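The four pairwise diversity measures named in the abstract can be computed directly from the per-sample correctness of a classifier pair. The sketch below is a minimal illustration (not the paper's code): it builds the standard 2×2 agreement counts for two classifiers and derives the disagreement measure, Q-statistic, correlation coefficient, and double-fault measure; the example correctness vectors are made up.

```python
import numpy as np

def pairwise_diversity(correct_i, correct_j):
    """Pairwise diversity measures for two classifiers.

    correct_i / correct_j are boolean arrays: True where the classifier
    predicted the sample correctly. Note that Q and the correlation are
    undefined when the denominator products are zero.
    """
    ci = np.asarray(correct_i, dtype=bool)
    cj = np.asarray(correct_j, dtype=bool)
    n = len(ci)
    n11 = np.sum(ci & cj)    # both correct
    n00 = np.sum(~ci & ~cj)  # both wrong
    n10 = np.sum(ci & ~cj)   # only classifier i correct
    n01 = np.sum(~ci & cj)   # only classifier j correct
    disagreement = (n01 + n10) / n
    q_statistic = (n11 * n00 - n01 * n10) / (n11 * n00 + n01 * n10)
    correlation = (n11 * n00 - n01 * n10) / np.sqrt(
        (n11 + n10) * (n01 + n00) * (n11 + n01) * (n10 + n00))
    double_fault = n00 / n
    return disagreement, q_statistic, correlation, double_fault

# Hypothetical correctness vectors for two classifiers on 8 samples
a = np.array([1, 1, 1, 0, 0, 1, 0, 1], dtype=bool)
b = np.array([1, 0, 1, 0, 1, 1, 0, 0], dtype=bool)
dis, q, rho, df = pairwise_diversity(a, b)
# dis = 0.375, q = 0.5, df = 0.25
```

In the paper, these four measures are computed over all classifier pairs and combined into a single diversity matrix before the swarm-based clustering step.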
Year: 2018 PMID: 30140300 PMCID: PMC6081524 DOI: 10.1155/2018/2497471
Source DB: PubMed Journal: Comput Math Methods Med ISSN: 1748-670X Impact factor: 2.238
Box 1. The generative process of LDA (Blei et al., 2003; [19, 20]).
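The generative process in Box 1 can be simulated in a few lines. The sketch below uses illustrative sizes and symmetric Dirichlet hyperparameters (not the values tuned in the paper) and follows the generative story step by step: draw topic-word distributions, then per-document topic proportions, then words.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative sizes and hyperparameters (assumptions, not the paper's):
# K topics, V vocabulary terms, D documents, N words per document.
K, V, D, N = 3, 20, 5, 30
alpha, beta = 0.5, 0.1

# 1) For each topic k, draw a topic-word distribution phi_k ~ Dir(beta)
phi = rng.dirichlet([beta] * V, size=K)

docs = []
for _ in range(D):
    # 2) For each document d, draw topic proportions theta_d ~ Dir(alpha)
    theta = rng.dirichlet([alpha] * K)
    # 3) For each word position: draw a topic z ~ Mult(theta_d),
    #    then draw the word w ~ Mult(phi_z)
    z = rng.choice(K, size=N, p=theta)
    docs.append([rng.choice(V, p=phi[k]) for k in z])
```

Inference in LDA runs this story in reverse: given only `docs`, it estimates `phi` and `theta`, which is why the hyperparameters (and the number of topics K) matter so much for the quality of the recovered representation.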
Figure 1. The graphical representation of LDA [22].
Figure 2. Swarm-optimized latent Dirichlet allocation.
Figure 3. Diversity-based ensemble pruning approach.
Classification algorithms used to build the model library.
| Classifier family | Algorithms |
|---|---|
| Bayesian classifiers (5) | Bayesian logistic regression (with norm-based hyperparameter selection), Bayesian logistic regression (with cross-validated hyperparameter selection), Bayesian logistic regression (with specific-value-based hyperparameter selection), Naive Bayes, Naive Bayes Multinomial |
| Function-based classifiers (14) | FLDA, Kernel logistic regression (with Poly kernel), Kernel logistic regression (with Normalized Poly kernel), LibLINEAR (with L2-regularized logistic regression), LibLINEAR (with L2-regularized L2-loss support vector classification), LibLINEAR (with L1-regularized logistic regression), LibSVM (with radial basis function kernel), LibSVM (with linear kernel), LibSVM (with polynomial kernel), LibSVM (with sigmoid kernel), Multilayer perceptron, Radial basis function networks, Logistic regression, Gaussian radial basis function networks |
| Instance-based classifiers (10) | KNN (with K: 1), KNN (with K: 2), KNN (with K: 3), KNN (with K: 4), KNN (with K: 5), KNN (with K: 6), KNN (with K: 7), KNN (with K: 8), KNN (with K: 9), KNN (with K: 10) |
| Rule-based classifiers (3) | FURIA (with Product T-norm), FURIA (with Minimum T-norm), RIPPER |
| Decision tree classifiers (8) | BFTree (Unpruned), BFTree (Post-pruning), BFTree (Pre-pruning), Functional Tree, C4.5 (J48), NBTree, Random Forest, Random Tree |
Table 2 is reproduced from Onan et al. [19, 20] (under the Creative Commons Attribution License/public domain).
Box 2. The general structure of the firefly clustering algorithm.
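The structure in Box 2 follows the standard firefly algorithm: each firefly encodes a candidate set of cluster centroids, brightness is the clustering fitness, and dimmer fireflies move toward brighter ones with an attractiveness that decays with distance. The following is a minimal sketch under stated simplifications (it ignores the label-switching between centroid sets that a production implementation would handle, and all parameter values are illustrative, not the paper's):

```python
import numpy as np

def firefly_clustering(X, k=2, n_fireflies=10, iters=50,
                       beta0=1.0, gamma=1.0, alpha=0.2, seed=1):
    """Minimal firefly-algorithm clustering sketch.

    Each firefly encodes k candidate centroids; brightness is the
    negative within-cluster sum of squared errors (SSE). Dimmer
    fireflies move toward brighter ones with distance-decaying
    attractiveness plus a small random step.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Initialise each firefly as k centroids sampled from the data
    pop = X[rng.choice(n, size=(n_fireflies, k))].copy()

    def sse(centroids):
        dists = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
        return dists.min(axis=1).sum()

    fitness = np.array([-sse(p) for p in pop])
    for _ in range(iters):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if fitness[j] > fitness[i]:  # j is brighter: i moves toward j
                    r2 = ((pop[i] - pop[j]) ** 2).sum()
                    attract = beta0 * np.exp(-gamma * r2)
                    pop[i] += attract * (pop[j] - pop[i]) \
                        + alpha * rng.normal(size=(k, d)) * 0.1
                    fitness[i] = -sse(pop[i])
    best = pop[np.argmax(fitness)]
    labels = ((X[:, None, :] - best[None, :, :]) ** 2).sum(-1).argmin(axis=1)
    return best, labels

# Demo on two well-separated synthetic 2-D blobs
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, (20, 2)),
               rng.normal(5.0, 0.1, (20, 2))])
centroids, labels = firefly_clustering(X, k=2)
```

In the paper's ensemble pruning, the points being clustered are the classifiers themselves (rows of the combined diversity matrix) rather than raw data vectors, but the search structure is the same.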
Descriptive information for the datasets.
| Dataset | Number of documents | Number of terms | Average document length | Number of classes |
|---|---|---|---|---|
| Oh5 | 918 | 3013 | 54.43 | 10 |
| Oh10 | 1050 | 3239 | 55.63 | 10 |
| Oh15 | 3101 | 54142 | 17.46 | 10 |
| Ohscal | 11162 | 11466 | 60.38 | 10 |
| Ohsumed-400 | 9200 | 13512 | 55.14 | 12 |
Parameters of the metaheuristic algorithms utilized in swarm-based LDA.
| Algorithm | Parameters |
|---|---|
| Genetic algorithms | Crossover probability: 0.6, Number of generations to evaluate: 20, Mutation probability: 0.033, Population size: 20, Seed: 1 |
| Particle swarm optimization | Individual weight: 0.34, Inertia weight: 0.33, Number of iterations: 20, Mutation probability: 0.01, Mutation type: bit-flip, Population size: 20, Seed: 1, Social weight: 0.33 |
| Firefly algorithm | |
| Cuckoo search algorithm | |
| Bat algorithm | |
Parameters of the metaheuristic algorithms utilized in ensemble pruning.
| Algorithm | Parameters |
|---|---|
| Genetic clustering | Crossover probability: 0.6, Number of generations to evaluate: 20, Mutation probability: 0.033, Population size: 20, Seed: 1, k: 20 |
| Particle swarm clustering | Individual weight: 0.70, |
| Firefly clustering | |
| Cuckoo clustering | |
| Bat clustering | |
Classification accuracies obtained with different LDA-based configurations.
| Configuration | oh5 | oh10 | oh15 | ohscal | Ohsumed | oh5 | oh10 | oh15 | ohscal | Ohsumed |
|---|---|---|---|---|---|---|---|---|---|---|
| LDA (k=50) | 74.38 | 66.66 | 69.40 | 59.27 | 28.35 | 76.24 | 78.73 | 83.17 | 70.62 | 34.64 |
| LDA (k=100) | 70.85 | 63.64 | 67.44 | 60.05 | 29.56 | 78.28 | 78.25 | 83.23 | 73.23 | 38.82 |
| LDA (k=150) | 69.02 | 65.24 | 65.51 | 59.01 | 29.43 | 76.72 | 79.09 | 84.74 | 73.80 | 41.27 |
| LDA (k=200) | 66.17 | 64.01 | 63.61 | 58.93 | 27.99 | 77.33 | 77.93 | 84.00 | 74.19 | 41.82 |
| GA-LDA (BIC) | 75.16 | 67.24 | 74.70 | 71.66 | 35.45 | 77.98 | 69.03 | 75.12 | 73.62 | 35.83 |
| PSO-LDA (BIC) | 75.40 | 68.60 | 76.90 | 72.43 | 35.46 | 78.22 | 72.56 | 75.17 | 75.89 | 36.23 |
| FA-LDA (BIC) | 75.48 | 71.26 | 77.48 | 72.80 | 35.60 | 79.50 | 74.73 | 76.63 | 76.90 | 37.69 |
| CSA-LDA (BIC) | 76.66 | 71.96 | 78.77 | 72.94 | 35.65 | 79.56 | 75.97 | 77.96 | 77.02 | 37.94 |
| BA-LDA (BIC) | 78.82 | 72.21 | 79.77 | 73.02 | 36.58 | 79.85 | 76.53 | 78.89 | 77.34 | 38.89 |
| GA-LDA (CH) | 79.02 | 72.88 | 80.11 | 74.53 | 36.85 | 80.62 | 77.72 | 80.31 | 78.17 | 38.96 |
| PSO-LDA (CH) | 80.20 | 72.93 | 80.66 | 74.76 | 37.03 | 81.50 | 77.91 | 80.50 | 78.99 | 39.03 |
| FA-LDA (CH) | 81.20 | 72.99 | 80.72 | 75.13 | 37.75 | 81.80 | 77.99 | 80.55 | 79.09 | 39.03 |
| CSA-LDA (CH) | 81.40 | 73.12 | 81.71 | 76.02 | 38.34 | 82.61 | 78.01 | 80.78 | 79.82 | 39.03 |
| BA-LDA (CH) | 81.46 | 73.49 | 81.82 | 76.21 | 39.24 | 82.87 | 78.93 | 81.01 | 79.89 | 39.52 |
| GA-LDA (DB) | 84.46 | 76.22 | 84.13 | 78.71 | 40.50 | 84.73 | 80.95 | 85.88 | 82.46 | 43.02 |
| PSO-LDA (DB) | 84.60 | 80.07 | 85.14 | 79.21 | 42.57 | 85.13 | 81.11 | 86.17 | 84.22 | 43.51 |
| FA-LDA (DB) | 85.89 | 80.82 | 85.17 | 80.83 | 44.60 | 86.22 | 81.88 | 86.73 | 84.62 | 44.61 |
| CSA-LDA (DB) | | | | | | | | | | |
| BA-LDA (DB) | | | | | | | | | | |
| GA-LDA (SI) | 81.57 | 73.57 | 82.03 | 76.48 | 39.36 | 83.21 | 79.00 | 82.24 | 79.93 | 40.58 |
| PSO-LDA (SI) | 82.61 | 73.76 | 82.50 | 76.61 | 39.66 | 83.58 | 79.33 | 83.03 | 80.36 | 40.87 |
| FA-LDA (SI) | 83.19 | 74.18 | 82.88 | 77.47 | 39.68 | 83.69 | 79.41 | 83.11 | 80.95 | 40.95 |
| CSA-LDA (SI) | 83.78 | 75.11 | 83.01 | 78.06 | 39.69 | 83.84 | 80.83 | 84.47 | 81.82 | 41.12 |
| BA-LDA (SI) | 84.11 | 76.08 | 83.03 | 78.13 | 40.08 | 84.49 | 80.90 | 85.52 | 81.99 | 42.65 |
LDA: latent Dirichlet allocation, GA-LDA: genetic algorithm based LDA, PSO-LDA: particle swarm optimization based LDA, FA-LDA: firefly algorithm based LDA, CSA-LDA: cuckoo search algorithm based LDA, BA-LDA: bat algorithm based LDA, BIC: Bayesian information criterion, CH: Calinski-Harabasz index, DB: Davies-Bouldin index, and SI: Silhouette index.
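The cluster validity indices listed above (BIC, CH, DB, SI) serve as the fitness functions that the metaheuristics optimize: a candidate LDA configuration is scored by how cleanly the induced document-topic representation partitions the corpus. Below is a minimal numpy sketch of one of them, the Davies-Bouldin index, applied to two hypothetical document-topic matrices (both example matrices are made up for illustration; a sharply peaked topic assignment should score lower, i.e. better, than a diffuse one):

```python
import numpy as np

def davies_bouldin(X, labels):
    """Davies-Bouldin index (lower is better): mean over clusters of the
    worst ratio of within-cluster scatter to between-centroid distance."""
    ks = np.unique(labels)
    cents = np.array([X[labels == k].mean(axis=0) for k in ks])
    scat = np.array([np.linalg.norm(X[labels == k] - cents[i], axis=1).mean()
                     for i, k in enumerate(ks)])
    db = 0.0
    for i in range(len(ks)):
        ratios = [(scat[i] + scat[j]) / np.linalg.norm(cents[i] - cents[j])
                  for j in range(len(ks)) if j != i]
        db += max(ratios)
    return db / len(ks)

# Hypothetical document-topic proportions from two candidate LDA models;
# each document is assigned to its dominant topic.
theta_sharp = np.array([[0.9, 0.1], [0.85, 0.15], [0.1, 0.9], [0.2, 0.8]])
theta_diffuse = np.array([[0.6, 0.4], [0.55, 0.45], [0.45, 0.55], [0.4, 0.6]])

db_sharp = davies_bouldin(theta_sharp, theta_sharp.argmax(axis=1))
db_diffuse = davies_bouldin(theta_diffuse, theta_diffuse.argmax(axis=1))
# db_sharp < db_diffuse: the sharper model wins under the DB criterion
```

This matches the pattern in the table above, where DB-driven configurations tend to outperform the BIC- and CH-driven ones.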
Classification results obtained by conventional algorithms and the proposed diversity-based ensemble pruning (with LDA (k=50) based representation).
| Algorithm | oh5 | oh10 | oh15 | ohscal | Ohsumed |
|---|---|---|---|---|---|
| NB | 75.19 | 67.43 | 70.77 | 60.24 | 29.41 |
| SVM | 77.59 | 80.29 | 84.47 | 71.58 | 34.72 |
| Bagging+NB | 76.08 | 69.77 | 70.94 | 60.21 | 29.21 |
| Bagging+SVM | 84.36 | 77.20 | 79.07 | 71.92 | 35.98 |
| AdaBoost+NB | 73.53 | 68.07 | 70.26 | 60.09 | 29.60 |
| AdaBoost+SVM | 84.06 | 77.19 | 78.88 | 72.08 | 35.03 |
| RandomSubspace+NB | 74.75 | 67.29 | 68.51 | 57.58 | 28.60 |
| RandomSubspace+SVM | 78.02 | 69.89 | 71.22 | 67.65 | 31.80 |
| Stacking | 83.78 | 81.32 | 81.69 | 60.02 | 40.76 |
| ESM | 79.25 | 79.07 | 78.91 | 72.52 | 37.84 |
| BES | 80.11 | 80.61 | 81.08 | 73.02 | 40.04 |
| LibD3C | 82.86 | 82.93 | 84.51 | 74.86 | 41.17 |
| CDM | 84.77 | 84.13 | 85.32 | 76.45 | 43.55 |
| DEP (Genetic clustering) | 81.61 | 81.96 | 84.64 | 74.21 | 43.27 |
| DEP (PSO clustering) | 80.91 | 81.41 | 83.31 | 73.98 | |
| DEP (Firefly clustering) | | | | | |
| DEP (Cuckoo clustering) | 83.00 | | | | 45.43 |
| DEP (Bat clustering) | 84.47 | | 82.11 | 72.70 | 44.13 |
NB: Naïve Bayes algorithm, SVM: support vector machines, ESM: ensemble selection from libraries of models, BES: Bagging ensemble selection, LibD3C: hybrid ensemble pruning based on k-means and dynamic selection, CDM: ensemble pruning based on combined diversity measures, and DEP: the proposed diversity-based ensemble pruning.
Comparison of the proposed text categorization scheme with conventional classifiers, ensemble learners, and ensemble pruning method (with BA-LDA (DB) based representation).
| Algorithm | oh5 | oh10 | oh15 | ohscal | Ohsumed |
|---|---|---|---|---|---|
| NB | 87.67 | 81.42 | 87.44 | 83.64 | 47.09 |
| SVM | 88.97 | 82.22 | 88.16 | 85.32 | 50.08 |
| Bagging+NB | 89.32 | 83.35 | 88.87 | 83.47 | 48.52 |
| Bagging+SVM | 88.03 | 84.84 | 87.86 | 83.92 | 50.73 |
| AdaBoost+NB | 89.77 | 83.60 | 87.48 | 86.18 | 51.18 |
| AdaBoost+SVM | 88.18 | 84.95 | 87.35 | 86.29 | 51.85 |
| RandomSubspace+NB | 88.32 | 83.96 | 86.66 | 88.09 | 50.70 |
| RandomSubspace+SVM | 88.56 | 84.11 | 89.58 | 88.29 | 50.29 |
| Stacking | 88.28 | 86.87 | 88.93 | 84.90 | 53.84 |
| ESM | 88.58 | 86.66 | 90.25 | 88.48 | 51.94 |
| BES | 89.29 | 86.00 | 90.98 | 89.12 | 52.47 |
| LibD3C | 90.35 | 87.95 | 91.27 | 90.48 | 53.41 |
| CDM | | | | | |
| Proposed scheme | | | | | |
NB: Naïve Bayes algorithm, SVM: support vector machines, ESM: ensemble selection from libraries of models, BES: Bagging ensemble selection, LibD3C: hybrid ensemble pruning based on k-means and dynamic selection, and CDM: ensemble pruning based on combined diversity measures.
The macro-averaged F-measure results obtained with different LDA-based configurations.
| Configuration | oh5 | oh10 | oh15 | ohscal | Ohsumed | oh5 | oh10 | oh15 | ohscal | Ohsumed |
|---|---|---|---|---|---|---|---|---|---|---|
| LDA (k=50) | 0.75 | 0.68 | 0.71 | 0.61 | 0.30 | 0.77 | 0.80 | 0.85 | 0.73 | 0.36 |
| LDA (k=100) | 0.72 | 0.65 | 0.69 | 0.62 | 0.31 | 0.79 | 0.80 | 0.85 | 0.75 | 0.40 |
| LDA (k=150) | 0.70 | 0.67 | 0.67 | 0.61 | 0.31 | 0.77 | 0.81 | 0.86 | 0.76 | 0.43 |
| LDA (k=200) | 0.67 | 0.65 | 0.65 | 0.61 | 0.29 | 0.78 | 0.80 | 0.86 | 0.76 | 0.44 |
| GA-LDA (BIC) | 0.76 | 0.69 | 0.76 | 0.74 | 0.37 | 0.79 | 0.70 | 0.77 | 0.76 | 0.37 |
| PSO-LDA (BIC) | 0.76 | 0.70 | 0.78 | 0.75 | 0.37 | 0.79 | 0.74 | 0.77 | 0.78 | 0.38 |
| FA-LDA (BIC) | 0.76 | 0.73 | 0.79 | 0.75 | 0.37 | 0.80 | 0.76 | 0.78 | 0.79 | 0.39 |
| CSA-LDA (BIC) | 0.77 | 0.73 | 0.80 | 0.75 | 0.37 | 0.80 | 0.78 | 0.80 | 0.79 | 0.40 |
| BA-LDA (BIC) | 0.80 | 0.74 | 0.81 | 0.75 | 0.38 | 0.81 | 0.78 | 0.81 | 0.80 | 0.41 |
| GA-LDA (CH) | 0.80 | 0.74 | 0.82 | 0.77 | 0.38 | 0.81 | 0.79 | 0.82 | 0.81 | 0.41 |
| PSO-LDA (CH) | 0.81 | 0.74 | 0.82 | 0.77 | 0.39 | 0.82 | 0.79 | 0.82 | 0.81 | 0.41 |
| FA-LDA (CH) | 0.82 | 0.74 | 0.82 | 0.77 | 0.39 | 0.83 | 0.80 | 0.82 | 0.82 | 0.41 |
| CSA-LDA (CH) | 0.82 | 0.75 | 0.83 | 0.78 | 0.40 | 0.83 | 0.80 | 0.82 | 0.82 | 0.41 |
| BA-LDA (CH) | 0.82 | 0.75 | 0.83 | 0.79 | 0.41 | 0.84 | 0.81 | 0.83 | 0.82 | 0.41 |
| GA-LDA (DB) | 0.85 | 0.78 | 0.86 | 0.81 | 0.42 | 0.86 | 0.83 | 0.88 | 0.85 | 0.45 |
| PSO-LDA (DB) | 0.85 | | 0.87 | 0.82 | 0.44 | 0.86 | 0.83 | 0.88 | 0.87 | 0.45 |
| FA-LDA (DB) | | | 0.87 | 0.83 | 0.46 | 0.87 | | | 0.87 | 0.46 |
| CSA-LDA (DB) | | | | | | | | | | |
| BA-LDA (DB) | | | | | | | | | | |
| GA-LDA (SI) | 0.82 | 0.75 | 0.84 | 0.79 | 0.41 | 0.84 | 0.81 | 0.84 | 0.82 | 0.42 |
| PSO-LDA (SI) | 0.83 | 0.75 | 0.84 | 0.79 | 0.41 | 0.84 | 0.81 | 0.85 | 0.83 | 0.43 |
| FA-LDA (SI) | 0.84 | 0.76 | 0.85 | 0.80 | 0.41 | 0.85 | 0.81 | 0.85 | 0.83 | 0.43 |
| CSA-LDA (SI) | 0.85 | 0.77 | 0.85 | 0.80 | 0.41 | 0.85 | 0.82 | 0.86 | 0.84 | 0.43 |
| BA-LDA (SI) | 0.85 | 0.78 | 0.85 | 0.81 | 0.42 | 0.85 | 0.83 | 0.87 | 0.85 | 0.44 |
LDA: latent Dirichlet allocation, GA-LDA: genetic algorithm based LDA, PSO-LDA: particle swarm optimization based LDA, FA-LDA: firefly algorithm based LDA, CSA-LDA: cuckoo search algorithm based LDA, BA-LDA: bat algorithm based LDA, BIC: Bayesian information criterion, CH: Calinski-Harabasz index, DB: Davies-Bouldin index, and SI: Silhouette index.
The macro-averaged F-measure results obtained by conventional algorithms and the proposed diversity-based ensemble pruning (with LDA (k=50) based representation).
| Algorithm | oh5 | oh10 | oh15 | ohscal | Ohsumed |
|---|---|---|---|---|---|
| NB | 0.76 | 0.68 | 0.72 | 0.61 | 0.30 |
| SVM | 0.78 | 0.81 | 0.86 | 0.73 | 0.35 |
| Bagging+NB | 0.77 | 0.70 | 0.72 | 0.61 | 0.30 |
| Bagging+SVM | 0.85 | 0.78 | 0.81 | 0.73 | 0.37 |
| AdaBoost+NB | 0.74 | 0.69 | 0.72 | 0.61 | 0.31 |
| AdaBoost+SVM | 0.85 | 0.78 | 0.80 | 0.74 | 0.36 |
| RandomSubspace+NB | 0.76 | 0.68 | 0.70 | 0.59 | 0.29 |
| RandomSubspace+SVM | 0.79 | 0.71 | 0.73 | 0.69 | 0.33 |
| Stacking | 0.84 | 0.80 | 0.81 | 0.72 | 0.38 |
| ESM | 0.80 | 0.81 | 0.81 | 0.74 | 0.39 |
| BES | 0.81 | 0.82 | 0.83 | 0.75 | 0.41 |
| LibD3C | 0.84 | 0.85 | 0.86 | 0.76 | 0.42 |
| CDM | | | | | 0.45 |
| DEP (Genetic clustering) | 0.82 | 0.84 | 0.86 | 0.76 | 0.45 |
| DEP (PSO clustering) | 0.82 | 0.83 | 0.85 | 0.75 | |
| DEP (Firefly clustering) | | | | | |
| DEP (Cuckoo clustering) | 0.85 | | | | |
| DEP (Bat clustering) | 0.85 | | 0.84 | 0.74 | 0.45 |
NB: Naïve Bayes algorithm, SVM: support vector machines, ESM: ensemble selection from libraries of models, BES: Bagging ensemble selection, LibD3C: hybrid ensemble pruning based on k-means and dynamic selection, CDM: ensemble pruning based on combined diversity measures, and DEP: the proposed diversity-based ensemble pruning.
The macro-averaged F-measure results of methods (with BA-LDA (DB) based representation).
| Algorithm | oh5 | oh10 | oh15 | ohscal | Ohsumed |
|---|---|---|---|---|---|
| NB | 0.89 | 0.82 | 0.88 | 0.84 | 0.48 |
| SVM | 0.90 | 0.83 | 0.89 | 0.86 | 0.51 |
| Bagging+NB | 0.90 | 0.84 | 0.90 | 0.84 | 0.49 |
| Bagging+SVM | 0.89 | 0.86 | 0.89 | 0.85 | 0.51 |
| AdaBoost+NB | 0.91 | 0.84 | 0.88 | 0.87 | 0.52 |
| AdaBoost+SVM | 0.89 | 0.86 | 0.88 | 0.87 | 0.52 |
| RandomSubspace+NB | 0.90 | 0.86 | 0.88 | 0.90 | 0.52 |
| RandomSubspace+SVM | 0.90 | 0.86 | 0.91 | 0.90 | 0.51 |
| Stacking | 0.90 | 0.87 | 0.91 | 0.88 | 0.54 |
| ESM | 0.90 | 0.88 | 0.92 | 0.90 | 0.53 |
| BES | 0.93 | 0.90 | 0.95 | 0.93 | 0.55 |
| LibD3C | 0.94 | 0.92 | 0.95 | 0.94 | 0.56 |
| CDM | | | | | |
| Proposed scheme | | | | | |
NB: Naïve Bayes algorithm, SVM: support vector machines, ESM: ensemble selection from libraries of models, BES: Bagging ensemble selection, LibD3C: hybrid ensemble pruning based on k-means and dynamic selection, and CDM: ensemble pruning based on combined diversity measures.
Two-way ANOVA test results of classification accuracy values.
| Source | DF | SS | MS | F | P |
|---|---|---|---|---|---|
| Configuration | 23 | 4073.9 | 177.1 | 90.50 | <0.001 |
| Dataset | 4 | 60336.7 | 15084.2 | 7707.50 | <0.001 |
| Classifier | 1 | 881.0 | 881.0 | 450.15 | <0.001 |
| Configuration × Dataset | 92 | 334.0 | 3.6 | 1.85 | <0.001 |
| Configuration × Classifier | 23 | 932.9 | 40.6 | 20.73 | <0.001 |
| Dataset × Classifier | 4 | 106.3 | 26.6 | 13.57 | <0.001 |
| Error | 92 | 180.1 | 2.0 | | |
| Total | 239 | 66844.8 | | | |

| Source | DF | SS | MS | F | P |
|---|---|---|---|---|---|
| Configuration | 17 | 2691.7 | 158.34 | 25.86 | <0.001 |
| Dataset | 4 | 23128.7 | 5782.17 | 944.48 | <0.001 |
| Error | 68 | 416.3 | 6.12 | | |
| Total | 89 | | | | |

| Source | DF | SS | MS | F | P |
|---|---|---|---|---|---|
| Configuration | 13 | 324.5 | 24.96 | 17.81 | <0.001 |
| Dataset | 4 | 14736.0 | 3684.00 | 2628.98 | <0.001 |
| Error | 52 | 72.9 | 1.40 | | |
| Total | 69 | 15133.4 | | | |
Two-way ANOVA test results of the macro-averaged F-measure.
| Source | DF | SS | MS | F | P |
|---|---|---|---|---|---|
| Configuration | 23 | 0.42777 | 0.01860 | 91.27 | <0.001 |
| Dataset | 4 | 5.99867 | 1.49967 | 7359.42 | <0.001 |
| Classifier | 1 | 0.09263 | 0.09263 | 454.58 | <0.001 |
| Configuration × Dataset | 92 | 0.03536 | 0.00038 | 1.89 | <0.001 |
| Configuration × Classifier | 23 | 0.09800 | 0.00426 | 20.91 | <0.001 |
| Dataset × Classifier | 4 | 0.01123 | 0.00281 | 13.78 | <0.001 |
| Error | 92 | 0.01875 | 0.00020 | | |
| Total | 239 | 6.68241 | | | |

| Source | DF | SS | MS | F | P |
|---|---|---|---|---|---|
| Configuration | 17 | 0.27733 | 0.016314 | 23.26 | <0.001 |
| Dataset | 4 | 2.41143 | 0.692858 | 859.46 | <0.001 |
| Error | 68 | 0.04770 | 0.000701 | | |
| Total | 89 | 2.73646 | | | |

| Source | DF | SS | MS | F | P |
|---|---|---|---|---|---|
| Configuration | 13 | 0.03613 | 0.002780 | 14.68 | <0.001 |
| Dataset | 4 | 1.53718 | 0.384296 | 2029.89 | <0.001 |
| Error | 52 | 0.00984 | 0.000189 | | |
| Total | 69 | 1.58316 | | | |
Figure 4. Interval plots for compared LDA-based configurations.
Figure 5. Interval plots for classifiers and ensemble pruning methods.
Figure 6. Interval plots for compared algorithms.
Figure 7. Average execution times (in seconds) for compared algorithms.