| Literature DB >> 35004208 |
Carlos Eduardo da Silva Santos1,2, Leandro Dos Santos Coelho1,3, Carlos Humberto Llanos1.
Abstract
Support Vector Machines (SVMs) technique for achieving classifiers and regressors. However, to obtain models with high accuracy and low complexity, it is necessary to define the kernel parameters as well as the parameters of the training model, which are called hyperparameters. The challenge of defining the more suitable value to hyperparameters is called the Parameter Selection Problem (PSP). However, minimizing the complexity and maximizing the generalization capacity of the SVMs are conflicting criteria. Therefore, we propose the Nature Inspired Optimization Tools for SVMs (NIOTS) that offers a method to automate the search process for the best possible solution for the PSP, allowing the user to quickly obtain several sets of good solutions and choose the one most appropriate for his specific problem.•The PSP has been modeled as a Multiobjective Optimization Problem (MOP) with two objectives: (1) good precision and (2) low complexity (low number of support vectors).•The user can evaluate multiple solutions included in the Pareto front, in terms of precision and low complexity of the model.•Apart from the Adaptive Parameter with Mutant Tournament Multiobjective Differential Evolution (APMT-MODE), the user can choose other metaheuristics and also among several kernel options.Entities:
Keywords: Adaptive parameters control; Differential evolution algorithm; Multi-objective optimization problem; Parameters selection problem; Support vectors machines
Year: 2021 PMID: 35004208 PMCID: PMC8720899 DOI: 10.1016/j.mex.2021.101574
Source DB: PubMed Journal: MethodsX ISSN: 2215-0161
Kernel parameters.
| Kernel | Kernel function | Parameters |
|---|---|---|
| Polynomial | K(x, z) = (γ⟨x, z⟩ + r)^d | γ, r, d |
| Gaussian | K(x, z) = exp(−γ‖x − z‖²) | γ |
| Arc cosine (Cho and Saul, 2009) | K(x, z) = (1/π) ‖x‖ⁿ ‖z‖ⁿ Jₙ(θ), with θ the angle between x and z | n |
| Cauchy (Drewnik and Pasternak-Winiarski, 2017) | K(x, z) = 1 / (1 + ‖x − z‖²/σ²) | σ |
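As an illustration of the less common kernels in the table, a Cauchy kernel can be plugged into an SVM through a callable Gram-matrix function. The form K(x, z) = 1 / (1 + ‖x − z‖²/σ²) is the standard one; the paper's exact parameterization may differ, and the dataset and σ value below are arbitrary.

```python
# Hedged sketch: a custom Cauchy kernel used with scikit-learn's SVC via the
# callable-kernel interface (the callable receives two sample matrices and
# must return the Gram matrix between them).
import numpy as np
from sklearn.svm import SVC

def cauchy_kernel(X, Z, sigma=1.0):
    # Pairwise squared Euclidean distances between rows of X and rows of Z.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return 1.0 / (1.0 + d2 / sigma**2)

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(int)   # a simple nonlinear target

clf = SVC(kernel=lambda A, B: cauchy_kernel(A, B, sigma=0.5)).fit(X, y)
print("training accuracy:", clf.score(X, y))
```

In NIOTS the analogous choice is made in the Kernel option of the Machine definition panel, with σ among the hyperparameters being searched.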
Fig. 1 Machine learning flowchart.
Fig. 2 NIOTS optimization interface.
NIOTS options in the optimization panel
| Panel | Options | Descriptions |
|---|---|---|
| Machine definition | Model | SVM/LibSVM |
| | | SVR/LibSVM |
| | | Grid Search SVR |
| | | Grid Search SVM |
| | Optimization algorithm | MOPSO |
| | | APMT-MODE |
| | Kernel | RBF |
| | | Polynomial |
| | | Acos |
| | | Cauchy |
| General parameters | Samples | Number of independent experiments. |
| | Iterations | Number of optimization-algorithm iterations. |
| | Cross-validation | Enables the cross-validation training/validation process. |
| | Random distribution | Uniform |
| | | Normal |
| | | Cauchy |
| | Search space | The |
| | | The |
| | | The |
| Metaheuristic parameters: MOPSO | Swarm size | Number of particles in the swarm. |
| | Initial inertia | Initial PSO inertia factor. |
| | Final inertia | Final PSO inertia factor. |
| | Cognitive coefficient | The particle's confidence in its own best position. |
| | Social coefficient | The particle's confidence in the best position found by the swarm. |
| | Maximum particle speed | The maximum particle speed allowed by PSO. |
| Metaheuristic parameters: AP/APMT-MODE | Population size | Number of Differential Evolution (DE) individuals. |
| | Scale factor | Initial scale factor. |
| | Crossover rate | Initial crossover rate. |
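To clarify what the DE "scale factor" and "crossover rate" parameters in the table control, here is a minimal sketch of one classic DE/rand/1/bin variation step. APMT-MODE adapts these parameters during the run; this sketch keeps them fixed and omits selection.

```python
# One generation of DE/rand/1/bin variation: differential mutation with scale
# factor F, then binomial crossover with rate CR. Population values and the
# (F, CR) settings are arbitrary illustrations.
import numpy as np

def de_step(pop, F=0.8, CR=0.9, rng=None):
    rng = rng or np.random.default_rng()
    n, d = pop.shape
    trial = pop.copy()
    for i in range(n):
        # Three distinct individuals, all different from i.
        a, b, c = rng.choice([j for j in range(n) if j != i], size=3, replace=False)
        mutant = pop[a] + F * (pop[b] - pop[c])   # differential mutation
        cross = rng.random(d) < CR                # binomial crossover mask
        cross[rng.integers(d)] = True             # guarantee at least one gene
        trial[i, cross] = mutant[cross]
    return trial

pop = np.random.default_rng(1).uniform(-1, 1, size=(6, 2))
print(de_step(pop, F=0.8, CR=0.9, rng=np.random.default_rng(2)))
```

In the PSP setting each individual encodes a hyperparameter vector (e.g. C and a kernel parameter), and the trial vectors compete on the two objectives described above.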
Fig. 3 NIOTS input file example.
Fig. 4 NIOTS flowchart options and dataflow.
Fig. 5 The models report file.
Fig. 6 NIOTS report file for SVR; the first three columns are the Pareto set and the last two are the Pareto front.
| Subject Area | Computer Science |
| More specific subject area | Support Vector Machines (SVMs), Machine Learning, and Pattern recognition. |
| Method name | NIOTS: a flexible method for obtaining SVM hyperparameters via nature-inspired metaheuristics, giving the user flexibility in choosing the kernel, the metaheuristic, and the result that best suits the specific problem. |
| Name and reference of original method | Multi-objective adaptive differential evolution for SVM/SVR hyperparameters selection ( |
| Resource availability |