Hongwei Jiang, Bin Zou, Chen Xu, Jie Xu, Yuan Yan Tang.
Abstract
In this article, we introduce the idea of Markov resampling for Boosting methods. We first prove that the Boosting algorithm with a general convex loss function, trained on uniformly ergodic Markov chain (u.e.M.c.) examples, is consistent, and we establish its fast convergence rate. We then apply Boosting based on Markov resampling to the Support Vector Machine (SVM) and introduce two new resampling-based Boosting algorithms: SVM-Boosting based on Markov resampling (SVM-BM) and improved SVM-Boosting based on Markov resampling (ISVM-BM). In contrast to SVM-BM, ISVM-BM uses the support vectors to calculate the weights of the base classifiers. Numerical studies on benchmark datasets show that the two proposed resampling-based SVM Boosting algorithms with linear base classifiers achieve smaller misclassification rates and lower total sampling-and-training time than three classical AdaBoost algorithms: Gentle AdaBoost, Real AdaBoost, and Modest AdaBoost. In addition, we compare the proposed SVM-BM algorithm with the widely used and efficient gradient Boosting algorithm XGBoost (eXtreme Gradient Boosting) and with SVM-AdaBoost, and present some useful discussions of the technical parameters.
Keywords: Boosting; Consistency; Resampling; Uniformly ergodic Markov chain (u.e.M.c.)
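The abstract does not spell out the resampling rule, but Markov resampling schemes of this kind typically build the training set as a Markov chain: each candidate example is accepted with a Metropolis-style probability based on the current predictor's loss. The sketch below is an illustrative assumption, not the authors' SVM-BM algorithm; the acceptance rule `min(1, e^{-loss(candidate)} / e^{-loss(current)})` and the function names `markov_resample` and `margin_loss` are hypothetical.

```python
import math
import random

def markov_resample(data, margin_loss, n, seed=0):
    """Illustrative Markov-chain resampling sketch (assumed acceptance rule).

    data: list of (x, y) examples; margin_loss: (x, y) -> nonnegative loss
    under the current predictor; n: number of examples to draw.
    """
    rng = random.Random(seed)
    current = rng.choice(data)       # initial state of the chain
    sample = [current]
    while len(sample) < n:
        candidate = rng.choice(data)  # uniform proposal
        # Accept with probability min(1, e^{-loss(cand)} / e^{-loss(cur)}):
        # low-loss candidates are accepted more readily, so the chain's
        # stationary distribution depends on the current predictor.
        accept = min(1.0, math.exp(margin_loss(*current) - margin_loss(*candidate)))
        if rng.random() < accept:
            current = candidate       # move the chain
        sample.append(current)        # otherwise repeat the current state
    return sample
```

A base classifier would then be trained on each resampled set, with ensemble weights computed from error rates (SVM-BM) or from the support vectors (ISVM-BM), per the abstract.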
Year: 2020 PMID: 32836044 DOI: 10.1016/j.neunet.2020.07.036
Source DB: PubMed Journal: Neural Netw ISSN: 0893-6080