
Very sparse LSSVM reductions for large-scale data.

Raghvendra Mall, Johan A K Suykens.   

Abstract

Least squares support vector machines (LSSVMs) have been widely applied to classification and regression, with performance comparable to that of SVMs. However, the LSSVM model lacks sparsity and cannot handle large-scale data due to computational and memory constraints. The primal fixed-size LSSVM (PFS-LSSVM) introduces sparsity using the Nyström approximation with a set of prototype vectors (PVs), solving an overdetermined system of linear equations in the primal. However, this solution is not the sparsest. We investigate the sparsity-error tradeoff by introducing a second level of sparsity, achieved through L0-norm-based reductions that iteratively sparsify the LSSVM and PFS-LSSVM models. The exact cardinality chosen for the initial PV set then becomes unimportant, as the final model is highly sparse. The proposed method overcomes memory constraints and high computational costs, yielding highly sparse reductions of LSSVM models. The approximations in the two models allow them to scale to large-scale datasets. Experiments on real-world classification and regression data sets from the UCI repository illustrate that these approaches achieve sparse models without a significant tradeoff in error.
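As a rough illustration of the first level of sparsity described above (not the authors' implementation), the following sketch builds a Nyström feature map from a randomly chosen PV set and solves the resulting overdetermined primal system as a regularized least-squares problem. The RBF kernel, the ridge-style regularization, and all parameter names (`gamma`, `sigma`, `pv_idx`) are illustrative assumptions.

```python
import numpy as np

def pfs_lssvm_primal(X, y, pv_idx, gamma=1.0, sigma=1.0):
    """Sketch of a PFS-LSSVM-style primal solve via a Nystrom feature map.

    pv_idx: indices of the prototype vectors (PVs); gamma, sigma are
    assumed regularization / RBF-bandwidth hyperparameters.
    """
    def rbf(A, B):
        # RBF (Gaussian) kernel matrix between rows of A and rows of B
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    m = len(pv_idx)
    K_mm = rbf(X[pv_idx], X[pv_idx])                      # m x m PV kernel
    lam, U = np.linalg.eigh(K_mm + 1e-10 * np.eye(m))     # eigendecomposition
    # Nystrom approximate feature map: Phi = K_nm U diag(lam)^{-1/2}
    Phi = rbf(X, X[pv_idx]) @ U / np.sqrt(np.maximum(lam, 1e-12))
    Phi_b = np.hstack([Phi, np.ones((len(X), 1))])        # append bias column
    # Regularized least-squares solve of the overdetermined primal system
    A = Phi_b.T @ Phi_b + np.eye(Phi_b.shape[1]) / gamma
    w = np.linalg.solve(A, Phi_b.T @ y)
    return w, Phi_b

# Toy usage on a linearly separable problem
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.sign(X[:, 0] + X[:, 1])
pv_idx = rng.choice(200, size=20, replace=False)          # 20 PVs
w, Phi_b = pfs_lssvm_primal(X, y, pv_idx)
acc = (np.sign(Phi_b @ w) == y).mean()
```

The second-level L0-norm reduction discussed in the abstract would then iteratively prune coefficients of this initial model; that stage is not shown here.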

Year:  2015        PMID: 25751875     DOI: 10.1109/TNNLS.2014.2333879

Source DB:  PubMed          Journal:  IEEE Trans Neural Netw Learn Syst        ISSN: 2162-237X            Impact factor:   10.451


  1 in total

1.  Modelling Compression Strength of Waste PET and SCM Blended Cementitious Grout Using Hybrid of LSSVM Models.

Authors:  Kaffayatullah Khan; Jitendra Gudainiyan; Mudassir Iqbal; Arshad Jamal; Muhammad Nasir Amin; Ibrahim Mohammed; Majdi Adel Al-Faiad; Abdullah M Abu-Arab
Journal:  Materials (Basel)       Date:  2022-07-29       Impact factor: 3.748

