
A Robust Regression Framework with Laplace Kernel-Induced Loss.

Liming Yang, Zhuo Ren, Yidan Wang, Hongwei Dong.

Abstract

This work proposes a robust regression framework with a nonconvex loss function. Two regression formulations are presented based on the Laplace kernel-induced loss (LK-loss). Moreover, we show that the LK-loss function is a good approximation to the zero-norm. However, the nonconvexity of the LK-loss makes it difficult to optimize. A continuous optimization method is developed to solve the proposed framework: the problems are formulated as DC (difference of convex functions) programs, and the corresponding DC algorithms (DCAs) converge linearly. Furthermore, the proposed algorithms are applied directly to determine the hardness of licorice seeds from near-infrared spectral data with noisy input. Experiments in eight spectral regions show that the proposed methods improve generalization compared with traditional support vector regression (SVR), especially in high-frequency regions. Experiments on several benchmark data sets demonstrate that the proposed methods outperform traditional regression methods on most of the data sets considered.


Year:  2017        PMID: 28777723     DOI: 10.1162/neco_a_01002

Source DB:  PubMed          Journal:  Neural Comput        ISSN: 0899-7667            Impact factor:   2.026


  2 in total

1.  On Regularization Based Twin Support Vector Regression with Huber Loss.

Authors:  Umesh Gupta; Deepak Gupta
Journal:  Neural Process Lett       Date:  2021-01-03       Impact factor: 2.908

2.  Relative Entropy of Correct Proximal Policy Optimization Algorithms with Modified Penalty Factor in Complex Environment.

Authors:  Weimin Chen; Kelvin Kian Loong Wong; Sifan Long; Zhili Sun
Journal:  Entropy (Basel)       Date:  2022-03-22       Impact factor: 2.524


Coyote Bioscience (Beijing) Co., Ltd. © 2022-2023.