
A kernel-induced space selection approach to model selection in KLDA.

Lei Wang, Kap Luk Chan, Ping Xue, Luping Zhou.

Abstract

Model selection in kernel linear discriminant analysis (KLDA) refers to the selection of appropriate parameters for the kernel function and the regularizer. Following the principle of maximum information preservation, this paper formulates model selection as the problem of selecting an optimal kernel-induced space in which the classes are maximally separated from each other. A scatter-matrix-based criterion is developed to measure the "goodness" of a kernel-induced space, and the kernel parameters are tuned by maximizing this criterion. The criterion is computationally efficient and differentiable with respect to the kernel parameters. Compared with leave-one-out (LOO) or k-fold cross-validation (CV), the proposed approach achieves faster model selection, especially when the number of training samples is large or when many kernel parameters need to be tuned. To tune the regularization parameter in KLDA, the criterion is used together with the method proposed by Saadi (2004). Experiments on benchmark data sets verify the effectiveness of this model selection approach.
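The idea of a scatter-matrix-based separability criterion can be sketched as follows. For data mapped implicitly by a kernel, the traces of the total and within-class scatter matrices in the kernel-induced space can be computed from the kernel matrix alone, and their difference gives the between-class scatter trace. A minimal illustrative sketch (not the paper's exact criterion; the RBF kernel, the trace-ratio form, and the grid search over widths are assumptions for illustration):

```python
import numpy as np

def rbf_kernel(X, sigma):
    # Gaussian (RBF) kernel matrix from pairwise squared distances
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

def scatter_criterion(K, y):
    """Illustrative class-separability measure in the kernel-induced space:
    trace(between-class scatter) / trace(within-class scatter).
    Uses the identities
      tr(S_total)  = sum_i K_ii - (1/n)   sum_ij K_ij
      tr(S_within) = sum over classes c of [sum_{i in c} K_ii - (1/n_c) sum_{i,j in c} K_ij]
    so only the kernel matrix K and labels y are needed."""
    n = len(y)
    tr_total = np.trace(K) - K.sum() / n
    tr_within = 0.0
    for c in np.unique(y):
        idx = (y == c)
        Kc = K[np.ix_(idx, idx)]
        tr_within += np.trace(Kc) - Kc.sum() / idx.sum()
    tr_between = tr_total - tr_within
    return tr_between / max(tr_within, 1e-12)

def select_sigma(X, y, sigmas):
    # Pick the kernel width maximizing the separability criterion
    # (a grid search stand-in for the paper's gradient-based tuning)
    scores = [scatter_criterion(rbf_kernel(X, s), y) for s in sigmas]
    return sigmas[int(np.argmax(scores))], scores
```

Because the criterion is a smooth function of the kernel parameters, the paper's gradient-based maximization replaces the grid search shown here; each evaluation costs only one kernel-matrix computation, which is the source of the speed advantage over LOO or k-fold CV.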


Year:  2008        PMID: 19054735     DOI: 10.1109/TNN.2008.2005140

Source DB:  PubMed          Journal:  IEEE Trans Neural Netw        ISSN: 1045-9227


  2 in total

1.  Kernel optimization in discriminant analysis.

Authors:  Di You; Onur C Hamsici; Aleix M Martinez
Journal:  IEEE Trans Pattern Anal Mach Intell       Date:  2011-03       Impact factor: 6.226

2.  A Classification Algorithm by Combination of Feature Decomposition and Kernel Discriminant Analysis (KDA) for Automatic MR Brain Image Classification and AD Diagnosis.

Authors:  Farzaneh Elahifasaee; Fan Li; Ming Yang
Journal:  Comput Math Methods Med       Date:  2019-12-30       Impact factor: 2.238

