Scaling up graph-based semisupervised learning via prototype vector machines.

Kai Zhang, Liang Lan, James T Kwok, Slobodan Vucetic, Bahram Parvin.   

Abstract

When the amount of labeled data is limited, semisupervised learning can improve the learner's performance by also using the often easily available unlabeled data. In particular, a popular approach requires the learned function to be smooth on the underlying data manifold. By approximating this manifold as a weighted graph, such graph-based techniques can often achieve state-of-the-art performance. However, their high time and space complexities make them less attractive on large data sets. In this paper, we propose to scale up graph-based semisupervised learning using a set of sparse prototypes derived from the data. These prototypes serve as a small set of data representatives, which can be used to approximate the graph-based regularizer and to control model complexity. Consequently, both training and testing become much more efficient. Moreover, when the Gaussian kernel is used to define the graph affinity, a simple and principled method to select the prototypes can be obtained. Experiments on a number of real-world data sets demonstrate encouraging performance and scaling properties of the proposed approach. It also compares favorably with models learned via l1-regularization at the same level of model sparsity. These results demonstrate the efficacy of the proposed approach in producing highly parsimonious and accurate models for semisupervised learning.
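The graph-based regularization that the paper scales up can be illustrated with a minimal sketch. The snippet below is an assumption-laden toy version of the standard setup, not the authors' prototype-based algorithm: it builds a Gaussian-kernel affinity matrix over all points, forms the unnormalized graph Laplacian, and solves the regularized least-squares problem f = argmin ||J(f - Y)||^2 + lam * f^T L f in closed form. Function names and parameters (`gaussian_affinity`, `graph_ssl`, `sigma`, `lam`) are illustrative.

```python
import numpy as np

def gaussian_affinity(X, sigma=1.0):
    # Pairwise Gaussian-kernel weights W_ij = exp(-||x_i - x_j||^2 / (2 sigma^2)),
    # the graph affinity the abstract refers to.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-np.maximum(d2, 0.0) / (2.0 * sigma ** 2))

def graph_ssl(X, y, labeled_mask, sigma=1.0, lam=1.0):
    """Toy graph-based semisupervised classifier.

    Solves f = argmin ||J (f - Y)||^2 + lam * f^T L f, where J selects the
    labeled rows and L is the unnormalized graph Laplacian; closed form is
    f = (J + lam * L)^{-1} J Y. Returns hard labels via argmax over classes.
    """
    n = X.shape[0]
    W = gaussian_affinity(X, sigma)
    L = np.diag(W.sum(axis=1)) - W            # graph Laplacian D - W
    J = np.diag(labeled_mask.astype(float))   # indicator of labeled examples
    classes = np.unique(y[labeled_mask])
    Y = np.zeros((n, classes.size))           # one-hot targets on labeled rows
    for k, c in enumerate(classes):
        Y[labeled_mask & (y == c), k] = 1.0
    F = np.linalg.solve(J + lam * L, J @ Y)   # smooth soft labels over the graph
    return classes[np.argmax(F, axis=1)]
```

This full-graph solve costs O(n^3) time and O(n^2) memory, which is exactly the scalability problem the paper addresses: its prototype vectors replace the n-by-n affinity with a low-rank approximation through a small set of representatives, making both training and prediction far cheaper.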

Year: 2015    PMID: 25720002    PMCID: PMC4347954    DOI: 10.1109/TNNLS.2014.2315526

Source DB: PubMed    Journal: IEEE Trans Neural Netw Learn Syst    ISSN: 2162-237X    Impact factor: 10.451


