Ao Li, Deyun Chen, Zhiqiang Wu, Guanglu Sun, Kezheng Lin.
Abstract
Recently, sparse representation, which relies on the underlying assumption that samples can be sparsely represented by their labeled neighbors, has been applied with great success to image classification problems. In sparse representation-based classification (SRC), the label is assigned by minimizing the residual between the sample and its synthetic version reconstructed with class-specific coding, which makes the coding scheme the most significant factor for classification accuracy. However, conventional SRC-based coding schemes ignore dependency among the samples, which leads to an undesired result: similar samples may be coded into different categories due to quantization sensitivity. To address this problem, in this paper, a novel approach based on self-supervised sparse representation is proposed for image classification. In the proposed approach, the manifold structure of the samples is first exploited with low-rank representation. Next, the low-rank representation matrix is used to characterize the similarity of samples in order to establish a self-supervised sparse coding model, which aims to preserve the local structure of the codings for similar samples. Finally, a numerical algorithm based on the alternating direction method of multipliers (ADMM) is developed to obtain an approximate solution. Experiments on several publicly available datasets validate the effectiveness and efficiency of our proposed approach compared with existing state-of-the-art methods.
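The SRC decision rule described in the abstract (assign the label with minimum residual between the sample and its class-specific synthetic version) can be sketched as follows. This is a minimal illustration, not the paper's implementation: `src_classify`, `D`, and `coeffs` are hypothetical names, and the coefficient vector is assumed to have been obtained beforehand by sparse coding over the full dictionary.

```python
import numpy as np

def src_classify(x, D, labels, coeffs):
    """SRC decision rule: choose the class whose dictionary atoms,
    weighted by the class-specific part of the coding, best
    reconstruct the sample x."""
    best_class, best_res = None, np.inf
    for c in np.unique(labels):
        mask = labels == c
        x_hat = D[:, mask] @ coeffs[mask]   # class-c synthetic version of x
        res = np.linalg.norm(x - x_hat)     # class-specific residual
        if res < best_res:
            best_class, best_res = c, res
    return best_class
```

With a toy dictionary whose atoms are labeled per column, the sample is assigned to the class whose atoms carry its coding energy.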
Year: 2018 PMID: 29924830 PMCID: PMC6010279 DOI: 10.1371/journal.pone.0199141
Source DB: PubMed Journal: PLoS One ISSN: 1932-6203 Impact factor: 3.240
Algorithm 1. Algorithm for solving the objective function in Eq (6). Initial: dataset.
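The abstract states that the objective is solved with ADMM, but Eq (6) itself is not reproduced in this record. As a hedged illustration of the ADMM splitting pattern such solvers follow, here is a standard ADMM solver for the lasso problem (min ½‖Ax − b‖² + λ‖z‖₁ s.t. x = z); the paper's actual subproblems differ, and `admm_lasso`, `rho`, and `n_iter` are illustrative names and parameters.

```python
import numpy as np

def soft_threshold(v, k):
    """Proximal operator of the l1 norm (elementwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def admm_lasso(A, b, lam=0.1, rho=1.0, n_iter=200):
    """ADMM for lasso: alternate a ridge-like x-update, a
    soft-thresholding z-update, and a dual (u) update."""
    m, n = A.shape
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
    AtA = A.T @ A + rho * np.eye(n)   # factor once; reused every iteration
    Atb = A.T @ b
    for _ in range(n_iter):
        x = np.linalg.solve(AtA, Atb + rho * (z - u))  # quadratic subproblem
        z = soft_threshold(x + u, lam / rho)           # l1 subproblem
        u = u + x - z                                  # scaled dual ascent
    return z
```

For an identity design matrix the lasso solution is the soft-thresholded observation, which makes the solver easy to sanity-check.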
Algorithm 2. Overall algorithm for the proposed coding scheme. Initial: test dataset.
Fig 1. Classification accuracy of comparison methods on Extended YaleB.
Fig 5. Classification accuracy of comparison methods on USPS.
Fig 6. The coding matrices of Extended YaleB with different coding schemes.
Top row: visualization of the coding matrices of SRC (left) and LLC (right). Bottom row: visualization of the coding matrices of ProCR (left) and our proposed method (right). The visualization results are obtained from the experiments with 50 percent of the training samples.
Fig 7. Visualization results on Extended YaleB.
The original face samples are shown on the left, the synthesized face samples in the middle, and the residual errors on the right.
Fig 8. The convergence curve on Extended YaleB.