Literature DB >> 18249779

K-nearest neighbors directed noise injection in multilayer perceptron training.

M Skurichina, S Raudys, R W Duin.

Abstract

The relation between classifier complexity and learning set size is very important in discriminant analysis. One way to overcome the complexity control problem is to add noise to the training objects, thereby increasing the size of the training set. Both the amount and the direction of the injected noise are important factors that determine the effectiveness of classifier training. In this paper, the effect of injecting Gaussian spherical noise and k-nearest neighbors directed noise on the performance of multilayer perceptrons is studied. As an analytical investigation is not feasible for multilayer perceptrons, a theoretical analysis is made for statistical classifiers, with the goal of better understanding the effect of noise injection on the accuracy of sample-based classifiers. Through both empirical and theoretical studies, it is shown that k-nearest neighbors directed noise injection is preferable to Gaussian spherical noise injection for data with low intrinsic dimensionality.
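The two augmentation schemes contrasted in the abstract can be illustrated with a short sketch. This is a minimal NumPy illustration, not the authors' implementation: the function names (`gaussian_spherical_noise`, `knn_directed_noise`) and the parameters `sigma`, `k`, and `scale` are assumptions chosen for clarity. Spherical injection perturbs a sample isotropically, while k-NN directed injection places new points along segments toward a sample's nearest neighbors, which keeps them near a low-dimensional data manifold.

```python
import numpy as np

def gaussian_spherical_noise(X, n_new, sigma=0.1, rng=None):
    """Augment X with isotropic Gaussian noise around randomly chosen samples."""
    rng = np.random.default_rng(rng)
    idx = rng.integers(0, len(X), size=n_new)          # base samples to perturb
    return X[idx] + rng.normal(0.0, sigma, size=(n_new, X.shape[1]))

def knn_directed_noise(X, n_new, k=2, scale=0.5, rng=None):
    """Augment X with noise directed toward each sample's k nearest neighbors.

    New points lie on segments between a sample and one of its neighbors,
    so they stay close to the data's intrinsic (possibly low-dimensional)
    structure rather than spreading isotropically.
    """
    rng = np.random.default_rng(rng)
    # Pairwise squared distances; exclude self-matches on the diagonal.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    np.fill_diagonal(d2, np.inf)
    nn = np.argsort(d2, axis=1)[:, :k]                 # k nearest neighbor indices
    idx = rng.integers(0, len(X), size=n_new)          # base samples
    j = nn[idx, rng.integers(0, k, size=n_new)]        # one random neighbor each
    lam = rng.uniform(0.0, scale, size=(n_new, 1))     # step fraction toward it
    return X[idx] + lam * (X[j] - X[idx])
```

For data concentrated near a low-dimensional subspace, the directed variant generates augmented points that remain on that subspace, which is consistent with the paper's conclusion that it outperforms spherical injection in that regime.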

Year:  2000        PMID: 18249779     DOI: 10.1109/72.839019

Source DB:  PubMed          Journal:  IEEE Trans Neural Netw        ISSN: 1045-9227


  1 in total

1.  Noise-injected neural networks show promise for use on small-sample expression data.

Authors:  Jianping Hua; James Lowey; Zixiang Xiong; Edward R Dougherty
Journal:  BMC Bioinformatics       Date:  2006-05-31       Impact factor: 3.169

