
Anisotropic noise injection for input variables relevance determination.

Y Grandvalet.

Abstract

There are two archetypal ways to control the complexity of a flexible regressor: subset selection and ridge regression. In neural-network jargon, they are known, respectively, as pruning and weight decay. These techniques may also be adapted to estimate which features of the input space are relevant for predicting the output variables. Relevance is given by a binary indicator for subset selection, and by a continuous rating for ridge regression. This paper shows how to achieve such a rating for a multilayer perceptron trained with noise (or jitter). Noise injection (NI) is modified in order to heavily penalize irrelevant features. The proposed algorithm is attractive as it requires the tuning of a single parameter. This parameter controls both the complexity of the model (effective number of parameters) and the rating of feature relevances (effective input space dimension). Bounds on the effective number of parameters indicate that the stability of this adaptive scheme is enforced by the constraints applied to the admissible set of relevance indices. The good properties of the algorithm are confirmed by satisfactory experimental results on simulated data sets.
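The link between noise injection and ridge regression that the abstract invokes can be made concrete in the linear least-squares case, where training on input-noise-corrupted data is equivalent in expectation to ridge regression with a per-feature penalty proportional to each feature's noise variance. The sketch below (not the paper's MLP algorithm; all variable names and the toy data are illustrative) injects anisotropic noise, heavier on a feature rated irrelevant, and compares the resulting fit to the analytic per-feature ridge solution:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy data: feature 0 is relevant, feature 1 is pure distractor noise.
n = 5000
X = rng.normal(size=(n, 2))
y = 2.0 * X[:, 0] + 0.1 * rng.normal(size=n)

# Anisotropic noise variances: the feature rated irrelevant is
# corrupted much more heavily, as in the modified NI scheme.
noise_var = np.array([0.01, 1.0])

# Monte Carlo noise injection: fit on many noisy replicates of the data.
reps = 50
Xn = np.concatenate(
    [X + rng.normal(size=X.shape) * np.sqrt(noise_var) for _ in range(reps)]
)
yn = np.tile(y, reps)
w_ni = np.linalg.lstsq(Xn, yn, rcond=None)[0]

# For linear least squares, input-noise injection equals, in expectation,
# ridge regression with penalty n * noise_var[k] on w[k]**2:
#   (X^T X + n * diag(noise_var)) w = X^T y
A = X.T @ X + n * np.diag(noise_var)
w_ridge = np.linalg.solve(A, X.T @ y)

print("noise-injected fit:", w_ni)
print("per-feature ridge: ", w_ridge)
```

The heavily noised feature is shrunk toward zero while the lightly noised one is barely penalized, which is exactly the continuous relevance rating the abstract describes: a single global noise level plus per-feature scaling plays the role of a feature-wise weight decay.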

Year:  2000        PMID: 18249847     DOI: 10.1109/72.883393

Source DB:  PubMed          Journal:  IEEE Trans Neural Netw        ISSN: 1045-9227


  1 in total

1.  Noise-injected neural networks show promise for use on small-sample expression data.

Authors:  Jianping Hua; James Lowey; Zixiang Xiong; Edward R Dougherty
Journal:  BMC Bioinformatics       Date:  2006-05-31       Impact factor: 3.169

