
Using additive noise in back-propagation training.

L Holmstrom, P Koistinen.

Abstract

The possibility of improving the generalization capability of a neural network by introducing additive noise to the training samples is discussed. The network considered is a feedforward layered neural network trained with the back-propagation algorithm. Back-propagation training is viewed as nonlinear least-squares regression and the additive noise is interpreted as generating a kernel estimate of the probability density that describes the training vector distribution. Two specific application types are considered: pattern classifier networks and estimation of a nonstochastic mapping from data corrupted by measurement errors. It is not proved that the introduction of additive noise to the training vectors always improves network generalization. However, the analysis suggests mathematically justified rules for choosing the characteristics of noise if additive noise is used in training. Results of mathematical statistics are used to establish various asymptotic consistency results for the proposed method. Numerical simulations support the applicability of the training method.
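The training scheme described in the abstract can be illustrated with a small sketch: a one-hidden-layer feedforward network fit by backpropagation as nonlinear least squares, where fresh additive Gaussian noise is drawn for the training inputs at every epoch. This is an illustrative reconstruction, not the authors' code; the network size, learning rate, and the noise standard deviation `h` (which plays the role of the kernel bandwidth in the paper's density-estimate interpretation) are assumed values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression problem: learn the nonstochastic mapping y = sin(x).
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X)

# Assumed hyperparameters (not from the paper): hidden width, step size,
# and noise std h, i.e. the kernel bandwidth in the paper's view.
n_hidden, lr, h = 16, 0.1, 0.1
W1 = rng.normal(0, 0.5, (1, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.5, (n_hidden, 1)); b2 = np.zeros(1)

def forward(X):
    H = np.tanh(X @ W1 + b1)          # hidden layer activations
    return H, H @ W2 + b2             # linear output layer

for epoch in range(3000):
    Xn = X + rng.normal(0, h, X.shape)  # additive noise on training vectors
    H, out = forward(Xn)
    err = out - y                       # least-squares residual
    # Backpropagated gradients of the mean squared error.
    gW2 = H.T @ err / len(X); gb2 = err.mean(0)
    dH = (err @ W2.T) * (1 - H**2)      # tanh'(a) = 1 - tanh(a)^2
    gW1 = Xn.T @ dH / len(X); gb1 = dH.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

# Evaluate on the clean (noise-free) inputs.
_, pred = forward(X)
mse = float(np.mean((pred - y) ** 2))
```

Resampling the noise each epoch means the network never sees the same corrupted sample twice, which is what connects the procedure to smoothing the empirical training distribution with a Gaussian kernel of bandwidth `h`.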

Year:  1992        PMID: 18276403     DOI: 10.1109/72.105415

Source DB:  PubMed          Journal:  IEEE Trans Neural Netw        ISSN: 1045-9227


Related articles: 8 in total

1.  Variable Selection in Kernel Regression Using Measurement Error Selection Likelihoods.

Authors:  Kyle R White; Leonard A Stefanski; Yichao Wu
Journal:  J Am Stat Assoc       Date:  2017-07-19       Impact factor: 5.033

2.  Noise injection for training artificial neural networks: a comparison with weight decay and early stopping.

Authors:  Richard M Zur; Yulei Jiang; Lorenzo L Pesce; Karen Drukker
Journal:  Med Phys       Date:  2009-10       Impact factor: 4.071

3.  An auditory localization model based on high-frequency spectral cues.

Authors:  D Nandy; J Ben-Arie
Journal:  Ann Biomed Eng       Date:  1996 Nov-Dec       Impact factor: 3.934

4.  Multiobjective optimization for model selection in kernel methods in regression.

Authors:  Di You; Carlos Fabian Benitez-Quiroz; Aleix M Martinez
Journal:  IEEE Trans Neural Netw Learn Syst       Date:  2014-10       Impact factor: 10.451

5.  Comprehensive Analysis of Multiple Cohort Datasets Deciphers the Utility of Germline Single-Nucleotide Polymorphisms in Prostate Cancer Diagnosis.

Authors:  Wensheng Zhang; Yan Dong; Oliver Sartor; Kun Zhang
Journal:  Cancer Prev Res (Phila)       Date:  2021-04-17

6.  Automated in-silico detection of cell populations in flow cytometry readouts and its application to leukemia disease monitoring.

Authors:  Joern Toedling; Peter Rhein; Richard Ratei; Leonid Karawajew; Rainer Spang
Journal:  BMC Bioinformatics       Date:  2006-06-05       Impact factor: 3.169

7.  Noise-injected neural networks show promise for use on small-sample expression data.

Authors:  Jianping Hua; James Lowey; Zixiang Xiong; Edward R Dougherty
Journal:  BMC Bioinformatics       Date:  2006-05-31       Impact factor: 3.169

8.  Data augmentation based malware detection using convolutional neural networks.

Authors:  Ferhat Ozgur Catak; Javed Ahmed; Kevser Sahinbas; Zahid Hussain Khand
Journal:  PeerJ Comput Sci       Date:  2021-01-22
