
Taking on the curse of dimensionality in joint distributions using neural networks.

S Bengio, Y Bengio.

Abstract

The curse of dimensionality is severe when modeling high-dimensional discrete data: the number of possible combinations of the variables explodes exponentially. In this paper, we propose a new architecture for modeling high-dimensional data that requires resources (parameters and computations) that grow at most as the square of the number of variables, using a multilayer neural network to represent the joint distribution of the variables as the product of conditional distributions. The neural network can be interpreted as a graphical model without hidden random variables, but in which the conditional distributions are tied through the hidden units. The connectivity of the neural network can be pruned by using dependency tests between the variables (thus reducing significantly the number of parameters). Experiments on modeling the distribution of several discrete data sets show statistically significant improvements over other methods such as naive Bayes and comparable Bayesian networks and show that significant improvements can be obtained by pruning the network.
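The abstract's core idea can be illustrated with a minimal NumPy sketch: the joint distribution over n binary variables is written as a product of conditionals p(x_i | x_1..x_{i-1}), each read from one shared multilayer network whose hidden units tie the conditionals together, so parameters grow only as O(n²) rather than O(2^n). This is an illustrative toy with made-up weight names (W, V, b, c), not the paper's exact architecture or pruning scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 5  # number of binary variables
H = 8  # hidden units shared ("tied") across all conditionals

# Shared parameters: storage and computation grow at most as O(n^2) in n,
# instead of the O(2^n) table a full joint distribution would need.
W = rng.normal(scale=0.1, size=(H, n))  # input -> shared hidden layer
b = np.zeros(H)
V = rng.normal(scale=0.1, size=(n, H))  # hidden -> one output per variable
c = np.zeros(n)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def joint_prob(x):
    """p(x) = prod_i p(x_i | x_1..x_{i-1}): each conditional is computed by
    the same network, with later variables masked out to keep the
    autoregressive ordering valid."""
    p = 1.0
    for i in range(n):
        mask = np.zeros(n)
        mask[:i] = 1.0                    # only earlier variables are visible
        h = np.tanh(W @ (x * mask) + b)   # tied hidden layer
        p_i = sigmoid(V[i] @ h + c[i])    # p(x_i = 1 | x_<i)
        p *= p_i if x[i] == 1 else (1.0 - p_i)
    return p

# Sanity check: the 2^n configuration probabilities must sum to 1,
# because each factor is a properly normalized Bernoulli conditional.
total = sum(joint_prob(np.array(bits)) for bits in np.ndindex(*([2] * n)))
print(round(total, 6))  # → 1.0
```

The sum-to-one check holds for any weight values, which is the practical appeal of the product-of-conditionals decomposition: normalization is free, unlike in undirected graphical models.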

Year:  2000        PMID: 18249784     DOI: 10.1109/72.846725

Source DB:  PubMed          Journal:  IEEE Trans Neural Netw        ISSN: 1045-9227


Related articles: 2 in total

1.  A novel single neuron perceptron with universal approximation and XOR computation properties.

Authors:  Ehsan Lotfi; M-R Akbarzadeh-T
Journal:  Comput Intell Neurosci       Date:  2014-04-28

2.  Cascaded neural networks improving fish species prediction accuracy: the role of the biotic information.

Authors:  Simone Franceschini; Emanuele Gandola; Marco Martinoli; Lorenzo Tancioni; Michele Scardi
Journal:  Sci Rep       Date:  2018-03-15       Impact factor: 4.379

