
Universal approximation depth and errors of narrow belief networks with discrete units.

Guido F Montúfar

Abstract

We generalize recent theoretical work on the minimal number of layers of narrow deep belief networks that can approximate any probability distribution on the states of their visible units arbitrarily well. We relax the setting of binary units (Sutskever & Hinton, 2008; Le Roux & Bengio, 2008, 2010; Montúfar & Ay, 2011) to units with arbitrary finite state spaces, and the vanishing approximation error to an arbitrary approximation error tolerance. For example, we show that a q-ary deep belief network with L ≥ 2 + (q^⌈m−δ⌉ − 1)/(q − 1) layers of width n ≤ m + log_q(m) + 1, for some m ∈ ℕ, can approximate any probability distribution on {0, 1, …, q−1}^n without exceeding a Kullback-Leibler divergence of δ. Our analysis covers discrete restricted Boltzmann machines and naive Bayes models as special cases.
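The abstract's bound can be evaluated numerically. The sketch below, a minimal illustration rather than anything from the paper itself (the function names `min_depth` and `max_width` are ours), computes the layer bound L ≥ 2 + (q^⌈m−δ⌉ − 1)/(q − 1) and the width bound n ≤ m + log_q(m) + 1 for given unit arity q, parameter m, and KL tolerance δ:

```python
import math

def min_depth(q: int, m: int, delta: float) -> int:
    """Smallest integer L satisfying L >= 2 + (q^ceil(m - delta) - 1)/(q - 1).

    The numerator q^ceil(m - delta) - 1 is always divisible by q - 1
    (geometric series), so integer division is exact.
    """
    return 2 + (q ** math.ceil(m - delta) - 1) // (q - 1)

def max_width(q: int, m: int) -> float:
    """Upper bound m + log_q(m) + 1 on the layer width n."""
    return m + math.log(m, q) + 1

# Binary units (q = 2), m = 3, vanishing error (delta = 0):
# L >= 2 + (2^3 - 1)/(2 - 1) = 9 layers suffice.
print(min_depth(2, 3, 0.0))   # 9
print(max_width(2, 3))         # ~5.58, so width n <= 5
```

Note how the tolerance enters only through ⌈m − δ⌉: allowing a KL divergence of δ ≥ 1 reduces the exponent, and hence the required depth, roughly by a factor of q per unit of tolerance.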


Year:  2014        PMID: 24708370     DOI: 10.1162/NECO_a_00601

Source DB:  PubMed          Journal:  Neural Comput        ISSN: 0899-7667            Impact factor:   2.026


  2 in total

1.  Adaptation to criticality through organizational invariance in embodied agents.

Authors:  Miguel Aguilera; Manuel G Bedia
Journal:  Sci Rep       Date:  2018-05-16       Impact factor: 4.379

2.  Exploring Criticality as a Generic Adaptive Mechanism.

Authors:  Miguel Aguilera; Manuel G Bedia
Journal:  Front Neurorobot       Date:  2018-10-02       Impact factor: 2.650

