
Artificial neural networks as approximators of stochastic processes.

M. R. Belli, M. Conti, P. Crippa, C. Turchetti.

Abstract

Artificial Neural Networks (ANNs) must be able to learn by experience from their environment. This property is closely related to the approximating capabilities of the networks. Unfortunately, at present only the ability of ANNs to approximate deterministic input-output mappings has been exploited. In this article, classes of neural networks, named Stochastic Neural Networks, that are capable of approximating stochastic processes are defined. As stochastic processes may also be viewed as random functions, they include deterministic (non-random) functions as a particular case; the class of Stochastic Neural Networks can therefore be considered a generalisation of the usually defined neural networks. From an application point of view, such a class of networks is more closely matched to the real world, in which neural networks must operate in an environment that is essentially stochastic. The theory presented in the article is developed starting from the so-called "canonical representation" of non-stationary stochastic processes. Finally, an application example showing in detail the validity of the proposed approach is reported.
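The paper's specific canonical representation is not reproduced in this abstract. As a loose, hypothetical illustration of the underlying idea, a non-stationary process can be expanded as a sum of deterministic basis functions weighted by random coefficients (a Karhunen-Loève-style expansion); truncating that sum gives exactly the kind of approximation a network could learn. The sketch below estimates such an expansion empirically from sample paths of Brownian motion; all names and parameter choices here are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_paths, n_t = 500, 100

# Brownian motion: a simple non-stationary stochastic process,
# simulated as a cumulative sum of independent Gaussian increments.
paths = np.cumsum(
    rng.normal(scale=np.sqrt(1.0 / n_t), size=(n_paths, n_t)), axis=1
)

# Empirical canonical (Karhunen-Loeve) expansion: eigendecompose the
# sample covariance to obtain deterministic basis functions phi_k(t);
# projecting each path onto them yields uncorrelated random
# coefficients z_k, so x(t) ~ sum_k z_k * phi_k(t).
cov = np.cov(paths, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]          # sort by decreasing variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

k = 5                                      # truncate the expansion to k terms
z = paths @ eigvecs[:, :k]                 # random coefficients per path
recon = z @ eigvecs[:, :k].T               # truncated reconstruction

err = np.mean((paths - recon) ** 2) / np.mean(paths ** 2)
print(f"relative MSE with {k} terms: {err:.4f}")
```

For Brownian motion the eigenvalues decay rapidly, so a handful of terms already captures most of the process's variance; replacing the fixed basis with learned network outputs is, in spirit, what an approximator of stochastic processes does.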

Year:  1999        PMID: 12662675     DOI: 10.1016/s0893-6080(99)00017-9

Source DB:  PubMed          Journal:  Neural Netw        ISSN: 0893-6080


  1 in total

1.  Deep Residual Learning for Nonlinear Regression.

Authors:  Dongwei Chen; Fei Hu; Guokui Nian; Tiantian Yang
Journal:  Entropy (Basel)       Date:  2020-02-07       Impact factor: 2.524

