
Kalman filters improve LSTM network performance in problems unsolvable by traditional recurrent nets.

Juan Antonio Pérez-Ortiz, Felix A Gers, Douglas Eck, Jürgen Schmidhuber.

Abstract

The long short-term memory (LSTM) network trained by gradient descent solves difficult problems that traditional recurrent neural networks in general cannot. We have recently observed that the decoupled extended Kalman filter training algorithm allows for even better performance, significantly reducing the number of training steps compared with the original gradient descent training algorithm. In this paper we present a set of problems that are unsolvable by classical recurrent networks but are solved elegantly, robustly, and quickly by LSTM combined with Kalman filters.
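The decoupled extended Kalman filter (DEKF) mentioned in the abstract partitions the network weights into groups, each with its own small error covariance matrix, so the full Kalman update never forms one large covariance over all weights. The following is a minimal numerical sketch of that per-group update for a single scalar output; it is not the paper's LSTM implementation, and the function name, the linear-model demo, and the constants `R` and `q` are illustrative assumptions.

```python
import numpy as np

def dekf_step(groups, H_parts, err, R=1.0, q=1e-6):
    """One decoupled extended Kalman filter (DEKF) weight update.

    groups:  list of (w_i, P_i) pairs -- weight vector and covariance per group
    H_parts: list of H_i -- Jacobian of the output w.r.t. each group, shape (1, n_i)
    err:     scalar output error (target minus prediction)
    R:       measurement-noise variance; q: small process noise to keep P positive
    """
    # Global innovation variance: each group contributes H_i P_i H_i^T
    S = R + sum(float(H @ P @ H.T) for (_, P), H in zip(groups, H_parts))
    A = 1.0 / S
    updated = []
    for (w, P), H in zip(groups, H_parts):
        K = P @ H.T * A                             # per-group Kalman gain
        w = w + (K * err).ravel()                   # weight update
        P = P - K @ (H @ P) + q * np.eye(len(w))    # covariance update
        updated.append((w, P))
    return updated

# Demo: recover a known linear map (for a linear model the EKF reduces to
# recursive least squares, so convergence here is expected, not LSTM-specific).
rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0, 0.5, 3.0])
groups = [(np.zeros(2), 100.0 * np.eye(2)) for _ in range(2)]  # two weight groups
for _ in range(300):
    x = rng.normal(size=4)
    w_est = np.concatenate([w for w, _ in groups])
    err = float(x @ w_true - x @ w_est)
    H_parts = [x[:2].reshape(1, 2), x[2:].reshape(1, 2)]  # Jacobian = input here
    groups = dekf_step(groups, H_parts, err)
w_est = np.concatenate([w for w, _ in groups])
```

The decoupling trades off some accuracy against cost: ignoring cross-group covariances makes each update cheap enough for online training, which is why the paper reports far fewer training steps than plain gradient descent.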


Year:  2003        PMID: 12628609     DOI: 10.1016/S0893-6080(02)00219-8

Source DB:  PubMed          Journal:  Neural Netw        ISSN: 0893-6080


Related articles: 5 in total

1.  A generalized LSTM-like training algorithm for second-order recurrent neural networks.

Authors:  Derek Monner; James A Reggia
Journal:  Neural Netw       Date:  2011-07-18

2.  Energy Management Strategy Based on a Novel Speed Prediction Method.

Authors:  Jiaming Xing; Liang Chu; Zhuoran Hou; Wen Sun; Yuanjian Zhang
Journal:  Sensors (Basel)       Date:  2021-12-10       Impact factor: 3.576

3.  Application of an improved naive Bayesian analysis for the identification of air leaks in boreholes in coal mines.

Authors:  Hong-Yu Pan; Sui-Nan He; Tian-Jun Zhang; Shuang Song; Kang Wang
Journal:  Sci Rep       Date:  2022-09-27       Impact factor: 4.996

4.  [Review] Recent Applications of Deep Learning Methods on Evolution- and Contact-Based Protein Structure Prediction.

Authors:  Donghyuk Suh; Jai Woo Lee; Sun Choi; Yoonji Lee
Journal:  Int J Mol Sci       Date:  2021-06-02       Impact factor: 5.923

5.  Toward a Unified Sub-symbolic Computational Theory of Cognition.

Authors:  Martin V Butz
Journal:  Front Psychol       Date:  2016-06-21
