
Gated Orthogonal Recurrent Units: On Learning to Forget.

Li Jing, Caglar Gulcehre, John Peurifoy, Yichen Shen, Max Tegmark, Marin Soljacic, Yoshua Bengio.

Abstract

We present a novel recurrent neural network (RNN)-based model that combines the remembering ability of unitary evolution RNNs with the ability of gated RNNs to effectively forget redundant or irrelevant information in their memory. We achieve this by extending restricted orthogonal evolution RNNs with a gating mechanism similar to gated recurrent unit RNNs, with a reset gate and an update gate. Our model is able to outperform long short-term memory, gated recurrent units, and vanilla unitary or orthogonal RNNs on several long-term-dependency benchmark tasks. We empirically show that both orthogonal and unitary RNNs lack the ability to forget, which plays an important role in RNNs. We provide competitive results along with an analysis of our model on many natural sequential tasks, including question answering, speech spectrum prediction, character-level language modeling, and synthetic tasks that involve long-term dependencies such as algorithmic, denoising, and copying tasks.
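The core idea in the abstract — a GRU-style update and reset gate wrapped around an orthogonal (norm-preserving) recurrent transition — can be sketched as follows. This is a minimal illustrative NumPy sketch, not the authors' implementation: all parameter names and shapes are assumptions, the orthogonal matrix is simply drawn at random via QR (the paper parameterizes and trains it), and `tanh` stands in for whatever nonlinearity the paper actually uses.

```python
import numpy as np

def random_orthogonal(n, rng):
    # Random orthogonal matrix via QR decomposition; a stand-in for the
    # trained orthogonal recurrent matrix described in the paper.
    q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    return q

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GORUCellSketch:
    """Hypothetical minimal GORU-style cell (names and update form are
    assumptions based on the abstract, not the paper's exact equations)."""

    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(hidden_size)
        # GRU-style gate parameters (freely trainable in a real model)
        self.Wz = rng.uniform(-s, s, (hidden_size, input_size))
        self.Uz = rng.uniform(-s, s, (hidden_size, hidden_size))
        self.Wr = rng.uniform(-s, s, (hidden_size, input_size))
        self.Ur = rng.uniform(-s, s, (hidden_size, hidden_size))
        # Candidate-state parameters; U is orthogonal, so the recurrent
        # transition preserves the norm of the (reset-gated) hidden state.
        self.Wh = rng.uniform(-s, s, (hidden_size, input_size))
        self.U = random_orthogonal(hidden_size, rng)

    def step(self, x, h):
        z = sigmoid(self.Wz @ x + self.Uz @ h)  # update gate: how much to keep
        r = sigmoid(self.Wr @ x + self.Ur @ h)  # reset gate: what to forget
        # Orthogonal transition applied to the reset-gated state
        h_tilde = np.tanh(self.Wh @ x + self.U @ (r * h))
        # GRU-style convex combination of old state and candidate
        return z * h + (1.0 - z) * h_tilde
```

The gates give the cell the "learning to forget" behavior the abstract attributes to gated RNNs, while the orthogonal `U` supplies the long-memory, gradient-preserving dynamics of unitary/orthogonal RNNs.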


Year: 2019    PMID: 30764742    DOI: 10.1162/neco_a_01174

Source DB: PubMed    Journal: Neural Comput    ISSN: 0899-7667    Impact factor: 2.026


  2 in total

1.  A bio-inspired bistable recurrent cell allows for long-lasting memory.

Authors:  Nicolas Vecoven; Damien Ernst; Guillaume Drion
Journal:  PLoS One       Date:  2021-06-08       Impact factor: 3.240

2.  Predicting MHC I restricted T cell epitopes in mice with NAP-CNB, a novel online tool.

Authors:  Carlos Wert-Carvajal; Rubén Sánchez-García; José R Macías; Rebeca Sanz-Pamplona; Almudena Méndez Pérez; Ramon Alemany; Esteban Veiga; Carlos Óscar S Sorzano; Arrate Muñoz-Barrutia
Journal:  Sci Rep       Date:  2021-05-24       Impact factor: 4.379

