
Neural Machine Translation with Deep Attention.

Biao Zhang, Deyi Xiong, Jinsong Su.   

Abstract

Deepening neural models has proven very successful in increasing model capacity for complex learning tasks such as machine translation. Previous efforts on deep neural machine translation have focused mainly on the encoder and the decoder, with little work on the attention mechanism. However, the attention mechanism is vital for inducing the translation correspondence between languages, a task for which shallow neural networks are relatively insufficient, especially when the encoder and decoder are deep. In this paper, we propose a deep attention model (DeepAtt). Based on low-level attention information, DeepAtt automatically determines what should be passed or suppressed from the corresponding encoder layer so as to make the distributed representation appropriate for high-level attention and translation. We conduct experiments on the NIST Chinese-English, WMT English-German, and WMT English-French translation tasks, where, with five attention layers, DeepAtt yields very competitive performance against state-of-the-art results. We empirically find that an adequate increase in the number of attention layers leads DeepAtt to produce more accurate attention weights. An in-depth analysis of the translation of important context words further reveals that DeepAtt significantly improves the faithfulness of system translations.
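To make the abstract's description more concrete, below is a minimal sketch of stacked ("deep") attention with a per-layer pass/suppress gate, assuming each attention layer attends over one encoder layer's states and a sigmoid gate blends the new context with the context from the layer below. This is not the authors' reference implementation; the class names, gating form, and dimensions are illustrative assumptions drawn only from the abstract.

```python
# Hypothetical sketch of deep (stacked) attention with a pass/suppress gate.
# Not the paper's reference implementation; names and gating form are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GatedAttentionLayer(nn.Module):
    """One attention layer: attends over one encoder layer's states and gates
    the result against the attention context coming from the layer below."""

    def __init__(self, d_model: int):
        super().__init__()
        self.query_proj = nn.Linear(d_model, d_model)
        # Gate decides, per dimension, what to pass and what to suppress.
        self.gate = nn.Linear(2 * d_model, d_model)

    def forward(self, query, enc_states, lower_context):
        # query:         (batch, d_model)          decoder state
        # enc_states:    (batch, src_len, d_model) one encoder layer's outputs
        # lower_context: (batch, d_model)          context from the layer below
        q = self.query_proj(query + lower_context)  # condition on low-level attention
        scores = torch.bmm(enc_states, q.unsqueeze(2)).squeeze(2)   # (batch, src_len)
        weights = F.softmax(scores, dim=-1)
        context = torch.bmm(weights.unsqueeze(1), enc_states).squeeze(1)
        g = torch.sigmoid(self.gate(torch.cat([context, lower_context], dim=-1)))
        # Pass or suppress: blend the new context with the lower-level one.
        return g * context + (1.0 - g) * lower_context, weights


class DeepAttention(nn.Module):
    """Stack of gated attention layers, one per encoder layer."""

    def __init__(self, d_model: int, num_layers: int = 5):
        super().__init__()
        self.layers = nn.ModuleList(
            [GatedAttentionLayer(d_model) for _ in range(num_layers)]
        )

    def forward(self, query, enc_layer_states):
        # enc_layer_states: list of (batch, src_len, d_model), one per encoder layer
        context = torch.zeros_like(query)
        all_weights = []
        for layer, enc_states in zip(self.layers, enc_layer_states):
            context, weights = layer(query, enc_states, context)
            all_weights.append(weights)
        return context, all_weights


if __name__ == "__main__":
    batch, src_len, d_model, n_layers = 2, 7, 16, 5
    deep_att = DeepAttention(d_model, n_layers)
    dec_state = torch.randn(batch, d_model)
    enc_stack = [torch.randn(batch, src_len, d_model) for _ in range(n_layers)]
    ctx, weights = deep_att(dec_state, enc_stack)
    print(ctx.shape, len(weights), weights[0].shape)  # (2, 16), 5, (2, 7)
```

In this sketch the gate is the part that "determines what should be passed or suppressed": when g is near zero the lower-level attention context is carried upward unchanged, and when it is near one the layer's own context replaces it.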

Year:  2018        PMID: 30334781     DOI: 10.1109/TPAMI.2018.2876404

Source DB:  PubMed          Journal:  IEEE Trans Pattern Anal Mach Intell        ISSN: 0162-8828            Impact factor:   6.226


  4 in total

1.  Frame-wise detection of surgeon stress levels during laparoscopic training using kinematic data.

Authors:  Yi Zheng; Grey Leonard; Herbert Zeh; Ann Majewicz Fey
Journal:  Int J Comput Assist Radiol Surg       Date:  2022-02-12       Impact factor: 2.924

2.  CAP-YOLO: Channel Attention Based Pruning YOLO for Coal Mine Real-Time Intelligent Monitoring.

Authors:  Zhi Xu; Jingzhao Li; Yifan Meng; Xiaoming Zhang
Journal:  Sensors (Basel)       Date:  2022-06-08       Impact factor: 3.847

3.  Beyond the Transformer: A Novel Polynomial Inherent Attention (PIA) Model and Its Great Impact on Neural Machine Translation.

Authors:  Mohammed ELAffendi; Khawlah Alrajhi
Journal:  Comput Intell Neurosci       Date:  2022-09-21

4.  An Attention Mechanism Oriented Hybrid CNN-RNN Deep Learning Architecture of Container Terminal Liner Handling Conditions Prediction.

Authors:  Bin Li; Yuqing He
Journal:  Comput Intell Neurosci       Date:  2021-07-08
