IEICE TRANSACTIONS on Information

Document-Level Neural Machine Translation with Associated Memory Network

Shu JIANG, Rui WANG, Zuchao LI, Masao UTIYAMA, Kehai CHEN, Eiichiro SUMITA, Hai ZHAO, Bao-liang LU

Summary:

Standard neural machine translation (NMT) assumes that sentences are translated independently of their document-level context. Most existing document-level NMT approaches rely on only a coarse sense of global document-level information, whereas this work exploits detailed document-level context through a memory network. The memory network's capacity to detect the parts of the document most relevant to the current sentence offers a natural way to model rich document-level context. In this work, the proposed document-aware memory network is integrated into a Transformer NMT baseline. Experiments on several tasks show that the proposed method significantly improves translation performance over strong Transformer baselines and other related studies.
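The core mechanism the summary describes, retrieving the memory slots most relevant to the current sentence, can be illustrated with a minimal soft-attention read. This is an illustrative sketch only, not the paper's exact model: the function name `attention_read`, the toy vectors, and the dot-product scoring are all assumptions for demonstration.

```python
import math

def attention_read(query, memory):
    """Soft attention read over a memory of sentence vectors:
    score each slot by dot-product similarity to the query,
    normalize with softmax, and return the weighted sum as
    a document-context vector (illustrative sketch)."""
    scores = [sum(q * m for q, m in zip(query, slot)) for slot in memory]
    peak = max(scores)
    exps = [math.exp(s - peak) for s in scores]  # stable softmax
    z = sum(exps)
    weights = [e / z for e in exps]
    dim = len(query)
    context = [
        sum(weights[i] * memory[i][d] for i in range(len(memory)))
        for d in range(dim)
    ]
    return weights, context

# Toy example: memory holds vectors for other sentences of the document;
# the query represents the sentence currently being translated.
memory = [[1.0, 0.0], [0.0, 1.0], [0.9, 0.1]]
query = [1.0, 0.0]
weights, context = attention_read(query, memory)
# Slots most similar to the query dominate the resulting context vector.
```

In a document-aware NMT model, such a context vector would be fused with the sentence-level Transformer representations; the sketch above only shows the retrieval step.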

Publication
IEICE TRANSACTIONS on Information Vol.E104-D No.10 pp.1712-1723
Publication Date
2021/10/01
Publicized
2021/06/24
Online ISSN
1745-1361
DOI
10.1587/transinf.2020EDP7244
Type of Manuscript
PAPER
Category
Natural Language Processing

Authors

Shu JIANG
  Shanghai Jiao Tong University
Rui WANG
  Shanghai Jiao Tong University
Zuchao LI
  Shanghai Jiao Tong University
Masao UTIYAMA
  National Institute of Information and Communications Technology
Kehai CHEN
  National Institute of Information and Communications Technology
Eiichiro SUMITA
  National Institute of Information and Communications Technology
Hai ZHAO
  Shanghai Jiao Tong University
Bao-liang LU
  Shanghai Jiao Tong University
