
Dependency-Based Local Attention Approach to Neural Machine Translation



Abstract

Recently, dependency information has been used in different ways to improve neural machine translation. For example, dependency labels have been added to the hidden states of source words, or the contiguous information of a source word has been extracted according to the dependency tree, learned independently, and incorporated into the Neural Machine Translation (NMT) model as a unit in various ways. However, these works are all limited to using dependency information to enrich the hidden states of source words. Since many works in Statistical Machine Translation (SMT) and NMT have proven the validity and potential of dependency information, we believe that there are still many other ways to apply it within the NMT structure. In this paper, we explore a new way to use dependency information to improve NMT. Based on the theory of the local attention mechanism, we present the Dependency-based Local Attention Approach (DLAA), a new attention mechanism that allows the NMT model to trace the dependency words related to the word currently being translated. Our work also indicates that dependency information can help supervise the attention mechanism. Experimental results on the shared training datasets of the WMT 17 Chinese-to-English translation task show that our model is effective and performs distinctively well on long-sentence translation.
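To make the idea of dependency-based local attention concrete, the sketch below illustrates one plausible reading of the mechanism described in the abstract: for the source position currently receiving the most attention, only the words related to it in the dependency tree (the word itself, its head, and its direct children) keep their attention scores, and the rest are masked out before renormalization. The function names (dependency_neighborhood, dlaa_attention) and the exact choice of neighborhood are illustrative assumptions, not the paper's published formulation.

import numpy as np

def dependency_neighborhood(heads, i):
    # Positions dependency-related to source position i: the word itself,
    # its head, and its direct children. heads[i] is the index of i's head;
    # -1 marks the root. (Assumed neighborhood definition.)
    related = {i}
    if heads[i] >= 0:
        related.add(heads[i])
    related.update(j for j, h in enumerate(heads) if h == i)
    return related

def dlaa_attention(scores, heads):
    # Hypothetical dependency-based local attention: keep only the scores of
    # words dependency-related to the currently most-attended source position,
    # then renormalize with a softmax.
    center = int(np.argmax(scores))          # current focus of attention
    mask = np.full_like(scores, -np.inf)
    for j in dependency_neighborhood(heads, center):
        mask[j] = 0.0
    masked = scores + mask
    exp = np.exp(masked - np.max(masked))
    return exp / exp.sum()

# Toy example: "the black cat sat"; heads: the->cat, black->cat, cat->sat, sat=root.
heads = [2, 2, 3, -1]
scores = np.array([1.5, 0.3, 0.9, 0.1])
print(dlaa_attention(scores, heads))         # mass only on "the" and its head "cat"

In an actual NMT decoder this masking would be applied to the alignment scores at each decoding step, which is how the dependency structure could supervise where the attention mechanism is allowed to look.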

Bibliographic Information

  • Source
    Computers, Materials & Continua | 2019, Issue 2 | pp. 547-562 | 16 pages
  • Author Affiliations

    Cyberspace Institute of Advanced Technology (CIAT) Guangzhou University Guangzhou 510006 China;

    Department of Information Science and Engineering Hebei University of Science and Technology Shijiazhuang 050000 China;

    Department of Information Science and Engineering Hebei University of Science and Technology Shijiazhuang 050000 China;

    Department of Information Science and Engineering Hebei University of Science and Technology Shijiazhuang 050000 China;

    Cyberspace Institute of Advanced Technology (CIAT) Guangzhou University Guangzhou 510006 China;

    Cyberspace Institute of Advanced Technology (CIAT) Guangzhou University Guangzhou 510006 China;

    USC Information Sciences Institute Marina del Rey CA 90292 USA;

  • Indexing Information
  • Format: PDF
  • Language: English
  • CLC Classification
  • Keywords

    Neural machine translation; attention mechanism; dependency parsing;

