Annual Meeting of the Association for Computational Linguistics

Attention Strategies for Multi-Source Sequence-to-Sequence Learning


Abstract

Modeling attention in neural multi-source sequence-to-sequence learning remains a relatively unexplored area, despite its usefulness in tasks that incorporate multiple source languages or modalities. We propose two novel approaches to combine the outputs of attention mechanisms over each source sequence, flat and hierarchical. We compare the proposed methods with existing techniques and present results of systematic evaluation of those methods on the WMT16 Multimodal Translation and Automatic Post-editing tasks. We show that the proposed methods achieve competitive results on both tasks.
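The abstract is the only technical content preserved on this page, but the two combination strategies it names can be sketched concretely. Below is a minimal illustration, assuming simple dot-product attention scoring in place of the paper's learned (Bahdanau-style) energy terms and a single-vector decoder query; the function names and tensor shapes are illustrative assumptions, not taken from the paper.

```python
import torch
import torch.nn.functional as F

def flat_attention(query, encoder_states):
    # Flat combination: a single softmax over the concatenation of all
    # source states, so tokens from every source share one distribution.
    all_states = torch.cat(encoder_states, dim=0)      # (sum of lengths, dim)
    energies = all_states @ query                      # dot-product scores
    weights = F.softmax(energies, dim=0)               # one joint distribution
    return weights @ all_states                        # (dim,) context vector

def hierarchical_attention(query, encoder_states):
    # Hierarchical combination: attend within each source independently,
    # then run a second attention over the per-source context vectors.
    contexts = []
    for states in encoder_states:                      # states: (length_i, dim)
        weights = F.softmax(states @ query, dim=0)     # within-source attention
        contexts.append(weights @ states)              # per-source context
    contexts = torch.stack(contexts)                   # (num_sources, dim)
    source_weights = F.softmax(contexts @ query, dim=0)  # attention over sources
    return source_weights @ contexts                   # (dim,) context vector

if __name__ == "__main__":
    dim = 8
    query = torch.randn(dim)                              # decoder state at one step
    sources = [torch.randn(5, dim), torch.randn(7, dim)]  # e.g. text and image features
    print(flat_attention(query, sources).shape)           # torch.Size([8])
    print(hierarchical_attention(query, sources).shape)   # torch.Size([8])
```

The design difference is visible in where the softmax is taken: the flat variant normalizes once over all source tokens jointly, so tokens from different sources compete directly for probability mass, while the hierarchical variant first builds one context per source and then weights the sources against each other.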
