Annual Meeting of the Association for Computational Linguistics

A Recurrent Neural Model with Attention for the Recognition of Chinese Implicit Discourse Relations



Abstract

We introduce an attention-based Bi-LSTM for Chinese implicit discourse relations and demonstrate that modeling argument pairs as a joint sequence can outperform word order-agnostic approaches. Our model benefits from a partial sampling scheme and is conceptually simple, yet achieves state-of-the-art performance on the Chinese Discourse Treebank. We also visualize its attention activity to illustrate the model's ability to selectively focus on the relevant parts of an input sequence.
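The abstract only outlines the approach, so the following PyTorch sketch illustrates the general idea rather than the authors' exact model: the two arguments are joined into one token sequence, encoded with a Bi-LSTM, and pooled with additive attention before a relation classifier. The class name, layer sizes, the specific attention form, and the default number of relation labels are all assumptions, and the partial sampling scheme mentioned above is not shown.

```python
import torch
import torch.nn as nn


class AttentiveBiLSTMClassifier(nn.Module):
    """Bi-LSTM over a joined argument pair with additive attention pooling
    (hypothetical layer sizes and names, not the paper's exact configuration)."""

    def __init__(self, vocab_size, embed_dim=128, hidden_dim=128, num_relations=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.bilstm = nn.LSTM(embed_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        # Additive attention: score every time step, normalize, take a weighted sum.
        self.attn_proj = nn.Linear(2 * hidden_dim, hidden_dim)
        self.attn_score = nn.Linear(hidden_dim, 1, bias=False)
        self.classifier = nn.Linear(2 * hidden_dim, num_relations)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len); Arg1 and Arg2 are already joined as one sequence.
        pad_mask = (token_ids == 0).unsqueeze(-1)              # (batch, seq_len, 1)
        states, _ = self.bilstm(self.embed(token_ids))         # (batch, seq_len, 2*hidden)
        scores = self.attn_score(torch.tanh(self.attn_proj(states)))
        scores = scores.masked_fill(pad_mask, float("-inf"))
        weights = torch.softmax(scores, dim=1)                 # attention over time steps
        context = (weights * states).sum(dim=1)                # (batch, 2*hidden)
        return self.classifier(context), weights.squeeze(-1)


# Example: classify a batch of 8 joined argument pairs of length 60; the returned
# attention weights can be inspected, in the spirit of the visualization described above.
model = AttentiveBiLSTMClassifier(vocab_size=30000)
logits, attention = model(torch.randint(1, 30000, (8, 60)))
```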
