Expert Systems with Applications

Learning to compose over tree structures via POS tags for sentence representation



Abstract

The Recursive Neural Network (RecNN), a class of models that composes words or phrases recursively over syntactic tree structures, has proven highly effective at producing sentence representations for a variety of NLP tasks. However, RecNN suffers from an inherent limitation: a single compositional function shared across all tree nodes cannot capture complex semantic compositionality, which restricts the model's expressive power. In this paper, to address this problem, we propose Tag-Guided Hyper-RecNN/TreeLSTM (TG-HRecNN/TreeLSTM), which introduces a hypernetwork into RecNNs that takes the Part-of-Speech (POS) tags of words and phrases as input and generates the semantic composition parameters dynamically. Experimental results on five datasets covering two typical NLP tasks show that the proposed models consistently obtain significant improvements over RecNN and TreeLSTM. Our TG-HTreeLSTM outperforms all existing RecNN-based models and achieves, or is competitive with, the state of the art on four sentence classification benchmarks. The effectiveness of our models is further demonstrated by qualitative analysis. (C) 2019 Elsevier Ltd. All rights reserved.
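The core idea described above, a hypernetwork that maps a node's POS tag to the composition parameters used at that node, can be sketched as a small numpy toy. This is an illustrative sketch only, not the paper's implementation: the tag set, dimensions, and the use of a single candidate-update matrix (rather than the full set of TreeLSTM gates, which the paper generates analogously) are all simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 8   # hidden size of each node representation
T = 4   # size of a hypothetical POS/constituent tag set (e.g. NP, VP, ...)
Z = 6   # tag-embedding size

# Each tag gets a learned embedding; the hypernetwork H maps a tag
# embedding z to a flattened composition matrix W of shape (D, 2D).
tag_emb = rng.normal(0.0, 0.1, (T, Z))
H = rng.normal(0.0, 0.1, (Z, 2 * D * D))

def compose(h_left, h_right, tag_id):
    """Tag-guided composition of two child hidden states.

    Instead of one shared W for every node, W is generated dynamically
    from the POS tag of the current phrase, so an NP node and a VP node
    compose their children with different parameters.
    """
    z = tag_emb[tag_id]
    W = (z @ H).reshape(D, 2 * D)          # dynamically generated parameters
    h_children = np.concatenate([h_left, h_right])
    return np.tanh(W @ h_children)

h_l = rng.normal(size=D)                   # left-child representation
h_r = rng.normal(size=D)                   # right-child representation
h_tag0 = compose(h_l, h_r, tag_id=0)       # composition under one tag
h_tag1 = compose(h_l, h_r, tag_id=1)       # same children, different tag
```

Because the parameters depend on the tag, the same pair of children yields different parent representations under different tags, which is precisely the extra expressive power a single shared compositional function lacks.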
