IEEE International Conference on Semantic Computing

Improving Tree-LSTM with Tree Attention



Abstract

In Natural Language Processing (NLP), we often need to extract information from tree topology. Sentence structure can be represented via a dependency tree or a constituency tree structure. For this reason, a variant of LSTMs, named Tree-LSTM, was proposed to work on tree topology. In this paper, we design a generalized attention framework for both dependency and constituency trees by encoding variants of decomposable attention inside a Tree-LSTM cell. We evaluated our models on a semantic relatedness task and achieved notable results compared to Tree-LSTM-based methods without attention as well as other neural and non-neural methods, and good results compared to Tree-LSTM-based methods with attention.
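To make the idea of attention inside a Tree-LSTM cell concrete, below is a minimal, hypothetical PyTorch sketch of a Child-Sum Tree-LSTM cell in which the plain sum over children hidden states is replaced by an attention-weighted sum. The additive scoring function (`x_proj`, `score`) is an illustrative assumption, not necessarily the decomposable-attention variant used in the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentiveChildSumTreeLSTMCell(nn.Module):
    """Child-Sum Tree-LSTM cell with an attention-weighted (rather than plain)
    sum over children hidden states. Sketch only; the scoring function is a
    hypothetical additive score, not the paper's exact attention variant."""

    def __init__(self, input_dim, hidden_dim):
        super().__init__()
        self.hidden_dim = hidden_dim
        # Input, output, and update gates from the node input and aggregated child state.
        self.W_iou = nn.Linear(input_dim, 3 * hidden_dim)
        self.U_iou = nn.Linear(hidden_dim, 3 * hidden_dim, bias=False)
        # Forget gate computed per child, as in the standard Child-Sum cell.
        self.W_f = nn.Linear(input_dim, hidden_dim)
        self.U_f = nn.Linear(hidden_dim, hidden_dim, bias=False)
        # Hypothetical attention scorer: parent input projection vs. each child state.
        self.x_proj = nn.Linear(input_dim, hidden_dim)
        self.score = nn.Linear(2 * hidden_dim, 1)

    def forward(self, x, child_h, child_c):
        # x: (input_dim,) node input; child_h, child_c: (num_children, hidden_dim)
        num_children = child_h.size(0)
        if num_children == 0:
            h_tilde = x.new_zeros(self.hidden_dim)
            fc_sum = x.new_zeros(self.hidden_dim)
        else:
            # Score each child against the parent input projection, normalize with softmax.
            q = self.x_proj(x).unsqueeze(0).expand(num_children, -1)
            alpha = F.softmax(self.score(torch.cat([q, child_h], dim=1)), dim=0)
            # Attention-weighted sum replaces the plain sum of children hidden states.
            h_tilde = (alpha * child_h).sum(dim=0)
            # Per-child forget gates, then forget-gated sum of children memory cells.
            f = torch.sigmoid(self.W_f(x) + self.U_f(child_h))
            fc_sum = (f * child_c).sum(dim=0)
        iou = self.W_iou(x) + self.U_iou(h_tilde)
        i, o, u = torch.chunk(iou, 3)
        i, o, u = torch.sigmoid(i), torch.sigmoid(o), torch.tanh(u)
        c = i * u + fc_sum
        h = o * torch.tanh(c)
        return h, c

# Usage sketch: compose a parent node state from two children.
cell = AttentiveChildSumTreeLSTMCell(input_dim=300, hidden_dim=150)
x = torch.randn(300)              # parent node input (e.g. word embedding)
child_h = torch.randn(2, 150)     # children hidden states
child_c = torch.randn(2, 150)     # children memory cells
h, c = cell(x, child_h, child_c)  # parent hidden and memory states, shape (150,)
```

The same cell applies to both dependency and constituency trees, since it only assumes a variable number of children per node.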


