Conference on the North American Chapter of the Association for Computational Linguistics: Human Language Technologies

A Richer-but-Smarter Shortest Dependency Path with Attentive Augmentation for Relation Extraction



Abstract

To extract the relationship between two entities in a sentence, two common approaches are (1) using their shortest dependency path (SDP) and (2) using an attention model to capture a context-based representation of the sentence. Each approach suffers from its own disadvantage: either missing or redundant information. In this work, we propose a novel model that combines the advantages of these two approaches. It builds on the core information in the SDP, augmented with information selected by several attention mechanisms with kernel filters; we call the resulting structure RbSP (Richer-but-Smarter SDP). To effectively exploit the representation behind the RbSP structure, we develop a combined deep neural model with an LSTM network over word sequences and a CNN over the RbSP. Experimental results on the SemEval-2010 dataset demonstrate improved performance over competitive baselines.
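The SDP between two entities can be found by a breadth-first search over an undirected view of the sentence's dependency tree. The sketch below is a minimal illustration in plain Python; the example sentence and its hand-written parse are assumptions for demonstration, not output of the paper's pipeline or of any particular parser:

```python
from collections import deque

def shortest_dependency_path(edges, source, target):
    """Return the shortest path between two tokens, treating the
    dependency tree as an undirected graph and running BFS."""
    adj = {}
    for head, dep in edges:
        adj.setdefault(head, set()).add(dep)
        adj.setdefault(dep, set()).add(head)
    queue = deque([[source]])
    seen = {source}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == target:
            return path
        for neighbor in adj.get(node, ()):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(path + [neighbor])
    return None  # entities not connected (should not happen in a tree)

# Hand-annotated (head, dependent) edges for the sentence
# "The child was carefully wrapped into the cradle" -- illustrative only.
edges = [
    ("wrapped", "child"), ("child", "The"), ("wrapped", "was"),
    ("wrapped", "carefully"), ("wrapped", "into"),
    ("into", "cradle"), ("cradle", "the"),
]
print(shortest_dependency_path(edges, "child", "cradle"))
# → ['child', 'wrapped', 'into', 'cradle']
```

Because a dependency parse is a tree, the path between any two tokens is unique, so BFS recovers exactly the SDP while skipping modifiers such as "carefully" — the "missing information" the abstract refers to.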
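The "attentive augmentation" idea — enriching each SDP token with selectively attended information from its dependency children — can be sketched with simple dot-product attention. This is a simplified stand-in for the paper's kernel-filter attention mechanisms; the `attentive_augment` helper and the toy vectors are illustrative assumptions:

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def attentive_augment(token_vec, child_vecs):
    """Concatenate an SDP token vector with an attention-weighted sum of
    its dependency children's vectors (dot-product attention; the paper's
    RbSP uses learned kernel filters instead)."""
    if not child_vecs:
        return token_vec + [0.0] * len(token_vec)
    # Score each child by similarity to the SDP token, then normalize.
    scores = [sum(t * c for t, c in zip(token_vec, cv)) for cv in child_vecs]
    weights = softmax(scores)
    context = [sum(w * cv[i] for w, cv in zip(weights, child_vecs))
               for i in range(len(token_vec))]
    return token_vec + context  # [token ; attended child context]

# Toy 2-d embeddings: one child aligned with the token, one orthogonal.
augmented = attentive_augment([1.0, 0.0], [[1.0, 0.0], [0.0, 1.0]])
print(augmented)
```

The attention weights let the model keep relevant off-path modifiers (addressing the SDP's missing information) without copying every child in wholesale (the attention model's redundancy problem). In the full model, such augmented vectors would form the CNN's input over the RbSP.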
