Multi-Level Structured Self-Attentions for Distantly Supervised Relation Extraction

Conference on Empirical Methods in Natural Language Processing

Abstract

Attention mechanisms are often used in deep neural networks for distantly supervised relation extraction (DS-RE) to distinguish valid from noisy instances. However, traditional 1-D vector attention models are insufficient to learn the different contexts needed to select valid instances and predict the relation for an entity pair. To alleviate this issue, we propose a novel multi-level structured (2-D matrix) self-attention mechanism for DS-RE in a multi-instance learning (MIL) framework using bidirectional recurrent neural networks. In the proposed method, a structured word-level self-attention mechanism learns a 2-D matrix in which each row vector represents a weight distribution over different aspects of an instance with respect to the two entities. Targeting the MIL issue, the structured sentence-level attention learns a 2-D matrix in which each row vector represents a weight distribution over the selection of different valid instances. Experiments conducted on two publicly available DS-RE datasets show that the proposed framework with a multi-level structured self-attention mechanism significantly outperforms state-of-the-art baselines in terms of PR curves, P@N, and F1 measures.
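A common instantiation of such structured (2-D matrix) self-attention, in the style of Lin et al. (2017), computes A = softmax(W2 tanh(W1 H^T)) and M = A H, so that each of the r rows of A is a separate attention distribution over the attended items. Below is a minimal PyTorch sketch of this mechanism applied first at the word level and then at the sentence (bag) level; the hidden sizes, the number of attention rows, and the mean-pooling used to collapse each word-level matrix into a sentence vector are illustrative assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class StructuredSelfAttention(nn.Module):
    """2-D matrix self-attention: A = softmax(W2 tanh(W1 H^T)), M = A H.
    Each of the r rows of A is a weight distribution over the items
    being attended to (tokens at the word level, instances at the
    sentence level)."""
    def __init__(self, hidden_dim, att_dim, num_rows):
        super().__init__()
        self.w1 = nn.Linear(hidden_dim, att_dim, bias=False)
        self.w2 = nn.Linear(att_dim, num_rows, bias=False)

    def forward(self, H):
        # H: (batch, length, hidden_dim), e.g. BiRNN hidden states
        scores = self.w2(torch.tanh(self.w1(H)))  # (batch, length, r)
        A = F.softmax(scores, dim=1)              # normalize over length
        A = A.transpose(1, 2)                     # (batch, r, length)
        M = torch.bmm(A, H)                       # (batch, r, hidden_dim)
        return M, A

# Illustrative sizes (assumptions, not the paper's configuration).
word_att = StructuredSelfAttention(hidden_dim=256, att_dim=64, num_rows=4)
sent_att = StructuredSelfAttention(hidden_dim=256, att_dim=64, num_rows=4)

# Word level: a 2-D representation per instance from its token states.
tokens = torch.randn(3, 40, 256)        # a bag of 3 instances, 40 tokens each
sent_mat, _ = word_att(tokens)          # (3, 4, 256) matrix per instance

# Collapse each matrix to one vector (mean over rows is an assumption here),
# then attend over the bag: rows of A_bag weight instance selection (MIL).
sent_vec = sent_mat.mean(dim=1)         # (3, 256)
bag = sent_vec.unsqueeze(0)             # (1, 3, 256): one bag of 3 instances
bag_mat, A_bag = sent_att(bag)          # (1, 4, 256), A_bag: (1, 4, 3)
```

Using a matrix of r attention rows rather than a single 1-D vector lets each row specialize: at the word level on a different aspect of the instance relative to the two entities, and at the sentence level on a different subset of valid instances in the bag.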