International Conference on Computational Linguistics

Implicit Discourse Relation Recognition with Context-aware Character-enhanced Embeddings



Abstract

For the task of implicit discourse relation recognition, traditional models built on manual features can suffer from the data sparsity problem. Neural models offer a solution through distributed representations, which can encode latent semantic information and are well suited to recognizing semantic relations between argument pairs. However, conventional vector representations usually adopt embeddings at the word level and cannot handle the rare-word problem well without considering morphological information at the character level. Moreover, embeddings are assigned to individual words independently, so they lack crucial contextual information. This paper proposes a neural model that uses context-aware character-enhanced embeddings to alleviate these drawbacks of word-level representation. Our experiments show that the enhanced embeddings work well and the proposed model obtains state-of-the-art results.
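To make the idea concrete, the sketch below shows one common way to build context-aware character-enhanced embeddings: each token's word embedding is concatenated with a character-level BiLSTM summary of its spelling, and a sentence-level BiLSTM then injects contextual information. The module names, dimensions, and the choice of LSTMs are illustrative assumptions for this sketch, not the paper's exact architecture.

```python
# Minimal PyTorch sketch of a character-enhanced, context-aware embedding layer.
# Layer names and hyperparameters are assumptions, not taken from the paper.
import torch
import torch.nn as nn


class CharEnhancedEmbedding(nn.Module):
    """Concatenates a word embedding with a character-level representation,
    then runs a BiLSTM over the sequence to make each token context-aware."""

    def __init__(self, word_vocab, char_vocab, word_dim=300, char_dim=50,
                 char_hidden=50, ctx_hidden=150):
        super().__init__()
        self.word_emb = nn.Embedding(word_vocab, word_dim, padding_idx=0)
        self.char_emb = nn.Embedding(char_vocab, char_dim, padding_idx=0)
        # Character-level BiLSTM: its final hidden states summarize each word's spelling,
        # which helps with rare words that lack reliable word-level embeddings.
        self.char_lstm = nn.LSTM(char_dim, char_hidden, batch_first=True,
                                 bidirectional=True)
        # Sentence-level BiLSTM: adds contextual information to each token.
        self.ctx_lstm = nn.LSTM(word_dim + 2 * char_hidden, ctx_hidden,
                                batch_first=True, bidirectional=True)

    def forward(self, word_ids, char_ids):
        # word_ids: (batch, seq_len); char_ids: (batch, seq_len, max_chars)
        b, s, c = char_ids.shape
        w = self.word_emb(word_ids)                        # (b, s, word_dim)
        ch = self.char_emb(char_ids.reshape(b * s, c))     # (b*s, c, char_dim)
        _, (h, _) = self.char_lstm(ch)                     # h: (2, b*s, char_hidden)
        ch_repr = torch.cat([h[0], h[1]], dim=-1).reshape(b, s, -1)
        enhanced = torch.cat([w, ch_repr], dim=-1)         # character-enhanced embedding
        ctx, _ = self.ctx_lstm(enhanced)                   # context-aware representation
        return ctx                                         # (b, s, 2*ctx_hidden)


if __name__ == "__main__":
    # Toy usage: 2 arguments, 5 tokens each, up to 8 characters per token.
    emb = CharEnhancedEmbedding(word_vocab=1000, char_vocab=100)
    words = torch.randint(1, 1000, (2, 5))
    chars = torch.randint(1, 100, (2, 5, 8))
    print(emb(words, chars).shape)  # torch.Size([2, 5, 300])
```

For relation classification, the contextual representations of the two arguments would then be pooled and fed to a pairwise classifier; that stage is omitted here.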
