Annual Meeting of the Association for Computational Linguistics

Evaluating Discourse in Structured Text Representations

Abstract

Discourse structure is integral to understanding a text and is helpful in many NLP tasks. Learning latent representations of discourse is an attractive alternative to acquiring expensive labeled discourse data. Liu and Lapata (2018) propose a structured attention mechanism for text classification that derives a tree over a text, akin to an RST discourse tree. We examine this model in detail, and evaluate on additional discourse-relevant tasks and datasets, in order to assess whether the structured attention improves performance on the end task and whether it captures a text's discourse structure. We find the learned latent trees have little to no structure and instead focus on lexical cues; even after obtaining more structured trees with proposed model modifications, the trees are still far from capturing discourse structure when compared to discourse dependency trees from an existing discourse parser. Finally, ablation studies show the structured attention provides little benefit, sometimes even hurting performance.
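For context on the mechanism being evaluated: Liu and Lapata's structured attention treats a document-level dependency tree over sentences as a latent variable, turning pairwise attachment scores into marginal edge probabilities with Kirchhoff's matrix-tree theorem (following Koo et al., 2007), so the tree stays differentiable and the marginals can act as soft attention weights. The sketch below is a minimal NumPy illustration of that marginal computation only, under the standard single-root construction; it is not the authors' implementation, and the names `tree_marginals`, `edge_scores`, and `root_scores` are illustrative.

```python
import numpy as np

def tree_marginals(edge_scores, root_scores):
    """Marginal probabilities of head-child edges over all non-projective
    dependency trees, computed with Kirchhoff's matrix-tree theorem.

    edge_scores: (n, n) array; edge_scores[i, j] scores head i -> child j.
    root_scores: (n,) array; root_scores[j] scores node j being the root.
    Returns (edge_marginals, root_marginals).
    """
    n = edge_scores.shape[0]
    A = np.exp(edge_scores)
    np.fill_diagonal(A, 0.0)            # no self-attachment
    rho = np.exp(root_scores)

    # Laplacian: column sums on the diagonal, negated edge weights elsewhere.
    L = -A.copy()
    np.fill_diagonal(L, A.sum(axis=0))

    # Replace the first row with the root weights (single-root construction).
    L_hat = L.copy()
    L_hat[0, :] = rho
    L_inv = np.linalg.inv(L_hat)

    # mu[i, j] = A[i, j] * ((1 - d(j)) * L_inv[j, j] - (1 - d(i)) * L_inv[j, i]),
    # where d(k) = 1 iff k is the first node (0-based index 0).
    d = np.zeros(n)
    d[0] = 1.0
    edge_marginals = A * (((1.0 - d) * np.diag(L_inv))[None, :]
                          - (1.0 - d)[:, None] * L_inv.T)
    root_marginals = rho * L_inv[:, 0]
    return edge_marginals, root_marginals

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    marg, root = tree_marginals(rng.normal(size=(5, 5)), rng.normal(size=5))
    # Each node attaches to exactly one head (or is the root), so the
    # head probabilities for every child column sum to one.
    print(marg.sum(axis=0) + root)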
