Conference: Annual Meeting of the Association for Computational Linguistics

How to Make Context More Useful? An Empirical Study on Context-Aware Neural Conversational Models

Abstract

Generative conversational systems are attracting increasing attention in natural language processing (NLP). Recently, researchers have noticed the importance of context information in dialog processing, and built various models to utilize context. However, there is no systematic comparison to analyze how to use context effectively. In this paper, we conduct an empirical study to compare various models and investigate the effect of context information in dialog systems. We also propose a variant that explicitly weights context vectors by context-query relevance, outperforming the other baselines.
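The relevance-weighting idea described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's actual model: it assumes each context utterance and the query are already embedded as fixed vectors, and uses softmax-normalized cosine similarity as the context-query relevance score (both are assumptions for illustration).

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-8))

def weight_context_by_relevance(context_vecs, query_vec):
    """Weight each context utterance vector by its (softmax-normalized)
    cosine similarity to the query vector, and return the weighted
    combination. A hypothetical sketch of 'explicitly weighting context
    vectors by context-query relevance'."""
    sims = np.array([cosine(c, query_vec) for c in context_vecs])
    weights = np.exp(sims) / np.exp(sims).sum()  # softmax over similarities
    combined = (weights[:, None] * context_vecs).sum(axis=0)
    return combined, weights

# Toy example: 3 context utterances with 4-dimensional embeddings.
rng = np.random.default_rng(0)
ctx = rng.normal(size=(3, 4))
query = ctx[1] + 0.1 * rng.normal(size=4)  # query resembles utterance 1
combined, w = weight_context_by_relevance(ctx, query)
```

The utterance most similar to the query receives the largest weight, so irrelevant context contributes less to the combined representation; in a full model the combined vector would feed the decoder.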
