
Reconstructing Implicit Knowledge with Language Models



Abstract

In this work we propose an approach for generating statements that explicate implicit knowledge connecting sentences in text. We make use of pre-trained language models which we refine by fine-tuning them on specifically prepared corpora that we enriched with implicit information, and by constraining them with relevant concepts and connecting common-sense knowledge paths. Manual and automatic evaluation of the generations shows that by refining language models as proposed, we can generate coherent and grammatically sound sentences that explicate implicit knowledge which connects sentence pairs in texts - on both in-domain and out-of-domain test data.
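The abstract describes conditioning a fine-tuned language model on the sentence pair, relevant concepts, and connecting common-sense knowledge paths. The sketch below illustrates one plausible way such conditioning input could be assembled; the special markers, field layout, and relation triples are assumptions for illustration, not the authors' actual format.

```python
# Hypothetical input-assembly step for fine-tuning / prompting a language
# model to generate a statement that explicates the implicit knowledge
# connecting two adjacent sentences. The markers (<sent1>, <concepts>,
# <path>, <explicit>) are illustrative assumptions, not the paper's format.

def build_input(sentence_a, sentence_b, concepts=None, knowledge_path=None):
    """Assemble a single training/inference prompt.

    sentence_a, sentence_b: adjacent sentences whose implicit connection
        should be made explicit.
    concepts: optional relevant concepts used to constrain generation.
    knowledge_path: optional common-sense relation triples linking the
        concepts, e.g. [("bridge", "UsedFor", "crossing a river")].
    """
    parts = [f"<sent1> {sentence_a}", f"<sent2> {sentence_b}"]
    if concepts:
        parts.append("<concepts> " + " ; ".join(concepts))
    if knowledge_path:
        triples = " ; ".join(f"{h} {r} {t}" for h, r, t in knowledge_path)
        parts.append("<path> " + triples)
    # The model would be trained to continue from this marker with the
    # explicated knowledge statement.
    parts.append("<explicit>")
    return " ".join(parts)


prompt = build_input(
    "The river was too deep to wade.",
    "They walked two miles to the bridge.",
    concepts=["river", "bridge"],
    knowledge_path=[("bridge", "UsedFor", "crossing a river")],
)
print(prompt)
```

At inference time, a sequence-to-sequence or causal model fine-tuned on such enriched examples would continue from the final marker, producing the connecting statement (e.g. "A bridge can be used to cross a river.").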
