International Conference on Language Resources and Evaluation

Transfer Learning from Transformers to Fake News Challenge Stance Detection (FNC-1) Task



Abstract

Transformer models, trained and publicly released over the last couple of years, have proved effective in many NLP tasks. We wished to test their usefulness in particular on the stance detection task. We performed experiments on the data from the Fake News Challenge Stage 1 (FNC-1). We were indeed able to improve on the reported SotA for the challenge by exploiting the generalization power of large language models based on the Transformer architecture. Specifically, (1) we improved the best-performing FNC-1 model by adding BERT sentence embeddings of the input sequences as a model feature, and (2) we fine-tuned the BERT, XLNet, and RoBERTa transformers on the extended FNC-1 dataset and obtained state-of-the-art results on the FNC-1 task.
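For context on the SotA claim: FNC-1 submissions are ranked by a weighted score in which a correct related/unrelated decision earns 0.25 points and a correct fine-grained stance (agree / disagree / discuss) earns a further 0.75. The sketch below implements that published scoring rule; the function and variable names are illustrative, not taken from the paper.

```python
# Minimal sketch of the FNC-1 weighted evaluation metric.
# Correct unrelated -> 0.25; related but wrong stance -> 0.25;
# exactly correct related stance -> 1.0 in total.

RELATED = {"agree", "disagree", "discuss"}

def fnc1_score(gold, predicted):
    """Absolute weighted score for parallel lists of stance labels."""
    score = 0.0
    for g, p in zip(gold, predicted):
        if g == p:
            score += 0.25            # exact label match
            if g in RELATED:
                score += 0.50        # bonus for the correct fine-grained stance
        if g in RELATED and p in RELATED:
            score += 0.25            # relatedness judged correctly
    return score

gold = ["agree", "unrelated", "discuss", "disagree"]
pred = ["agree", "unrelated", "agree", "disagree"]
# two exact related matches (1.0 each), one correct unrelated (0.25),
# one related-but-wrong stance (0.25)
print(fnc1_score(gold, pred))  # → 2.5
```

Reported FNC-1 results are usually given as this score relative to the maximum achievable score on the test set.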
