International Conference on Computational Linguistics

Syntactically Aware Cross-Domain Aspect and Opinion Terms Extraction


Abstract

A fundamental task of fine-grained sentiment analysis is aspect and opinion terms extraction. Supervised-learning approaches have shown good results for this task; however, they fail to scale across domains where labeled data is lacking. Non-pre-trained unsupervised domain adaptation methods that incorporate external linguistic knowledge have proven effective in transferring aspect and opinion knowledge from a labeled source domain to unlabeled target domains; however, pre-trained transformer-based models such as BERT and RoBERTa already exhibit substantial syntactic knowledge. In this paper, we propose a method for incorporating external linguistic information into a self-attention mechanism coupled with the BERT model. This enables leveraging the intrinsic syntactic knowledge within BERT together with externally introduced syntactic information to bridge the gap across domains. Finally, we demonstrate enhanced results on three benchmark datasets.
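To make the idea concrete, the following is a minimal sketch of one way external syntactic information can enter a self-attention layer on top of BERT hidden states. The abstract does not specify the paper's exact architecture, so everything here is an assumption: the module name, the dep_distance input (dependency-tree distances from an external parser), and the additive-bias formulation are hypothetical illustrations of the general technique, not the authors' method.

    # Hypothetical sketch: self-attention with an additive syntactic bias.
    # Assumes dependency-tree distances between tokens are precomputed by an
    # external parser; tokens that are close in the tree get a smaller penalty.
    import math
    import torch
    import torch.nn as nn

    class SyntaxBiasedSelfAttention(nn.Module):
        def __init__(self, hidden_size: int = 768):
            super().__init__()
            self.q = nn.Linear(hidden_size, hidden_size)
            self.k = nn.Linear(hidden_size, hidden_size)
            self.v = nn.Linear(hidden_size, hidden_size)
            # Learnable scale controlling how strongly syntax biases attention.
            self.syntax_weight = nn.Parameter(torch.tensor(1.0))

        def forward(self, hidden: torch.Tensor, dep_distance: torch.Tensor) -> torch.Tensor:
            # hidden: [batch, seq, hidden], e.g. BERT's last_hidden_state
            # dep_distance: [batch, seq, seq], pairwise dependency-tree distances
            scores = self.q(hidden) @ self.k(hidden).transpose(-2, -1)
            scores = scores / math.sqrt(hidden.size(-1))
            # Subtract a penalty that grows with syntactic distance, so
            # syntactically related tokens attend to each other more.
            scores = scores - self.syntax_weight * dep_distance
            attn = scores.softmax(dim=-1)
            return attn @ self.v(hidden)

In this sketch, hidden would typically come from a pre-trained encoder (for example, BertModel(...).last_hidden_state in the transformers library), and dep_distance from an off-the-shelf dependency parser; padding masks and multi-head projections are omitted for brevity.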
