Joint Workshop on Multiword Expressions and WordNet; Annual Meeting of the Association for Computational Linguistics

Cross-lingual Transfer Learning and Multitask Learning for Capturing Multiword Expressions



Abstract

Recent developments in deep learning have prompted a surge of interest in applying multitask and transfer learning to NLP problems. In this study, we explore, for the first time, the application of transfer learning (TRL) and multitask learning (MTL) to the identification of Multiword Expressions (MWEs). For MTL, we exploit the syntactic information shared between MWE identification and dependency parsing to jointly train a single model on both tasks, predicting two types of labels: MWE tags and dependency parses. Our neural MTL architecture uses the supervision of dependency parsing in its lower layers and predicts MWE tags in its upper layers. In the TRL scenario, we overcome the scarcity of data by training a model on a larger MWE dataset and transferring the knowledge to a resource-poor setting in another language. In both scenarios, the resulting models achieve higher performance than standard neural approaches.
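The hierarchical multitask idea in the abstract (lower layers supervised by dependency-parsing labels, upper layers predicting MWE tags on top of them) can be sketched as a minimal forward pass. This is an illustrative NumPy sketch, not the authors' actual model; all layer names, sizes, and label counts are assumptions.

```python
import numpy as np

# Hypothetical sketch of hierarchical MTL: a shared lower layer receives
# dependency-parsing supervision, and an upper layer built on its output
# predicts MWE tags. Dimensions and label counts are illustrative only.

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

seq_len, emb_dim, hid_dim = 5, 16, 8
n_dep_labels, n_mwe_tags = 10, 4   # e.g. dependency relations, BIO-style MWE tags

x = rng.normal(size=(seq_len, emb_dim))          # token embeddings

# Lower (shared) layer: its hidden states feed a dependency-label classifier,
# so the dependency-parsing loss supervises this level during joint training.
W_low = rng.normal(size=(emb_dim, hid_dim))
h_low = relu(x @ W_low)
dep_scores = softmax(h_low @ rng.normal(size=(hid_dim, n_dep_labels)))

# Upper layer: consumes the lower-layer states and predicts MWE tags.
W_up = rng.normal(size=(hid_dim, hid_dim))
h_up = relu(h_low @ W_up)
mwe_scores = softmax(h_up @ rng.normal(size=(hid_dim, n_mwe_tags)))

# Joint training would sum the per-token losses of the two classifiers,
# so gradients from both tasks update the shared lower layer.
print(dep_scores.shape, mwe_scores.shape)
```

Because the lower layer sits on both tasks' gradient paths, the dependency signal shapes the representations that the MWE tagger consumes, which is the intuition behind placing the syntactic task lower in the stack.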

