Workshop on Representation Learning for NLP

Towards Robust Named Entity Recognition for Historic German

Abstract

Recent advances in language modeling using deep neural networks have shown that these models learn representations that vary with network depth, from morphology up to semantic relationships like co-reference. We apply pre-trained language models to low-resource named entity recognition for Historic German. We show in a series of experiments that character-based pre-trained language models do not run into trouble when faced with low-resource datasets. Our pre-trained character-based language models improve upon classical CRF-based methods and previous work on Bi-LSTMs, boosting F1 score by up to 6%. Our pre-trained language and NER models are publicly available.¹
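The abstract describes producing word representations from a character-level pre-trained language model. As a minimal illustrative sketch (not the paper's code: the toy `CharRNN` dimensions and randomly initialised weights below are invented for demonstration, whereas a real system would load pre-trained parameters and use far larger hidden states), one can run a character-level recurrent model over the raw text and take the hidden state at each word's final character as that word's contextual embedding:

```python
import math
import random

random.seed(0)

EMB = 8    # toy character-embedding size (illustrative only)
HID = 16   # toy hidden-state size (real char-LMs use 1024 or more)

VOCAB = {c: i for i, c in enumerate("abcdefghijklmnopqrstuvwxyzäöüß ")}

# Randomly initialised toy parameters; a pre-trained model would load these.
emb  = [[random.gauss(0, 0.1) for _ in range(EMB)] for _ in range(len(VOCAB))]
W_in = [[random.gauss(0, 0.1) for _ in range(HID)] for _ in range(EMB)]
W_hh = [[random.gauss(0, 0.1) for _ in range(HID)] for _ in range(HID)]

def char_lm_states(text):
    """Run a toy Elman RNN over the characters, one hidden state per character."""
    h = [0.0] * HID
    states = []
    for ch in text.lower():
        x = emb[VOCAB.get(ch, VOCAB[" "])]
        h = [math.tanh(sum(x[i] * W_in[i][j] for i in range(EMB)) +
                       sum(h[i] * W_hh[i][j] for i in range(HID)))
             for j in range(HID)]
        states.append(h)
    return states

def word_embeddings(sentence):
    """Character-based word embeddings: hidden state after each word's last char."""
    states = char_lm_states(sentence)
    embs, pos = [], 0
    for word in sentence.split(" "):
        end = pos + len(word) - 1   # index of the word's final character
        embs.append(states[end])
        pos += len(word) + 1        # skip the following space
    return embs

embs = word_embeddings("historisches deutsch")
print(len(embs), len(embs[0]))  # one HID-dimensional vector per word
```

In the actual paper these embeddings would be fed into a downstream sequence tagger (e.g. a Bi-LSTM-CRF) to predict entity labels; because the model reads characters rather than a fixed word vocabulary, it degrades gracefully on the noisy, out-of-vocabulary spellings typical of historic text.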
