Second Conference on Machine Translation (WMT 2017)

Variable Mini-Batch Sizing and Pre-Trained Embeddings



Abstract

This paper describes our submission to the WMT 2017 Neural MT Training Task. We modified the provided NMT system in order to allow for interrupting and continuing the training of models. This allowed mid-training batch size decrementation and incrementation at variable rates. In addition to the models with variable batch size, we tried different setups with pre-trained word2vec embeddings. Aside from batch size incrementation, all our experiments performed below the baseline.
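The abstract describes interrupting training and resuming it with a different mini-batch size, decremented or incremented at variable rates. As a minimal sketch of the scheduling idea (the paper's exact schedules and rates are not given here; the linear step and the function name are hypothetical illustrations):

```python
def batch_size_schedule(initial_size, rate, num_stages, mode="increment"):
    """Compute the mini-batch size for each training stage under a
    simple linear schedule. Each stage corresponds to one
    interrupt-and-resume cycle of the NMT system.

    This is an illustrative sketch, not the paper's actual schedule.
    """
    sizes = []
    size = initial_size
    for _ in range(num_stages):
        sizes.append(size)
        if mode == "increment":
            size += rate          # grow the batch between restarts
        else:
            size = max(1, size - rate)  # shrink, but never below 1
    return sizes

# Incrementation: start small, enlarge the batch as training progresses.
print(batch_size_schedule(32, 16, 4))               # [32, 48, 64, 80]
# Decrementation: start large, shrink toward single-sentence batches.
print(batch_size_schedule(64, 16, 5, "decrement"))  # [64, 48, 32, 16, 1]
```

In a setup like this, training would be stopped at each stage boundary, the model checkpoint saved, and training resumed with the next batch size from the schedule; the abstract reports that only the incrementation variant outperformed the baseline.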

Bibliographic Information

  • Venue: Copenhagen (DK)
  • Author affiliation: Charles University, Faculty of Mathematics and Physics, Institute of Formal and Applied Linguistics
  • Format: PDF
  • Language: English
