Recurrent Dropout without Memory Loss

Abstract

This paper presents a novel approach to recurrent neural network (RNN) regularization. Differently from the widely adopted dropout method, which is applied to forward connections of feed-forward architectures or RNNs, we propose to drop neurons directly in recurrent connections in a way that does not cause loss of long-term memory. Our approach is as easy to implement and apply as regular feed-forward dropout, and we demonstrate its effectiveness for Long Short-Term Memory networks, the most popular type of RNN cell. Our experiments on three NLP benchmarks show consistent improvements even when combined with conventional feed-forward dropout.
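The central idea is that the dropout mask is applied to the candidate update entering the memory cell rather than to the recurrent state itself, so stored information can only be erased by the forget gate, never by dropout. The sketch below illustrates one LSTM step under that reading; it is a minimal NumPy illustration, not the authors' reference implementation, and the function name, gate ordering, and weight shapes are assumptions made for clarity.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step_recurrent_dropout(x, h_prev, c_prev, W, U, b, p_drop, training=True):
    """One LSTM step with dropout on the candidate cell update only.

    W: input-to-hidden weights, shape (4*H, D)
    U: hidden-to-hidden weights, shape (4*H, H)
    b: bias, shape (4*H,)
    Assumed gate order: input, forget, output, candidate.
    """
    H = h_prev.shape[-1]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[0 * H:1 * H])      # input gate
    f = sigmoid(z[1 * H:2 * H])      # forget gate
    o = sigmoid(z[2 * H:3 * H])      # output gate
    g = np.tanh(z[3 * H:4 * H])      # candidate cell update

    if training and p_drop > 0.0:
        # Drop units of the candidate update, not of c_prev, so information
        # already stored in the cell state is never zeroed out by dropout.
        mask = (np.random.rand(H) >= p_drop) / (1.0 - p_drop)  # inverted dropout
        g = g * mask

    c = f * c_prev + i * g           # long-term memory path left intact
    h = o * np.tanh(c)
    return h, c
```

Applying the mask to the update (rather than to the hidden or cell state carried across time steps) is what distinguishes this scheme from naively dropping recurrent connections, which would progressively destroy long-term memory.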
