...

Rademacher dropout: An adaptive dropout for deep neural network via optimizing generalization gap


Abstract

Dropout plays an important role in improving the generalization ability of deep learning models. However, the empirical and fixed choice of dropout rates in traditional dropout strategies may increase the generalization gap, which runs counter to one of the principal aims of dropout. To handle this problem, we propose a novel dropout method in this paper. Through a theoretical analysis of Dropout Rademacher Complexity, we first prove that the generalization gap of a deep model is bounded by a constraint function related to the dropout rates. We then derive a closed-form solution by optimizing this constraint function, which yields a distribution estimate of the dropout rates. Based on this closed-form solution, a lightweight algorithm called Rademacher Dropout (RadDropout) is presented to adjust the dropout rates adaptively. Moreover, extensive experimental results on benchmark datasets verify the effectiveness of the proposed method, showing that RadDropout improves both convergence rate and prediction accuracy. (C) 2019 Elsevier B.V. All rights reserved.
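The paper's closed-form rule for the dropout-rate distribution is derived from its Rademacher-complexity bound and is not reproduced here. As a minimal sketch of the general idea of *adaptive, per-unit* dropout rates (as opposed to a single fixed rate), the snippet below uses each unit's mean activation magnitude as a stand-in importance criterion; the criterion, the normalization, and the function name are illustrative assumptions, not the authors' RadDropout algorithm.

```python
import numpy as np

def adaptive_dropout(activations, base_rate=0.5, rng=None):
    """Per-unit adaptive dropout sketch (NOT the paper's exact RadDropout rule).

    Units with larger mean absolute activation get a lower drop probability;
    rates are normalized so that, before clipping, their mean equals base_rate.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Hypothetical importance proxy: mean |activation| per unit (batch axis 0).
    importance = np.abs(activations).mean(axis=0)
    # Higher importance -> lower dropout rate.
    inv = 1.0 / (importance + 1e-8)
    rates = base_rate * inv * inv.size / inv.sum()
    rates = np.clip(rates, 0.0, 0.95)
    # Sample a keep-mask: unit i survives with probability 1 - rates[i].
    mask = rng.random(activations.shape) >= rates
    # Inverted-dropout scaling keeps the expected activation unchanged.
    return activations * mask / (1.0 - rates)
```

With inverted-dropout scaling, `E[adaptive_dropout(x)] = x`, so no rescaling is needed at inference time; an actual RadDropout implementation would replace the importance proxy with the paper's closed-form rate distribution.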

Bibliographic information

  • Source
    Neurocomputing | 2019, Issue 10 | pp. 177-187 | 11 pages
  • Author affiliations

    Natl Univ Def Technol Coll Comp State Key Lab High Performance Comp HPCL Changsha Hunan Peoples R China;

    Natl Univ Def Technol Coll Liberal Arts & Sci Changsha Hunan Peoples R China;

  • Indexed in: Science Citation Index (SCI, USA); Engineering Index (EI, USA)
  • Original format: PDF
  • Language: English
  • Keywords

    Overfitting; Dropout; Rademacher complexity; Generalization gap; Deep neural network;

