Sparse Super-Regular Networks

Abstract

It has been argued by Thom and Palm that sparsely-connected neural networks (SCNs) show improved performance over fully-connected networks (FCNs). Super-regular networks (SRNs) are neural networks composed of a set of stacked sparse layers of (epsilon, delta)-super-regular pairs, with randomly permuted node order. Using the Blow-up Lemma, we prove that, as a result of the individual super-regularity of each pair of layers, SRNs guarantee a number of properties that make them suitable replacements for FCNs for many tasks. These guarantees include edge uniformity across all large-enough subsets, minimum node in- and out-degree, input-output sensitivity, and the ability to embed pre-trained constructs. Indeed, SRNs have the capacity to act like FCNs and eliminate the need for costly regularization schemes like Dropout. We show via readily reproducible experiments that SRNs perform similarly to X-Nets, while offering far greater guarantees and control over network structure.
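To make the construction described above concrete, the sketch below builds one sparse layer in that spirit: a random bipartite connectivity mask between two node sets, topped up so every node meets a minimum in- and out-degree (one of the guaranteed properties listed), with the output-node order randomly permuted. This is a hedged illustration, not the authors' implementation: the density p, the minimum degree, the layer sizes, and the helper names sparse_layer_mask and forward are illustrative choices, and no verification of (epsilon, delta)-super-regularity itself is performed.

```python
# Minimal sketch (not the paper's code) of a sparse layer with a random
# bipartite mask, a guaranteed minimum in-/out-degree, and a randomly
# permuted output-node order. Constants are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def sparse_layer_mask(n_in, n_out, p=0.1, min_degree=4):
    """Random 0/1 connectivity mask with every row and column degree >= min_degree."""
    mask = (rng.random((n_in, n_out)) < p).astype(np.float32)
    # Top up rows (out-degree of input nodes) that fall below the minimum.
    for i in range(n_in):
        deficit = min_degree - int(mask[i].sum())
        if deficit > 0:
            off = np.flatnonzero(mask[i] == 0)
            mask[i, rng.choice(off, deficit, replace=False)] = 1.0
    # Top up columns (in-degree of output nodes) the same way.
    for j in range(n_out):
        deficit = min_degree - int(mask[:, j].sum())
        if deficit > 0:
            off = np.flatnonzero(mask[:, j] == 0)
            mask[rng.choice(off, deficit, replace=False), j] = 1.0
    # Randomly permute the output-node order, as in the abstract.
    return mask[:, rng.permutation(n_out)]

def forward(x, weights, mask):
    """Forward pass through one masked (sparse) layer with ReLU."""
    return np.maximum(x @ (weights * mask), 0.0)

mask = sparse_layer_mask(256, 256)
weights = rng.normal(scale=0.05, size=mask.shape).astype(np.float32)
x = rng.normal(size=(8, 256)).astype(np.float32)
print(forward(x, weights, mask).shape)  # (8, 256)
```

Stacking several such masked layers yields a network whose connectivity is fixed and sparse, which is the structural role the abstract assigns to SRN layers.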
