Journal of Machine Learning Research

Nearly-tight VC-dimension and Pseudodimension Bounds for Piecewise Linear Neural Networks


Abstract

We prove new upper and lower bounds on the VC-dimension of deep neural networks with the ReLU activation function. These bounds are tight for almost the entire range of parameters. Letting $W$ be the number of weights and $L$ be the number of layers, we prove that the VC-dimension is $O(WL \log(W))$, and provide examples with VC-dimension $\Omega(WL \log(W/L))$. This improves both the previously known upper bounds and lower bounds. In terms of the number $U$ of non-linear units, we prove a tight bound $\Theta(WU)$ on the VC-dimension. All of these bounds generalize to arbitrary piecewise linear activation functions, and also hold for the pseudodimensions of these function classes. Combined with previous results, this gives an intriguing range of dependencies of the VC-dimension on depth for networks with different non-linearities: there is no dependence for piecewise-constant, linear dependence for piecewise-linear, and no more than quadratic dependence for general piecewise-polynomial.
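
For quick reference, the bounds above can be collected into a single display. Writing $\mathrm{VCdim}(W, L)$ for the largest VC-dimension attained by ReLU networks with $W$ weights and $L$ layers (notation introduced here for convenience, with $c, C > 0$ unspecified absolute constants), the abstract's claims read:

$$c \, WL \log(W/L) \;\le\; \mathrm{VCdim}(W, L) \;\le\; C \, WL \log(W),$$

and, in terms of the number $U$ of non-linear units, $\mathrm{VCdim} = \Theta(WU)$. The two sides of the display differ only in the gap between $\log(W/L)$ and $\log(W)$; for instance, whenever $L \le W^{1-\delta}$ for some fixed $\delta > 0$, we have $\log(W/L) \ge \delta \log(W)$, so the upper and lower bounds match up to a constant factor, which is why the bounds are tight for almost the entire range of parameters.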

