JMLR: Workshop and Conference Proceedings

Bounds on the Approximation Power of Feedforward Neural Networks


Abstract

The approximation power of general feedforward neural networks with piecewise linear activation functions is investigated. First, lower bounds on the size of a network are established in terms of the approximation error and network depth and width. These bounds improve upon state-of-the-art bounds for certain classes of functions, such as strongly convex functions. Second, an upper bound is established on the difference of two neural networks with identical weights but different activation functions.
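To make the second result concrete, below is a minimal numerical sketch, not taken from the paper: two fully connected feedforward networks share identical weights but use different piecewise linear activations (ReLU versus leaky ReLU, chosen here purely for illustration), and the empirical sup-norm gap between their outputs is measured on a grid. The paper establishes an analytical upper bound on this kind of discrepancy; the exact form of that bound is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def leaky_relu(x, slope=0.1):
    # Piecewise linear activation with a nonzero slope on the negative side.
    return np.where(x > 0.0, x, slope * x)

def forward(x, weights, biases, act):
    """Forward pass through a fully connected network with a linear output layer."""
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = act(h @ W + b)
    return h @ weights[-1] + biases[-1]

# Illustrative architecture: 1 input, two hidden layers of width 16, 1 output.
dims = [1, 16, 16, 1]
weights = [rng.standard_normal((m, n)) / np.sqrt(m) for m, n in zip(dims[:-1], dims[1:])]
biases = [0.1 * rng.standard_normal(n) for n in dims[1:]]

# Evaluate both networks (identical weights, different activations) on a grid
# and report the empirical sup-norm gap between their outputs.
x = np.linspace(-3.0, 3.0, 1001).reshape(-1, 1)
gap = np.abs(forward(x, weights, biases, relu) - forward(x, weights, biases, leaky_relu))
print("empirical sup-norm gap on [-3, 3]:", gap.max())
```

This only probes the gap at sample points for one random draw of weights; the result in the paper is a worst-case bound over the input domain, stated in terms of the network's weights and the two activation functions.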

