JMLR: Workshop and Conference Proceedings

Analytical Guarantees on Numerical Precision of Deep Neural Networks

Abstract

The acclaimed successes of neural networks often overshadow their tremendous complexity. We focus on numerical precision, a key parameter defining the complexity of neural networks. First, we present theoretical bounds on the accuracy in the presence of limited precision. Interestingly, these bounds can be computed via the back-propagation algorithm. Hence, by combining our theoretical analysis with the back-propagation algorithm, we can readily determine the minimum precision needed to preserve accuracy without resorting to time-consuming fixed-point simulations. We provide numerical evidence showing how our approach allows us to maintain high accuracy while achieving lower complexity than state-of-the-art binary networks.
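
The abstract's central point is that precision requirements can be read off from quantities already produced by back-propagation, rather than from exhaustive fixed-point simulation. The sketch below is only a rough illustration of that idea, not the paper's actual bound: it treats uniform quantization of each weight as noise of variance step^2/12, uses gradients from back-propagation as first-order sensitivities, and sweeps the weight word length until an estimated loss perturbation falls below an illustrative tolerance. The model, data, tolerance, and all variable names are placeholders chosen for the example.

```python
# Minimal sketch: estimate a weight precision requirement from back-propagated
# gradients, under a first-order quantization-noise model. Illustrative only.
import torch
import torch.nn as nn

torch.manual_seed(0)

model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 10))
x = torch.randn(64, 16)                # stand-in input batch
y = torch.randint(0, 10, (64,))        # stand-in labels

logits = model(x)
loss = nn.functional.cross_entropy(logits, y)
loss.backward()                        # back-propagation yields dLoss/dw

# First-order model: quantizing a weight with step size D adds noise of
# variance D^2 / 12, so the induced loss perturbation is approximated by
# sum_w (dLoss/dw)^2 * D^2 / 12.
grad_sq_sum = sum((p.grad ** 2).sum().item() for p in model.parameters())
w_range = max(p.detach().abs().max().item() for p in model.parameters())

tolerance = 1e-3                       # illustrative accuracy budget
for bits in range(2, 17):
    step = 2 * w_range / (2 ** bits)   # uniform quantizer step at this word length
    est_perturbation = grad_sq_sum * step ** 2 / 12
    if est_perturbation < tolerance:
        print(f"estimated minimum weight precision: {bits} bits")
        break
```

The paper's analysis additionally covers activation precision and gives formal accuracy guarantees; this snippet only shows why a single backward pass suffices to evaluate such sensitivity-based criteria across candidate word lengths.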
