
Review of second-order optimization techniques in artificial neural networks backpropagation

Curtin University Technology, Science and Engineering International Conference


Abstract

Second-order optimization techniques are an advance over first-order optimization in neural networks. They provide additional curvature information about the objective function, which is used to adaptively estimate the step length along the optimization trajectory during the training phase of a neural network. With this additional information, they reduce the number of training iterations and achieve fast convergence with less hyper-parameter tuning. Recent improvements in memory capacity and computing power further motivate machine learning practitioners to revisit the benefits of second-order optimization techniques. This paper reviews second-order optimization techniques that involve Hessian calculations for neural network training. It covers the basic theory of the Newton method, quasi-Newton methods, Gauss-Newton, Levenberg-Marquardt, Approximate Greatest Descent, and Hessian-free optimization. The paper summarizes the feasibility and performance of these optimization techniques in deep neural network training, and highlights comments and suggestions on their advantages and limitations for artificial neural network training.
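To illustrate the core idea, here is a minimal sketch of a damped Newton step in the Levenberg-Marquardt style, showing how curvature information sets the step length. The quadratic objective, the matrix A, the vector b, and the damping value lam are illustrative assumptions, not taken from the paper.

```python
import numpy as np

A = np.array([[3.0, 0.5],
              [0.5, 1.0]])   # positive-definite curvature (toy example)
b = np.array([1.0, -2.0])

def loss(w):
    # Toy quadratic objective: 0.5 * w^T A w - b^T w
    return 0.5 * w @ A @ w - b @ w

w = np.zeros(2)              # initial parameters
lam = 1e-2                   # damping: interpolates between Newton and gradient descent
for _ in range(20):
    grad = A @ w - b         # gradient of the quadratic
    hess = A                 # Hessian (constant for a quadratic)
    # Damped Newton step: solve (H + lam*I) dw = -grad
    dw = np.linalg.solve(hess + lam * np.eye(2), -grad)
    w = w + dw

print(w, loss(w))            # w approaches the minimizer A^{-1} b
```

With lam near zero this reduces to a pure Newton step, whose length is determined entirely by the curvature in hess rather than by a hand-tuned learning rate.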
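The Hessian-free approach mentioned in the abstract avoids forming the Hessian explicitly and instead works with Hessian-vector products. The sketch below, a toy assumption rather than the paper's method, approximates Hv with a finite difference of gradients; practical implementations often use Pearlmutter's R-operator for an exact product.

```python
import numpy as np

def grad(w):
    # Gradient of a toy objective f(w) = sum(w**4 + 0.5 * w**2)
    return 4.0 * w ** 3 + w

def hessian_vector_product(w, v, eps=1e-5):
    # Central finite difference of the gradient along direction v:
    # H v ~ (grad(w + eps*v) - grad(w - eps*v)) / (2*eps)
    return (grad(w + eps * v) - grad(w - eps * v)) / (2.0 * eps)

w = np.array([1.0, -0.5])
v = np.array([0.3, 0.7])
print(hessian_vector_product(w, v))   # approx (12*w**2 + 1) * v elementwise
print((12 * w ** 2 + 1) * v)          # analytic check for this toy objective
```

Because only products Hv are needed, the Newton system can be solved iteratively (for example with conjugate gradients) without ever storing the full Hessian, which is what makes the approach feasible for deep networks.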
