3rd European symposium on artificial neural networks

A new training algorithm for feedforward neural networks

Abstract

This paper introduces an approach for fast training of Feedforward Neural Networks (FNNs). The approach is based on linearizing the nonlinear output activation function by inverting it and then computing the weights using gradient-descent and QR decomposition techniques. The approach is called the Gradient-descent Orthogonalized Training (GOT) algorithm. GOT is applied to several benchmark problems and the results are compared with those obtained using the Error Back-Propagation (EBP) algorithm.
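
The abstract only outlines the idea, but the output-layer step it describes can be illustrated concretely: invert the output activation to obtain linearized targets, then solve the resulting linear least-squares problem for the output weights with a QR decomposition. The sketch below is an assumption-laden Python illustration, not the authors' GOT implementation: the logistic sigmoid, the function names, the shapes, and the random data are all assumptions, and the gradient-descent update of the hidden-layer weights is omitted.

```python
import numpy as np

def logit(y, eps=1e-6):
    """Inverse of the logistic sigmoid; linearizes targets in (0, 1)."""
    y = np.clip(y, eps, 1.0 - eps)
    return np.log(y / (1.0 - y))

def fit_output_weights_qr(H, T):
    """Least-squares output weights W with H @ W ~= logit(T), via QR.

    H : (n_samples, n_hidden) hidden-layer activations (bias column included)
    T : (n_samples, n_outputs) targets with entries strictly in (0, 1)
    """
    Z = logit(T)                        # linearized targets
    Q, R = np.linalg.qr(H)              # H = Q R with orthonormal Q
    return np.linalg.solve(R, Q.T @ Z)  # solve R W = Q^T Z

# Toy usage on random data (purely illustrative).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
V = rng.normal(size=(4, 5))                        # hypothetical input-to-hidden weights
H = 1.0 / (1.0 + np.exp(-np.hstack([X, np.ones((100, 1))]) @ V))
T = rng.uniform(0.1, 0.9, size=(100, 2))
W = fit_output_weights_qr(H, T)
Y = 1.0 / (1.0 + np.exp(-H @ W))                   # network outputs after the solve
```

Solving through Q and R rather than the normal equations avoids squaring the condition number of H, which is presumably the reason the abstract pairs the activation-inversion trick with a QR decomposition.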