Conference: Applications of Artificial Neural Networks

Large Scale Networks Via Self Organizing Hierarchical Networks



Abstract

The Cascade Correlation algorithm introduced a sound basis for self-scaling learning of monolithic neural networks, but it suffers from a number of drawbacks. These include degradation of learning speed and quality with the size of the network and the development of deep networks with high fan-in rates to hidden units. The Iterative Atrophy algorithm preserves the good features of Cascade Correlation and eliminates its worst characteristics. In addition, we show that Iterative Atrophy extends naturally to the development of self-scaling hierarchical networks which have advantages in both training and representational efficiency.
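The fan-in problem the abstract mentions follows directly from how Cascade Correlation grows a network: each newly installed hidden unit receives connections from all inputs and from every previously installed hidden unit. A minimal sketch (illustrative only, not the paper's implementation; the function name and parameters are assumptions) showing how fan-in scales with network growth:

```python
# Illustrative sketch: in Cascade Correlation (Fahlman & Lebiere),
# the k-th hidden unit added connects to all n_inputs network inputs
# and to all k-1 previously installed hidden units, so both the depth
# of the cascade and the fan-in per unit grow with every addition.

def cascade_fan_ins(n_inputs: int, n_hidden: int) -> list[int]:
    """Fan-in of each hidden unit, in order of installation."""
    return [n_inputs + k for k in range(n_hidden)]

fan_ins = cascade_fan_ins(n_inputs=10, n_hidden=5)
print(fan_ins)  # [10, 11, 12, 13, 14] -- fan-in grows linearly
```

This linear growth in fan-in (and the one-unit-per-layer depth it implies) is the drawback the Iterative Atrophy algorithm is said to eliminate while keeping the self-scaling behavior.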
