International Joint Conference on Neural Networks

New Insights on Learning Rules for Hopfield Networks: Memory and Objective Function Minimisation



Abstract

Hopfield neural networks are a possible basis for modelling associative memory in living organisms. After summarising previous studies in the field, we take a new look at learning rules, exhibiting them as descent-type algorithms for various cost functions. We also propose several new cost functions suitable for learning. We discuss the role of biases (the external inputs) in the learning process in Hopfield networks. Furthermore, we apply Newton's method for learning memories, and experimentally compare the performance of various learning rules. Finally, to add to the debate on whether allowing connections of a neuron to itself enhances memory capacity, we numerically investigate the effects of self-coupling.
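As background for the learning rules the abstract discusses, a minimal sketch of a Hopfield network with the classical Hebbian rule may help. This is an illustration only, not the paper's proposed method: it stores ±1 patterns via the outer-product rule, zeroes the self-connections (the diagonal the paper's self-coupling experiments would retain), and recalls a pattern from a corrupted probe by deterministic sequential updates.

```python
import numpy as np

def hebbian_weights(patterns):
    """Classical Hebbian rule: W = (1/N) * sum_mu xi^mu (xi^mu)^T,
    with the diagonal zeroed (no self-coupling in this sketch)."""
    P = np.asarray(patterns, dtype=float)   # shape (num_patterns, N)
    n = P.shape[1]
    W = P.T @ P / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, sweeps=10):
    """Sequential (asynchronous, fixed-order) updates until a fixed point
    or the sweep limit; each neuron takes the sign of its local field."""
    s = np.array(state, dtype=float)
    for _ in range(sweeps):
        prev = s.copy()
        for i in range(len(s)):
            s[i] = 1.0 if W[i] @ s >= 0 else -1.0
        if np.array_equal(s, prev):         # converged to a fixed point
            break
    return s

# Store one pattern and recover it from a one-bit-corrupted probe.
xi = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = hebbian_weights([xi])
probe = xi.copy()
probe[0] *= -1                              # flip one bit
print(np.array_equal(recall(W, probe), xi))  # → True
```

The descent view mentioned in the abstract corresponds to seeing such update rules as minimising an energy/cost function over the weights or states; the Hebbian rule above is only the simplest member of that family.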
