Applications of Artificial Neural Networks

Function approximation using a sinc neural network

Abstract

Neural networks for function approximation are the basis of many applications. Such networks often use a sigmoidal activation function (e.g. tanh) or a radial basis function (e.g. Gaussian); networks have also been developed using wavelets. In this paper, we present a neural network approximation of functions of a single variable, using sinc functions as the activation functions of the hidden units. Performance of the sinc network is compared with that of a tanh network with the same number of hidden units. The sinc network generally learns the desired input-output mapping in significantly fewer epochs, and achieves a much lower total error on the testing points. The original sinc network is based on theoretical results for function representation using the Whittaker cardinal function (an infinite series expansion in terms of sinc functions). Enhancements to the original network include an improved transformation of the problem domain onto the network input domain. Further work is in progress to study the use of sinc networks for mappings in higher dimensions.
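The Whittaker cardinal expansion underlying the network can be sketched in a few lines of NumPy. This is an illustrative truncation of the infinite series only, not the trained network from the paper; the function name `sinc_approx`, the step size `h`, the truncation order `n`, and the Gaussian test function are all assumptions made for the example:

```python
import numpy as np

def sinc_approx(f, h, n, x):
    """Truncated Whittaker cardinal (sinc) series:
    f(x) ~ sum over k = -n..n of f(k*h) * sinc((x - k*h) / h)."""
    nodes = np.arange(-n, n + 1) * h  # uniform sample points k*h
    # np.sinc is the normalized sinc: sin(pi*t)/(pi*t), with sinc(0) = 1
    return np.sinc((x[:, None] - nodes[None, :]) / h) @ f(nodes)

# Approximate a rapidly decaying target on [-2, 2]
target = lambda t: np.exp(-t**2)
x = np.linspace(-2.0, 2.0, 101)
err = np.max(np.abs(sinc_approx(target, h=0.25, n=40, x=x) - target(x)))
```

For a smooth, rapidly decaying target such as the Gaussian above, the truncated series is accurate to near machine precision at this sampling density; a sinc network replaces the fixed samples `f(k*h)` with trained hidden-unit weights.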
