Generalized Softmax Networks for Non-linear Component Extraction

Abstract
We develop a probabilistic interpretation of non-linear component extraction in neural networks that activate their hidden units according to a softmax-like mechanism. On the basis of a generative model that combines hidden causes using the max-function, we show how the extraction of input components in such networks can be interpreted as maximum-likelihood parameter optimization. A simple and neurally plausible Hebbian Δ-rule is derived. For approximately optimal learning, the activity of the hidden neural units is described by a generalized softmax function, and the classical softmax is recovered for very sparse input. We use the bars benchmark test to numerically verify our analytical results and to show the competitiveness of the derived learning algorithms.
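The abstract's pipeline can be made concrete with a small numerical sketch. The following is not the paper's exact algorithm but an illustrative assumption-laden toy: bars-test images are generated, each hidden unit's match to the input is passed through a softmax-like activation (a sharpness parameter `beta` stands in for the generalized softmax), and the weights follow a Hebbian Δ-rule that moves each active unit's weights toward the current input. All parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
D_side, H, eps, beta = 4, 8, 0.05, 8.0   # image side, hidden units, learning rate, softmax sharpness
D = D_side * D_side                      # input dimensionality (flattened image)

def bars_batch(n, p=2 / D_side):
    """Bars benchmark: each horizontal/vertical bar appears independently with prob p."""
    X = np.zeros((n, D_side, D_side))
    for img in X:
        for r in range(D_side):
            if rng.random() < p:
                img[r, :] = 1.0          # horizontal bar
        for c in range(D_side):
            if rng.random() < p:
                img[:, c] = 1.0          # vertical bar (max-combination: overlaps stay at 1)
    return X.reshape(n, D)

W = rng.uniform(0.1, 0.9, size=(H, D))   # hidden-to-input weights, random init

for y in bars_batch(2000):
    a = W @ y / (W.sum(axis=1) + 1e-9)   # normalized match of each hidden unit to the input
    s = np.exp(beta * a)                 # softmax-like hidden activation ...
    s /= s.sum()                         # ... (large beta approaches winner-take-all)
    W += eps * s[:, None] * (y[None, :] - W)   # Hebbian delta-rule: pull weights toward input

# After training, strongly responding units' weight vectors tend to resemble single bars.
```

Because the update is a convex step toward a binary input, the weights remain bounded in [0, 1] throughout learning; the sparser the input (few bars per image), the closer the activation comes to the classical softmax regime described in the abstract.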
