Evidence-based mixture of MLP-experts

Abstract

Mixture of Experts (ME) is a modular neural network architecture for supervised learning. In this paper, we propose an evidence-based ME to deal with the classification problem. In the basic form of ME, the problem space is automatically divided into several subspaces for the experts, and the outputs of the experts are combined by a gating network. Satisfactory performance of the basic ME depends on the diversity among the experts. In conventional ME, diversity is provided by different initialization of the experts and by the supervision of the gating network during the learning procedure. The main idea of our proposed method is to employ the Dempster-Shafer (D-S) theory of evidence to improve the determination of the learning parameters (which results in more diverse experts) and the way the experts' decisions are combined. Experimental results on several data sets from the UCI repository show that our proposed method yields better classification rates than the basic ME and the static combination of neural networks based on D-S theory.
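
The abstract describes two mechanisms that can be made concrete: a gating network that softly mixes expert outputs, and Dempster's rule of combination for fusing experts' class evidence. The paper gives no code here, so the sketch below is a minimal illustration in Python/NumPy under stated assumptions (single-linear-layer "experts" standing in for full MLPs, and a simple discounting scheme for turning a softmax output into a mass function), not the authors' implementation.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def me_output(x, expert_Ws, gate_W):
    """Basic ME forward pass: the gate softly assigns the input to
    subspaces, and the final output is the gate-weighted sum of the
    expert outputs. Experts are reduced to single linear layers here;
    the paper uses MLP experts."""
    expert_out = np.stack([softmax(x @ W) for W in expert_Ws])  # (E, C)
    gate = softmax(x @ gate_W)                                  # (E,)
    return gate @ expert_out                                    # (C,)

def to_mass(p, alpha=0.9):
    """Turn class probabilities into a D-S mass function by simple
    discounting (an assumption; the abstract does not spell out the
    basic probability assignment): mass alpha*p_i on each singleton
    class, and the remaining 1-alpha on the whole frame Theta."""
    return np.append(alpha * p, 1.0 - alpha)

def dempster_combine(m1, m2):
    """Dempster's rule for two mass functions whose focal elements are
    the singleton classes plus Theta (stored at index -1)."""
    C = len(m1) - 1
    s1, t1 = m1[:C], m1[C]
    s2, t2 = m2[:C], m2[C]
    conflict = s1.sum() * s2.sum() - (s1 * s2).sum()  # mass on disjoint classes
    out = np.empty(C + 1)
    out[:C] = s1 * s2 + s1 * t2 + t1 * s2  # evidence agreeing on class i
    out[C] = t1 * t2                       # residual ignorance
    return out / (1.0 - conflict)          # renormalize away the conflict

# Toy usage: fuse two experts' evidence about a 3-class input.
rng = np.random.default_rng(0)
x = rng.normal(size=4)
expert_Ws = [rng.normal(size=(4, 3)) for _ in range(2)]
gate_W = rng.normal(size=(4, 2))

print("gated ME output:", me_output(x, expert_Ws, gate_W))

m = dempster_combine(to_mass(softmax(x @ expert_Ws[0])),
                     to_mass(softmax(x @ expert_Ws[1])))
print("D-S combined masses:", m, "-> class", int(np.argmax(m[:-1])))
```

Under this encoding, fusing experts with Dempster's rule down-weights conflicting votes rather than simply averaging them, which illustrates the combination-side improvement the abstract refers to.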
