
A Method to Boost Naive Bayesian Classifiers


Abstract

In this paper, we introduce a new method to improve the performance of combining boosting and naive Bayesian learning. Rather than combining boosting and naive Bayesian learning directly, which has been shown to be unsatisfactory for improving performance, we select the training samples dynamically by a bootstrap method when constructing the naive Bayesian classifiers, and thereby generate very different, unstable base classifiers for boosting. In addition, we modify the weight-adjusting step of the boosting algorithm with the goal of minimizing the overlapping errors of its constituent classifiers. We conducted a series of experiments showing that the new method not only performs much better than naive Bayesian classifiers or directly boosted naive Bayesian classifiers, but also reaches optimal performance much more quickly than boosting stumps or boosting decision trees incorporated with naive Bayesian learning.
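The sketch below illustrates the general idea described in the abstract: naive Bayesian base classifiers are trained on bootstrap samples drawn according to the current boosting weights (dynamic sample selection), inside an otherwise standard AdaBoost-style loop. It is a minimal, assumption-laden illustration, not the authors' exact algorithm; in particular, the class name, parameters, and the plain AdaBoost weight update stand in for the paper's modified weight adjustment that minimizes overlapping errors.

```python
# Minimal sketch (assumptions noted above): boosting naive Bayes with
# weighted bootstrap resampling instead of passing sample weights directly,
# so the base classifiers become more diverse ("unstable").
import numpy as np
from sklearn.naive_bayes import GaussianNB


class BootstrapBoostedNB:
    def __init__(self, n_rounds=25, random_state=0):
        self.n_rounds = n_rounds
        self.rng = np.random.default_rng(random_state)
        self.models, self.alphas = [], []

    def fit(self, X, y):
        n = len(y)
        w = np.full(n, 1.0 / n)           # boosting weights over training samples
        for _ in range(self.n_rounds):
            # Dynamic sample selection: bootstrap the training set according to w
            idx = self.rng.choice(n, size=n, replace=True, p=w)
            clf = GaussianNB().fit(X[idx], y[idx])
            pred = clf.predict(X)
            err = np.sum(w[pred != y])    # weighted error on the full training set
            if err >= 0.5:                # no better than chance: stop boosting
                break
            err = max(err, 1e-10)
            alpha = 0.5 * np.log((1.0 - err) / err)
            # Standard AdaBoost re-weighting: emphasise misclassified samples
            w *= np.exp(alpha * (pred != y) - alpha * (pred == y))
            w /= w.sum()
            self.models.append(clf)
            self.alphas.append(alpha)
        return self

    def predict(self, X):
        # Weighted vote of the base classifiers (binary 0/1 labels assumed)
        votes = sum(a * (2 * m.predict(X) - 1)
                    for m, a in zip(self.models, self.alphas))
        return (np.asarray(votes) > 0).astype(int)
```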
