Venue: Annual Conference on Neural Information Processing Systems

Progressive mixture rules are deviation suboptimal



Abstract

We consider the learning task of predicting as well as the best function in a finite reference set G, up to the smallest possible additive term. Let R(g) denote the generalization error of a prediction function g. Under reasonable assumptions on the loss function (typically satisfied by the least-squares loss when the output is bounded), it is known that the progressive mixture rule ĝ satisfies

E R(ĝ) ≤ min_{g ∈ G} R(g) + Cst (log |G|)/n,   (1)

where n denotes the size of the training set and E denotes the expectation with respect to the training set distribution. This work shows that, surprisingly, for appropriate reference sets G, the deviation convergence rate of the progressive mixture rule is no better than Cst/√n: it fails to achieve the expected Cst/n rate. We also provide an algorithm which does not suffer from this drawback and which is optimal in both deviation and expectation convergence rates.
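The gap established by the paper can be summarized in the abstract's notation. The following is a sketch only: ĝ denotes the progressive mixture output, Cst and c are unspecified positive constants, and the exact probability quantifier in the second display is stated precisely in the paper itself, not here:

```latex
% Known expectation bound (1) for the progressive mixture rule:
\mathbb{E}\, R(\hat g) \;\le\; \min_{g \in \mathcal{G}} R(g) \;+\; \mathrm{Cst}\,\frac{\log |\mathcal{G}|}{n}

% Deviation suboptimality shown in this work: for appropriate
% reference sets \mathcal{G}, with probability bounded away from zero,
R(\hat g) \;\ge\; \min_{g \in \mathcal{G}} R(g) \;+\; \frac{c}{\sqrt{n}}
```

The point of the contrast is that the fast log|G|/n rate survives only in expectation; the high-probability (deviation) behavior degrades to the slow 1/√n rate.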
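For readers unfamiliar with the rule under study, here is a minimal sketch of the standard progressive mixture construction for squared loss: a Cesàro average of exponentially weighted (Gibbs) mixtures computed on growing prefixes of the sample. The function name, the learning rate `eta`, and its value are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def progressive_mixture_predict(X, y, funcs, x_new, eta=1.0):
    """Progressive mixture prediction at the point x_new.

    funcs is the finite reference set G of candidate prediction
    functions. At each step i = 0..n, form the exponentially weighted
    (Gibbs) mixture based on cumulative squared losses on the first i
    examples, then return the Cesaro average of the n+1 mixtures.
    """
    n = len(y)
    cum_loss = np.zeros(len(funcs))            # cumulative squared losses
    preds = np.array([g(x_new) for g in funcs])  # candidate values at x_new
    total = 0.0
    for i in range(n + 1):
        # Gibbs weights, shifted by the min loss for numerical stability
        w = np.exp(-eta * (cum_loss - cum_loss.min()))
        w /= w.sum()
        total += w @ preds                     # mixture prediction at step i
        if i < n:                              # update losses with example i
            cum_loss += (np.array([g(X[i]) for g in funcs]) - y[i]) ** 2
    return total / (n + 1)
```

For instance, with two constant candidates g(x) = 0 and g(x) = 1 and targets all equal to 1, the weights progressively concentrate on the second candidate, so the prediction lies strictly between the uniform mixture 0.5 and 1.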
