Decremental multi-output least square SVR learning

Abstract

The solution of a multi-output LS-SVR machine follows from solving a set of linear equations. Compared with ε-insensitive SVR, it therefore loses the advantage of a sparse decomposition. To limit the number of support vectors and reduce the computational cost, this paper presents a decremental recursive algorithm for multi-output LS-SVR machines. The algorithm removes one sample at a time, and the large-scale matrix inverse is computed quickly from the previous result. The decremental algorithm can be used to train online multi-output LS-SVR machines. Experimental results demonstrate the effectiveness of the algorithm.
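The fast inverse update described in the abstract is commonly realized with the block-matrix (Schur complement) identity: if the inverse of the full system matrix is already known, the inverse after deleting one sample's row and column follows in O(n²) instead of O(n³). The sketch below illustrates that identity with NumPy; the RBF kernel, the regularisation parameter `gamma`, and the function names are illustrative assumptions rather than details taken from the paper, and the bias term of the LS-SVR linear system is omitted for brevity.

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    # Gram matrix K_ij = exp(-||x_i - x_j||^2 / (2 sigma^2))
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

def downdate_inverse(B, k):
    """Given B = inv(A), return the inverse of A with row and column k removed,
    using the previous inverse only (O(n^2) block-inverse downdate)."""
    idx = np.delete(np.arange(B.shape[0]), k)
    return B[np.ix_(idx, idx)] - np.outer(B[idx, k], B[k, idx]) / B[k, k]

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))        # 200 samples, 5 features
Y = rng.normal(size=(200, 3))        # 3 outputs (multi-output targets)
gamma = 10.0                         # regularisation parameter (assumed value)

A = rbf_kernel(X) + np.eye(len(X)) / gamma   # LS-SVR system matrix (bias omitted)
B = np.linalg.inv(A)                         # inverse kept up to date
alpha = B @ Y                                # dual coefficients, one column per output

# Decremental step: remove sample k and refresh the inverse from the previous result.
k = 17
B_new = downdate_inverse(B, k)
idx = np.delete(np.arange(len(X)), k)
alpha_new = B_new @ Y[idx]                   # updated coefficients after removal

# Sanity check against recomputing the inverse from scratch.
assert np.allclose(B_new, np.linalg.inv(A[np.ix_(idx, idx)]))
```

Applying the downdate repeatedly, one sample per step, is what makes the decremental scheme suitable for online training: each removal costs a rank-one correction of the stored inverse instead of a full re-factorisation.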
