Journal: Neurocomputing

A regularized estimation framework for online sparse LSSVR models



Abstract

Aiming at machine learning applications that require fast online learning, we develop a variant of the Least Squares SVR (LSSVR) model that can learn incrementally from data and eventually provide a sparse solution vector. This is achieved by incorporating into the LSSVR model the sparsification mechanism used by the kernel RLS (KRLS) model introduced by Engel et al. (2004). The performance of the resulting model, henceforth referred to as the online sparse LSSVR (OS-LSSVR) model, is comprehensively evaluated by computer experiments on several benchmark datasets (including a large-scale one) covering a number of challenging tasks in nonlinear time series prediction and system identification. Convergence, efficiency, and error bounds of the OS-LSSVR model are also addressed. The results indicate that the proposed approach consistently outperforms state-of-the-art kernel adaptive filtering algorithms, providing sparser solutions with smaller prediction errors and smaller solution-vector norms. (C) 2017 Elsevier B.V. All rights reserved.
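The sparsification mechanism the abstract borrows from KRLS is the approximate linear dependence (ALD) test of Engel et al. (2004): a new sample joins the support dictionary only if its feature-space image cannot be reconstructed, within a tolerance ν, as a linear combination of the current dictionary vectors. A minimal sketch in Python/NumPy, assuming an RBF kernel; the class and parameter names (`ALDDictionary`, `nu`, `gamma`) are illustrative and not from the paper:

```python
import numpy as np

def rbf_kernel(x, y, gamma=1.0):
    """Gaussian RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    return np.exp(-gamma * np.sum((x - y) ** 2))

class ALDDictionary:
    """ALD sparsification test (Engel et al., 2004), as used by KRLS
    and adopted by the OS-LSSVR model to keep the solution sparse."""

    def __init__(self, nu=0.1, gamma=1.0):
        self.nu = nu          # ALD tolerance: larger nu -> sparser dictionary
        self.gamma = gamma
        self.dict = []        # dictionary (support) vectors
        self.K_inv = None     # inverse of the dictionary kernel matrix

    def admit(self, x):
        """Return True and add x to the dictionary iff x fails the ALD test."""
        x = np.asarray(x, dtype=float)
        if not self.dict:
            self.dict.append(x)
            self.K_inv = np.array([[1.0 / rbf_kernel(x, x, self.gamma)]])
            return True
        k_vec = np.array([rbf_kernel(d, x, self.gamma) for d in self.dict])
        a = self.K_inv @ k_vec                    # best linear reconstruction
        delta = rbf_kernel(x, x, self.gamma) - k_vec @ a   # ALD residual
        if delta > self.nu:
            # x is (approximately) linearly independent: grow the dictionary
            # and update K_inv via the block-matrix inversion formula.
            n = len(self.dict)
            K_new = np.empty((n + 1, n + 1))
            K_new[:n, :n] = self.K_inv * delta + np.outer(a, a)
            K_new[:n, n] = -a
            K_new[n, :n] = -a
            K_new[n, n] = 1.0
            self.K_inv = K_new / delta
            self.dict.append(x)
            return True
        return False          # x is well approximated; discard it
```

Keeping `K_inv` updated incrementally is what makes the test cheap enough for online learning: each arriving sample costs O(m²) in the dictionary size m, rather than a fresh O(m³) inversion.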


