International Conference on Machine Learning

Adaptive Feature Selection: Computationally Efficient Online Sparse Linear Regression under RIP



Abstract

Online sparse linear regression is an online problem where an algorithm repeatedly chooses a subset of coordinates to observe in an adversarially chosen feature vector, makes a real-valued prediction, receives the true label, and incurs the squared loss. The goal is to design an online learning algorithm with sublinear regret relative to the best sparse linear predictor in hindsight. Without any assumptions, this problem is known to be computationally intractable. In this paper, we make the assumption that the data matrix satisfies the restricted isometry property (RIP), and show that this assumption leads to computationally efficient algorithms with sublinear regret for two variants of the problem. In the first variant, the true label is generated according to a sparse linear model with additive Gaussian noise. In the second, the true label is chosen adversarially.
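To make the interaction model concrete, the following is a minimal Python sketch of the protocol for the first (stochastic) variant described in the abstract: in each round the learner names at most k coordinates, observes only those entries of an otherwise hidden feature vector, makes a real-valued prediction, and then receives the true label and pays the squared loss. The `FixedSubsetLearner`, `run_protocol`, and all parameter names are hypothetical illustrations of the setting, not the algorithms from the paper.

```python
import numpy as np

class FixedSubsetLearner:
    """Toy baseline: always observes the first k coordinates and keeps a
    running least-squares estimate on them (illustrative only, not the
    paper's algorithm)."""
    def __init__(self, d, k):
        self.k = k
        self.A = np.zeros((k, k))   # accumulated outer products of observed entries
        self.b = np.zeros(k)        # accumulated label-weighted observed entries
        self.w = np.zeros(k)

    def select_coordinates(self, t):
        return list(range(self.k))                 # indices to observe, |S_t| <= k

    def predict(self, observed):
        x = np.array([observed[i] for i in sorted(observed)])
        return float(self.w @ x)

    def update(self, observed, y_hat, y):
        x = np.array([observed[i] for i in sorted(observed)])
        self.A += np.outer(x, x)
        self.b += y * x
        self.w = np.linalg.lstsq(self.A + 1e-6 * np.eye(self.k), self.b, rcond=None)[0]

def run_protocol(rounds, d, k, learner, rng):
    """First variant: labels come from a k-sparse linear model plus Gaussian
    noise; the learner observes at most k coordinates per round and incurs
    the squared loss on its prediction."""
    w_star = np.zeros(d)
    w_star[:k] = rng.standard_normal(k)            # hidden k-sparse parameter
    total_loss = 0.0
    for t in range(rounds):
        x_t = rng.standard_normal(d)               # feature vector for round t
        y_t = w_star @ x_t + 0.1 * rng.standard_normal()
        S_t = learner.select_coordinates(t)        # coordinates the learner asks to see
        observed = {i: x_t[i] for i in S_t}        # only these entries are revealed
        y_hat = learner.predict(observed)          # real-valued prediction
        total_loss += (y_hat - y_t) ** 2           # squared loss for this round
        learner.update(observed, y_hat, y_t)       # feedback: the true label
    return total_loss

rng = np.random.default_rng(0)
print(run_protocol(rounds=200, d=20, k=3, learner=FixedSubsetLearner(20, 3), rng=rng))
```

In this toy environment the relevant features happen to be the first k coordinates, so the fixed-subset learner sees everything it needs; the difficulty addressed by the paper is choosing which coordinates to observe when the support is unknown and the features are chosen adversarially.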
