Source: JMLR: Workshop and Conference Proceedings

Adaptive Feature Selection: Computationally Efficient Online Sparse Linear Regression under RIP


Abstract

Online sparse linear regression is an online problem where an algorithm repeatedly chooses a subset of coordinates to observe in an adversarially chosen feature vector, makes a real-valued prediction, receives the true label, and incurs the squared loss. The goal is to design an online learning algorithm with sublinear regret with respect to the best sparse linear predictor in hindsight. Without any assumptions, this problem is known to be computationally intractable. In this paper, we make the assumption that the data matrix satisfies the restricted isometry property (RIP), and show that this assumption leads to computationally efficient algorithms with sublinear regret for two variants of the problem. In the first variant, the true label is generated according to a sparse linear model with additive Gaussian noise. In the second, the true label is chosen adversarially.
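
For context, a standard way to formalize the protocol and the assumption described in the abstract is sketched below in LaTeX; the notation ($T$ rounds, sparsity level $k$, RIP constant $\delta$, the $1/T$ normalization) is a common convention and is not taken verbatim from the paper.

% Online protocol: at each round t = 1, ..., T the adversary fixes (x_t, y_t);
% the learner queries a small subset of the coordinates of x_t, predicts \hat{y}_t,
% then observes y_t and suffers the squared loss (\hat{y}_t - y_t)^2.
% Regret is measured against the best k-sparse linear predictor in hindsight:
\[
  \mathrm{Regret}_T
    = \sum_{t=1}^{T} (\hat{y}_t - y_t)^2
      \;-\; \min_{\|w\|_0 \le k} \sum_{t=1}^{T} \bigl(\langle w, x_t \rangle - y_t\bigr)^2 .
\]
% Restricted isometry property (one common normalization): the matrix X whose rows
% are x_1^\top, ..., x_T^\top satisfies RIP at sparsity level k with constant \delta in (0,1) if
\[
  (1 - \delta)\,\|w\|_2^2
    \;\le\; \tfrac{1}{T}\,\|X w\|_2^2
    \;\le\; (1 + \delta)\,\|w\|_2^2
  \qquad \text{for every } w \text{ with } \|w\|_0 \le k .
\]

Sublinear regret means $\mathrm{Regret}_T = o(T)$, i.e. the average excess loss per round vanishes as $T \to \infty$.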
