Journal of nonparametric statistics

Model selection consistency of U-statistics with convex loss and weighted lasso penalty

Abstract

In this paper we consider the minimisation of U-statistics with a weighted Lasso penalty and investigate the asymptotic properties of the resulting estimators in model selection and estimation. We prove that using appropriate weights in the penalty leads to a procedure that behaves like the oracle that knows the true model in advance, i.e. it is model selection consistent and estimates the nonzero parameters at the standard rate. For the unweighted Lasso penalty, we obtain necessary and sufficient conditions for the model selection consistency of the estimators. The results rely strongly on the convexity of the loss function, which is the main assumption of the paper. Our theorems can be applied to the ranking problem as well as to generalised regression models. Thus, using U-statistics we can study more complex models (which describe real problems better) than the usually investigated linear or generalised linear models.
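As a rough illustration only (the notation below is ours, not taken from the paper), the type of criterion studied, for a convex pairwise (order-2) loss \ell and i.i.d. observations Z_1, \dots, Z_n, can be sketched as

\[
\hat{\theta} \;=\; \operatorname*{arg\,min}_{\theta \in \mathbb{R}^d} \left\{ \binom{n}{2}^{-1} \sum_{1 \le i < j \le n} \ell(\theta; Z_i, Z_j) \;+\; \sum_{k=1}^{d} \lambda_k \, |\theta_k| \right\},
\]

where the first term is a U-statistic of order 2 and \lambda_1, \dots, \lambda_d \ge 0 are coordinate-wise penalty weights; the unweighted Lasso corresponds to \lambda_1 = \dots = \lambda_d = \lambda. In the ranking setting, \ell(\theta; Z_i, Z_j) would be a convex surrogate for a pairwise misranking loss.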