Annual Conference on Neural Information Processing Systems

Multivariate Dyadic Regression Trees for Sparse Learning Problems



Abstract

We propose a new nonparametric learning method based on multivariate dyadic regression trees (MDRTs). Unlike traditional dyadic decision trees (DDTs) or classification and regression trees (CARTs), MDRTs are constructed using penalized empirical risk minimization with a novel sparsity-inducing penalty. Theoretically, we show that MDRTs can simultaneously adapt to the unknown sparsity and smoothness of the true regression functions, and achieve nearly optimal rates of convergence (in a minimax sense) for the class of (α, C)-smooth functions. Empirically, MDRTs can simultaneously conduct function estimation and variable selection in high dimensions. To make MDRTs applicable to large-scale learning problems, we propose a greedy heuristic. The superior performance of MDRTs is demonstrated on both synthetic and real datasets.
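The abstract describes the construction only at a high level: dyadic (midpoint) splits chosen greedily under a penalized empirical-risk criterion. The Python sketch below is a minimal illustration of that general idea under stated assumptions, not the paper's MDRT algorithm; the function names `fit_dyadic_tree` and `predict_one`, the stopping rule, and the scalar penalty `lam` are hypothetical, and the paper's sparsity-inducing penalty (which depends on how many distinct variables the tree splits on) is replaced here by a simple per-split improvement threshold.

```python
import numpy as np

def fit_dyadic_tree(X, y, cell=None, depth=0, max_depth=6, lam=1e-3):
    """Greedy dyadic regression tree sketch (illustrative, not the paper's MDRT).

    Each node owns an axis-aligned cell; a dyadic split cuts the cell at the
    midpoint of one coordinate. A split is kept only if it reduces the mean
    squared error by more than `lam`, a crude stand-in for the paper's
    sparsity-inducing penalty. Features are assumed rescaled to [0, 1].
    """
    n, d = X.shape
    if cell is None:
        cell = [(0.0, 1.0)] * d
    pred = float(y.mean()) if n > 0 else 0.0
    node = {"cell": cell, "pred": pred, "split": None}
    if depth >= max_depth or n < 2:
        return node

    base_err = float(np.mean((y - pred) ** 2))
    best = None
    for j in range(d):
        lo, hi = cell[j]
        mid = 0.5 * (lo + hi)                    # dyadic split point: cell midpoint
        left = X[:, j] <= mid
        if left.all() or not left.any():
            continue                             # split does not separate the data
        err = (left.sum() * y[left].var() + (~left).sum() * y[~left].var()) / n
        if base_err - err > lam and (best is None or err < best[1]):
            best = (j, err, mid, left)

    if best is None:                             # no split beats the penalty: stay a leaf
        return node
    j, _, mid, left = best
    cl, cr = list(cell), list(cell)
    cl[j], cr[j] = (cell[j][0], mid), (mid, cell[j][1])
    node["split"] = (j, mid)
    node["left"] = fit_dyadic_tree(X[left], y[left], cl, depth + 1, max_depth, lam)
    node["right"] = fit_dyadic_tree(X[~left], y[~left], cr, depth + 1, max_depth, lam)
    return node

def predict_one(node, x):
    """Route a single point to its leaf and return the leaf mean."""
    while node["split"] is not None:
        j, mid = node["split"]
        node = node["left"] if x[j] <= mid else node["right"]
    return node["pred"]
```

With features scaled to [0, 1], calling predict_one(tree, x) evaluates the resulting piecewise-constant estimate; the paper's actual penalty additionally rewards trees that split on few distinct variables, which is what drives the variable selection behaviour mentioned in the abstract.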
