The 2011 International Joint Conference on Neural Networks

Group lasso regularized multiple kernel learning for heterogeneous feature selection


Abstract

We propose a novel multiple kernel learning (MKL) algorithm with a group lasso regularizer, called group lasso regularized MKL (GL-MKL), for heterogeneous feature selection. We extend the existing MKL algorithm and impose a mixed ℓ1 and ℓ2 norm constraint (known as group lasso) as the regularizer. Our GL-MKL determines the optimal base kernels, including the associated weights and kernel parameters, and results in a compact set of features for comparable or improved recognition performance. Using GL-MKL also avoids the problem of choosing a proper technique to normalize feature attributes collected from heterogeneous domains, which have different properties and distribution ranges. Unlike prior sequential-based feature selection methods, our approach does not exhaustively search the entire feature space, nor does it require prior knowledge of the optimal size of the feature subset. Comparisons with existing MKL and sequential-based feature selection methods on a variety of datasets confirm the effectiveness of our method in selecting a compact feature subset for comparable or improved classification performance.
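The mixed-norm regularizer described above is straightforward to make concrete. The following is a minimal NumPy sketch, not the authors' implementation: the function names, the per-group RBF base kernels, and the candidate bandwidths are illustrative assumptions. It shows a weighted sum of base kernels (one group of candidate kernels per heterogeneous feature group) and the group lasso penalty, i.e. the ℓ2 norm within each group's kernel weights summed (ℓ1) across groups.

```python
import numpy as np

def rbf_kernel(X, gamma):
    # RBF Gram matrix over the rows of X (one feature group's columns).
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-gamma * np.maximum(d2, 0.0))

def base_kernels(X, feature_groups, gammas):
    # One group of candidate base kernels per heterogeneous feature group,
    # so each domain is handled by its own kernels rather than by
    # normalizing raw attributes across domains.
    return [[rbf_kernel(X[:, cols], g) for g in gammas]
            for cols in feature_groups]

def combined_kernel(kernel_groups, weights):
    # K = sum over groups m and candidates t of w[m][t] * K[m][t].
    n = kernel_groups[0][0].shape[0]
    K = np.zeros((n, n))
    for Ks, ws in zip(kernel_groups, weights):
        for K_mt, w in zip(Ks, ws):
            K += w * K_mt
    return K

def group_lasso(weights):
    # Mixed l1/l2 norm: l2 within each group's kernel weights,
    # l1 across groups, so entire groups (features) can be zeroed out.
    return sum(np.linalg.norm(ws) for ws in weights)

# Toy usage: three 2-column feature groups, two candidate bandwidths each.
X = np.random.randn(40, 6)
groups = [[0, 1], [2, 3], [4, 5]]
gammas = [0.1, 1.0]
Ks = base_kernels(X, groups, gammas)
w = [np.ones(len(gammas)) / (len(groups) * len(gammas)) for _ in groups]
K = combined_kernel(Ks, w)
print(K.shape, group_lasso(w))
```

Because the penalty is ℓ1 across groups, minimizing it jointly with a kernel-machine objective drives whole groups of kernel weights to exactly zero, which is what discards the corresponding features and yields the compact subset.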
