Pattern Recognition Letters

Robust feature selection via l_(2,1)-norm in finite mixture of regression



Abstract

Finite mixture of Gaussian regression (FMR) is a widely used modeling technique for supervised learning problems. When the number of features is large, feature selection is desirable to enhance model interpretability and to avoid overfitting. In this paper, we propose a robust feature selection method via l(2,1)-norm penalized maximum likelihood estimation (MLE) in FMR, with an extension to a sparse l(2,1) penalty that combines the l(1)-norm with the l(2,1)-norm for greater flexibility. To solve the non-convex and non-smooth problem of (sparse) penalized MLE in FMR, we develop a new EM-based algorithm for numerical optimization, combining block coordinate descent with a majorization-minimization scheme in the M-step. We finally apply our method to six simulations and one real dataset to demonstrate its superior performance. (c) 2018 Elsevier B.V. All rights reserved.
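As a rough illustration of the penalties named in the abstract (the function names, matrix shape, and weighting scheme here are our own assumptions, not taken from the paper), the l(2,1)-norm of a p x K coefficient matrix B sums the Euclidean norms of its rows, so shrinking it drives entire feature rows to zero across all mixture components; the sparse variant adds an elementwise l(1) term:

```python
import numpy as np

def l21_norm(B):
    # l(2,1)-norm: sum over rows (features) of the row-wise l2 norm.
    # Penalizing this encourages whole rows of B to be exactly zero,
    # i.e. a feature is dropped from every mixture component at once.
    return np.sum(np.linalg.norm(B, axis=1))

def sparse_l21_penalty(B, lam1, lam2):
    # Sparse l(2,1) penalty: an elementwise l1 term combined with the
    # group-level l(2,1) term, allowing within-row sparsity as well.
    # lam1/lam2 are hypothetical tuning weights for this sketch.
    return lam1 * np.sum(np.abs(B)) + lam2 * l21_norm(B)
```

For example, for B = [[3, 4], [0, 0]] the l(2,1)-norm is 5 (the second feature's row contributes nothing), which is the behavior that makes the penalty a feature selector.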
