Published in: Computational Intelligence and Neuroscience

Efficient Multiple Kernel Learning Algorithms Using Low-Rank Representation


Abstract

Unlike the Support Vector Machine (SVM), Multiple Kernel Learning (MKL) allows a dataset to select useful kernels according to its distribution characteristics rather than committing to a single fixed kernel. It has been shown in the literature that MKL attains higher recognition accuracy than SVM, but at the expense of time-consuming computation, which creates analytical and computational difficulties in solving MKL problems. To overcome this issue, we first develop a novel kernel approximation approach for MKL and then propose an efficient Low-Rank MKL (LR-MKL) algorithm based on the Low-Rank Representation (LRR). It is well acknowledged that LRR can reduce dimensionality while retaining the data's features under a global low-rank constraint. Furthermore, we extend the binary-class MKL to multiclass MKL via a pairwise strategy. Finally, the recognition accuracy and efficiency of LR-MKL are verified on the Yale, ORL, LSVT, and Digit datasets. Experimental results show that the proposed LR-MKL algorithm is an efficient kernel-weight allocation method for MKL and substantially boosts MKL's performance.
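The abstract describes two ingredients: allocating weights across several candidate kernels, and compressing the combined kernel with a low-rank approximation. The sketch below illustrates these ideas only in spirit; the centered kernel-target alignment heuristic for the weights and the truncated eigendecomposition for the rank-r approximation are illustrative stand-ins, not the paper's LR-MKL algorithm:

```python
import numpy as np

def rbf_kernel(X, gamma=0.5):
    # Gaussian (RBF) kernel matrix on the rows of X
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def linear_kernel(X):
    return X @ X.T

def alignment(K, y):
    # Centered kernel-target alignment: one simple heuristic
    # (an assumption here) for scoring how well K matches labels y.
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    Kc = H @ K @ H
    T = np.outer(y, y)
    return np.sum(Kc * T) / (np.linalg.norm(Kc) * np.linalg.norm(T))

def low_rank_approx(K, r):
    # Rank-r approximation of a PSD kernel via its top-r eigenpairs,
    # standing in for the LRR-based compression used in the paper.
    w, V = np.linalg.eigh(K)
    idx = np.argsort(w)[::-1][:r]
    return (V[:, idx] * w[idx]) @ V[:, idx].T

# Tiny two-class toy data (hypothetical, for illustration only)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (10, 2)), rng.normal(3, 1, (10, 2))])
y = np.array([-1] * 10 + [1] * 10)

# Score each candidate kernel, normalize scores into convex weights,
# combine the kernels, then compress the result to rank 5.
kernels = [rbf_kernel(X), linear_kernel(X)]
weights = np.array([max(alignment(K, y), 0.0) for K in kernels])
weights /= weights.sum()
K_combined = sum(w * K for w, K in zip(weights, kernels))
K_lr = low_rank_approx(K_combined, r=5)
```

The combined kernel `K_combined` could then be fed to any kernel classifier; for the multiclass case the abstract's pairwise strategy would train one such binary machine per class pair and vote over their outputs.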
