
Localized twin SVM via convex minimization



Abstract

Multisurface proximal support vector machine via generalized eigenvalues (GEPSVM), an effective classification tool for supervised learning, seeks two nonparallel planes that are determined by solving two generalized eigenvalue problems (GEPs). The GEPs may lead to unstable classification performance due to matrix singularity. Proximal support vector machine using local information (LIPSVM), a variant of GEPSVM, attempts to avoid this shortcoming by adopting a formulation similar to the Maximum Margin Criterion (MMC). The solution to an LIPSVM follows directly from solving two standard eigenvalue problems. In effect, an LIPSVM can be viewed as a reduced algorithm, because it uses selectively generated points to train the classifier. A major advantage of an LIPSVM is that it is resistant to outliers. In this paper, following the geometric intuition of an LIPSVM, a novel multiplane learning approach called Localized Twin SVM via Convex Minimization (LCTSVM) is proposed. This approach determines two nonparallel planes by solving two newly formed SVM-type problems. In addition to keeping the superior characteristics of an LIPSVM, an LCTSVM has further advantages: (1) it has similar or better classification capability compared to LIPSVM, TWSVM and LSTSVM; (2) each plane is obtained from a quadratic programming problem (QPP) instead of the special convex difference optimization arising from an LIPSVM; (3) the solution can be reduced to solving two systems of linear equations, resulting in considerably lower computational cost; and (4) it can find the global minimum. Experiments carried out on both toy and real-world problems demonstrate the effectiveness of an LCTSVM.
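The abstract notes that, unlike the eigenvalue problems of GEPSVM, the LCTSVM solution reduces to two systems of linear equations. As a rough illustration of that idea only (this is not the paper's LCTSVM formulation), the sketch below fits two nonparallel planes in the least-squares twin-SVM (LSTSVM) style, where each plane comes from a single regularized linear solve and a test point is assigned to the class of the nearer plane. The function names and the ridge term `eps` are illustrative assumptions.

```python
import numpy as np

def twin_planes_via_linear_systems(A, B, c1=1.0, c2=1.0, eps=1e-6):
    """Illustrative LSTSVM-style sketch (not the paper's LCTSVM):
    fit two nonparallel planes w^T x + b = 0, each obtained from one
    regularized linear system instead of a QPP.

    A : (m1, n) samples of class +1
    B : (m2, n) samples of class -1
    """
    e1 = np.ones((A.shape[0], 1))
    e2 = np.ones((B.shape[0], 1))
    H = np.hstack([A, e1])          # augmented class +1 matrix [A  e1]
    G = np.hstack([B, e2])          # augmented class -1 matrix [B  e2]
    R = eps * np.eye(H.shape[1])    # small ridge term for numerical stability

    # Plane 1: close to class +1, pushed away from class -1
    z1 = -np.linalg.solve(H.T @ H / c1 + G.T @ G + R, G.T @ e2)
    # Plane 2: close to class -1, pushed away from class +1
    z2 = np.linalg.solve(G.T @ G / c2 + H.T @ H + R, H.T @ e1)

    w1, b1 = z1[:-1].ravel(), float(z1[-1, 0])
    w2, b2 = z2[:-1].ravel(), float(z2[-1, 0])
    return (w1, b1), (w2, b2)

def predict(x, plane1, plane2):
    """Assign x to the class whose plane is nearer (perpendicular distance)."""
    (w1, b1), (w2, b2) = plane1, plane2
    d1 = abs(x @ w1 + b1) / np.linalg.norm(w1)
    d2 = abs(x @ w2 + b2) / np.linalg.norm(w2)
    return +1 if d1 <= d2 else -1
```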

Bibliographic details

  • Source
    Neurocomputing | 2011, Issue 4 | pp. 580-587 | 8 pages
  • Author affiliations

    School of Computer Science and Technology, Nanjing University of Science and Technology, Nanjing, China; School of Information Technology, Nanjing Forestry University, Nanjing, China;

    School of Computer Science and Technology, Nanjing University of Science and Technology, Nanjing, China;

    School of Information Technology, Nanjing Forestry University, Nanjing, China;

    School of Computer Science and Technology, Nanjing University of Science and Technology, Nanjing, China;

  • Indexed in: Science Citation Index (SCI); Engineering Index (EI)
  • Format: PDF
  • Language: English
  • Chinese Library Classification
  • Keywords

    GEPSVM; Outliers; Quadratic programming; Linear equations; Global minimum;

