Neural Processing Letters

On the Design of Robust Linear Pattern Classifiers Based on M-Estimators



Abstract

Classical linear neural network architectures, such as the optimal linear associative memory (OLAM) (Kohonen and Ruohonen, IEEE Trans Comp 22(7):701-702, 1973) and the adaptive linear element (Adaline) (Widrow, IEEE Signal Process Mag 22(1):100-106, 2005; Widrow and Winter, IEEE Comp 21(3):25-39, 1988), are commonly used either as standalone pattern classifiers for linearly separable problems or as fundamental building blocks of multilayer nonlinear classifiers, such as the multilayer perceptron (MLP), the radial basis function network (RBFN), the extreme learning machine (ELM) (Int J Mach Learn Cyber 2:107-122, 2011) and the echo-state network (ESN) (Emmerich, Proceedings of the 20th International Conference on Artificial Neural Networks, 148-153, 2010). A common feature shared by the learning equations of the OLAM and the Adaline, respectively the ordinary least squares (OLS) and the least mean squares (LMS) algorithms, is that they are optimal only under the assumption of Gaussian errors. However, the presence of outliers in the data causes the error distribution to depart from Gaussianity, and hence the classifier performance deteriorates. Bearing this in mind, in this paper we develop simple and efficient extensions of the OLAM and the Adaline, named Robust OLAM (ROLAM) and Robust Adaline (Radaline), which are robust to labeling errors (a.k.a. label noise), a type of outlier that often occurs in classification tasks. This type of outlier usually results from mistakes made while labeling the data points (e.g. the misjudgement of a specialist) or from typing errors made while creating the data files (e.g. striking an incorrect key on a keyboard). To deal with such outliers, the ROLAM and the Radaline use M-estimators to compute the weights of the OLAM and Adaline networks instead of the standard OLS/LMS algorithms. By means of comprehensive computer simulations using synthetic and real-world data sets, we show that the proposed robust linear classifiers consistently outperform their original versions.
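
The sketch below is a minimal illustration (not the authors' implementation) of the core idea in the abstract: replacing the OLS solution for a linear classifier's weights with an M-estimator fitted by iteratively reweighted least squares (IRLS). The Huber weight function, the MAD-based scale estimate, the tuning constant c = 1.345, and the function names fit_ols, huber_weights and fit_m_estimator are assumptions chosen for the example; the abstract does not specify which M-estimator or solver the ROLAM and Radaline actually use.

# Illustrative sketch only: M-estimation of linear classifier weights via IRLS,
# contrasted with the OLS/OLAM-style pseudoinverse solution. The Huber loss and
# all helper names here are assumptions, not taken from the paper.
import numpy as np

def fit_ols(X, y):
    """OLS/OLAM-style weights: w = pinv(X) @ y."""
    return np.linalg.pinv(X) @ y

def huber_weights(residuals, c=1.345):
    """Huber weight function: 1 inside [-c, c], c/|r| outside (after robust scaling)."""
    scale = np.median(np.abs(residuals)) / 0.6745 + 1e-12  # MAD-based scale estimate
    r = residuals / scale
    w = np.ones_like(r)
    mask = np.abs(r) > c
    w[mask] = c / np.abs(r[mask])
    return w

def fit_m_estimator(X, y, n_iter=20, c=1.345):
    """M-estimator via IRLS: repeatedly solve a weighted least-squares problem."""
    w = fit_ols(X, y)                       # start from the OLS solution
    for _ in range(n_iter):
        r = y - X @ w
        omega = huber_weights(r, c)         # down-weight large residuals (outliers)
        Xw = X * omega[:, None]             # row-weighted design matrix
        w = np.linalg.pinv(Xw.T @ X) @ (Xw.T @ y)
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two linearly separable Gaussian classes with labels in {-1, +1}
    X = np.vstack([rng.normal(-1, 0.5, (100, 2)), rng.normal(+1, 0.5, (100, 2))])
    y = np.hstack([-np.ones(100), np.ones(100)])
    y[rng.choice(200, 20, replace=False)] *= -1   # inject 10% label noise (flipped labels)
    Xb = np.hstack([X, np.ones((200, 1))])        # add bias column
    print("OLS weights     :", fit_ols(Xb, y))
    print("M-estim. weights:", fit_m_estimator(Xb, y))

With roughly 10% of the labels flipped, the IRLS solution typically stays close to the weights obtained on clean data, whereas the OLS weights are pulled toward the mislabeled points; this robustness to label noise is the behaviour the ROLAM and Radaline are designed to provide.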