IKNN: Informative K-Nearest Neighbor Pattern Classification


Abstract

The K-nearest neighbor (KNN) decision rule has been a ubiquitous classification tool with good scalability. Past experience has shown that the optimal choice of K depends upon the data, making it laborious to tune the parameter for different applications. We introduce a new metric that measures the informativeness of objects to be classified. When applied as a query-based distance metric to measure the closeness between objects, two novel KNN procedures, Locally Informative-KNN (LI-KNN) and Globally Informative-KNN (GI-KNN), are proposed. By selecting a subset of the most informative objects from neighborhoods, our methods exhibit stability to changes in the input parameters: the number of neighbors (K) and of informative points (I). Experiments on UCI benchmark data and diverse real-world data sets indicate that our approaches are application-independent and can generally outperform several popular KNN extensions, as well as SVM and Boosting methods.
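To make the LI-KNN idea concrete, the following is a minimal sketch of a "locally informative" nearest-neighbor rule: among a query's K nearest neighbors, keep only the I highest-scoring points and take a majority vote over them. The abstract does not give the paper's informativeness metric, so the score used here (inverse distance weighted by label agreement within the neighborhood) is a hypothetical proxy, not the authors' definition; the function name `li_knn_predict` is likewise an assumption for illustration.

```python
import numpy as np
from collections import Counter

def li_knn_predict(X_train, y_train, query, k=7, i=3):
    """Classify `query` LI-KNN-style: among the K nearest neighbors,
    keep the I most "informative" ones and take a majority vote.
    The informativeness score below is a simple stand-in proxy
    (closeness to the query times local label agreement), not the
    metric defined in the paper."""
    dists = np.linalg.norm(X_train - query, axis=1)
    nn = np.argsort(dists)[:k]              # indices of the K nearest neighbors
    nn_labels = y_train[nn]
    # label agreement: fraction of the K neighbors sharing each point's label
    counts = Counter(nn_labels.tolist())
    agreement = np.array([counts[c] / k for c in nn_labels])
    closeness = 1.0 / (dists[nn] + 1e-12)   # inverse distance to the query
    score = closeness * agreement           # hypothetical informativeness proxy
    top_i = nn[np.argsort(score)[::-1][:i]] # the I most informative neighbors
    return Counter(y_train[top_i].tolist()).most_common(1)[0][0]
```

Because the final vote uses only the I informative points, the prediction changes little as K grows, which is the stability property the abstract claims for LI-KNN.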

