International Conference on Algorithmic Learning Theory

Dimension-Adaptive Bounds on Compressive FLD Classification



Abstract

Efficient dimensionality reduction by random projections (RP) is gaining popularity, hence the learning guarantees achievable in RP spaces are of great interest. In the finite-dimensional setting, it has been shown for the compressive Fisher Linear Discriminant (FLD) classifier that, for good generalisation, the required target dimension grows only as the log of the number of classes and is not adversely affected by the number of projected data points. However, these bounds depend on the dimensionality d of the original data space. In this paper we give further guarantees that remove d from the bounds under certain regularity conditions on the data density structure. In particular, if the data density does not fill the ambient space, then the error of compressive FLD is independent of the ambient dimension and depends only on a notion of 'intrinsic dimension'.
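To make the setting concrete, the following is a minimal numpy sketch of compressive FLD (not taken from the paper; the synthetic data, dimensions, and all variable names are illustrative). Data that lives in a low-dimensional subspace of a high-dimensional ambient space is mapped down by a random Gaussian projection, and a two-class Fisher Linear Discriminant is then fitted entirely in the projected space.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: ambient dimension d, projection dimension k,
# n points per class, data supported on a 5-dim "intrinsic" subspace.
d, k, n = 500, 20, 200
basis = rng.standard_normal((d, 5))                  # intrinsic subspace basis
X0 = rng.standard_normal((n, 5)) @ basis.T           # class 0
X1 = (rng.standard_normal((n, 5)) + 2.0) @ basis.T   # class 1, shifted mean

# Random projection: entries i.i.d. N(0, 1/k)
R = rng.standard_normal((k, d)) / np.sqrt(k)
Y0, Y1 = X0 @ R.T, X1 @ R.T

# FLD in the k-dimensional projected space
mu0, mu1 = Y0.mean(axis=0), Y1.mean(axis=0)
S = np.cov(np.vstack([Y0 - mu0, Y1 - mu1]).T)        # pooled covariance (k x k)
w = np.linalg.solve(S, mu1 - mu0)                    # discriminant direction
b = -0.5 * w @ (mu0 + mu1)                           # threshold at class midpoint

def predict(Y):
    """Assign class 1 when the discriminant score is positive."""
    return (Y @ w + b > 0).astype(int)

acc = np.mean(np.concatenate([predict(Y0) == 0, predict(Y1) == 1]))
```

Note that the k-by-k pooled covariance is estimated and inverted in the projected space, which is the practical appeal of compressive FLD when k is much smaller than d; the paper's contribution is that, for data like the above which does not fill the ambient space, the guarantees can be stated without reference to d.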
