Conference: Algorithmic Learning Theory

A Lower Bound for Learning Distributions Generated by Probabilistic Automata


Abstract

Known algorithms for learning PDFA can only be shown to run in time polynomial in the so-called distinguishability μ of the target machine, besides the number of states and the usual accuracy and confidence parameters. We show that the dependence on μ is necessary for every algorithm whose structure resembles existing ones. As a technical tool, a new variant of Statistical Queries termed L∞-queries is defined. We show how these queries can be simulated from samples and observe that known PAC algorithms for learning PDFA can be rewritten to access their target using L∞-queries and standard Statistical Queries. Finally, we show a lower bound: every algorithm to learn PDFA using queries with a reasonable tolerance needs a number of queries larger than (1/μ)^c for every c < 1.
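
The abstract states that L∞-queries can be simulated from samples. As an illustration of that idea only, the following is a minimal Python sketch assuming an L∞-query asks for an additive approximation of sup_x |D1(x) - D2(x)| between two distributions over strings; the function names, the usage example, and the concentration remark are assumptions made for this sketch, not the paper's definitions or analysis.

# Minimal sketch (assumption-based): simulate an L-infinity-style query by
# comparing empirical string frequencies computed from two i.i.d. samples.
from collections import Counter
from typing import Iterable


def empirical_distribution(sample: Iterable[str]) -> Counter:
    """Empirical probability of each string appearing in the sample."""
    strings = list(sample)
    n = len(strings)
    return Counter({x: c / n for x, c in Counter(strings).items()})


def simulate_linf_query(sample1: Iterable[str], sample2: Iterable[str]) -> float:
    """Plug-in estimate of sup_x |D1(x) - D2(x)| from two samples.

    Sketch-level reasoning only: empirical frequencies concentrate around
    their means, so with enough samples the returned value approximates the
    true sup-norm distance within a small additive tolerance. The paper's
    actual simulation argument and error analysis are not reproduced here.
    """
    p1 = empirical_distribution(sample1)
    p2 = empirical_distribution(sample2)
    support = set(p1) | set(p2)
    return max(abs(p1[x] - p2[x]) for x in support) if support else 0.0


# Hypothetical usage: strings drawn from two PDFA-generated distributions.
if __name__ == "__main__":
    s1 = ["ab", "ab", "b", "aab", "ab", "b"]
    s2 = ["ab", "b", "b", "b", "aab", "b"]
    print(simulate_linf_query(s1, s2))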
