International conference on intelligent text processing and computational linguistics

Bayesian Finite Mixture Models for Probabilistic Context-Free Grammars

Abstract

Instead of using a common PCFG to parse all texts, we present an efficient generative probabilistic model for probabilistic context-free grammars (PCFGs) based on the Bayesian finite mixture model, in which we assume that there are several PCFGs, all sharing the same CFG but with different rule probabilities. Sentences of the same article in the corpus are generated from a common multinomial distribution over these PCFGs. We derive a Markov chain Monte Carlo algorithm for this model. In the experiments, our multi-grammar model outperforms both the single-grammar model and the Inside-Outside algorithm.
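
To make the generative story concrete, below is a minimal sketch of one reading of the model: several PCFGs share the same CFG skeleton but draw their own rule probabilities, and the sentences of an article choose their component from an article-level multinomial over the PCFGs. The toy grammar, the symmetric Dirichlet priors, the number of components, and all function names are illustrative assumptions rather than details taken from the paper, and the MCMC inference step is omitted.

# Minimal sketch of the generative process, under the assumptions stated above.
import numpy as np

rng = np.random.default_rng(0)

# Toy CFG: every mixture component shares this skeleton; only rule probabilities differ.
CFG = {
    "S":   [("NP", "VP")],
    "NP":  [("Det", "N"), ("N",)],
    "VP":  [("V", "NP"), ("V",)],
    "Det": [("the",), ("a",)],
    "N":   [("dog",), ("cat",)],
    "V":   [("saw",), ("chased",)],
}

K = 3  # assumed number of mixture components (PCFGs)

def sample_rule_probs():
    # Each PCFG draws its own rule probabilities from a symmetric Dirichlet prior.
    return {nt: rng.dirichlet(np.ones(len(rhss))) for nt, rhss in CFG.items()}

components = [sample_rule_probs() for _ in range(K)]

def expand(symbol, probs, depth=0, max_depth=12):
    # Recursively expand a symbol into terminals; terminals are symbols not in CFG.
    if symbol not in CFG:
        return [symbol]
    if depth > max_depth:  # crude guard against unbounded recursion in this sketch
        return []
    rhss = CFG[symbol]
    idx = rng.choice(len(rhss), p=probs[symbol])
    words = []
    for sym in rhss[idx]:
        words.extend(expand(sym, probs, depth + 1, max_depth))
    return words

def generate_article(num_sentences):
    # One multinomial over the K PCFGs is shared by all sentences of the article;
    # each sentence then picks its PCFG from that shared distribution.
    article_weights = rng.dirichlet(np.ones(K))
    sentences = []
    for _ in range(num_sentences):
        k = rng.choice(K, p=article_weights)
        sentences.append(" ".join(expand("S", components[k])))
    return sentences

for article_id in range(2):
    print(f"article {article_id}:", generate_article(3))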