Conference: 2011 IEEE International Workshop on Machine Learning for Signal Processing

Bayesian linear regression for Hidden Markov Model based on optimizing variational bounds


Abstract

Linear regression for Hidden Markov Model (HMM) parameters is widely used for the adaptive training of time series pattern analysis especially for speech processing. This paper realizes a fully Bayesian treatment of linear regression for HMMs by using variational techniques. This paper analytically derives the variational lower bound of the marginalized log-likelihood of the linear regression. By using the variational lower bound as an objective function, we can optimize the model topology and hyper-parameters of the linear regression without controlling them as tuning parameters; thus, we realize linear regression for HMM parameters in a non-parametric Bayes manner. Experiments on large vocabulary continuous speech recognition confirm the generalizability of the proposed approach, especially for small quantities of adaptation data.
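The paper's bound is derived for linear regression over HMM parameters; as a self-contained illustration of the underlying idea, the sketch below computes the variational lower bound (ELBO) for an ordinary conjugate Bayesian linear regression, where everything is available in closed form. The model, data, and hyper-parameters (`alpha`, `beta`) here are illustrative assumptions, not taken from the paper. Because the variational posterior chosen is the exact Gaussian posterior, the bound is tight and matches the marginal log-likelihood, which is the quantity the paper's objective function lower-bounds.

```python
import numpy as np

rng = np.random.default_rng(0)
N, D = 50, 3
X = rng.normal(size=(N, D))
w_true = rng.normal(size=D)
beta = 4.0                      # known noise precision (assumed for the toy model)
alpha = 1.0                     # prior precision on the weights (assumed)
y = X @ w_true + rng.normal(scale=beta ** -0.5, size=N)

# Exact Gaussian posterior q(w) = N(m, S) under the conjugate model
# p(w) = N(0, alpha^-1 I), p(y|w) = N(Xw, beta^-1 I)
S = np.linalg.inv(alpha * np.eye(D) + beta * X.T @ X)
m = beta * S @ X.T @ y

# ELBO = E_q[log p(y|w)] + E_q[log p(w)] + H[q(w)]
exp_loglik = (N / 2) * np.log(beta / (2 * np.pi)) \
    - (beta / 2) * (np.sum((y - X @ m) ** 2) + np.trace(X @ S @ X.T))
exp_logprior = (D / 2) * np.log(alpha / (2 * np.pi)) \
    - (alpha / 2) * (m @ m + np.trace(S))
entropy = (D / 2) * (1 + np.log(2 * np.pi)) + 0.5 * np.linalg.slogdet(S)[1]
elbo = exp_loglik + exp_logprior + entropy

# Exact marginal log-likelihood: y ~ N(0, beta^-1 I + alpha^-1 X X^T)
C = np.eye(N) / beta + X @ X.T / alpha
log_evidence = -0.5 * (N * np.log(2 * np.pi)
                       + np.linalg.slogdet(C)[1]
                       + y @ np.linalg.solve(C, y))

print(elbo, log_evidence)  # equal up to numerical error: the bound is tight
```

In the paper's setting the regression acts on HMM parameters and the posterior is not exact, so the bound is strict and serves as the objective for optimizing the model topology and hyper-parameters rather than treating them as hand-tuned constants.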
