IEEE Transactions on Pattern Analysis and Machine Intelligence

Genetic-based EM algorithm for learning Gaussian mixture models


Abstract

We propose a genetic-based expectation-maximization (GA-EM) algorithm for learning Gaussian mixture models from multivariate data. The algorithm is capable of selecting the number of components of the model using the minimum description length (MDL) criterion. Our approach benefits from the properties of genetic algorithms (GA) and the EM algorithm by combining both into a single procedure. The population-based stochastic search of the GA explores the search space more thoroughly than the EM method; as a result, the algorithm is less sensitive to its initialization and can escape from locally optimal solutions. The GA-EM algorithm is elitist, which preserves the monotonic convergence property of the EM algorithm. Experiments on simulated and real data show that GA-EM outperforms the EM method: (1) it obtains a better MDL score under exactly the same termination condition for both algorithms, and (2) it identifies the number of components used to generate the underlying data more often than the EM algorithm.
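
To make the combination concrete, the following is a minimal Python sketch of the general scheme the abstract describes: a population of candidate mixture models, each refined by a few EM iterations, ranked by an MDL score, and evolved with an elitist genetic algorithm. It uses scikit-learn's GaussianMixture for the EM steps. The encoding (an individual is just a component count plus an initialization seed), the crossover and mutation operators, the particular MDL penalty, and all names and hyperparameters (ga_em, fit_individual, mdl_score, pop_size, generations, mutate_prob, em_steps) are illustrative assumptions for this sketch, not the paper's actual GA-EM design.

# Minimal GA + EM + MDL sketch (illustrative; encoding, operators, and
# hyperparameters are simplifying assumptions, not the paper's exact design).
import numpy as np
from sklearn.mixture import GaussianMixture

def mdl_score(gmm, X):
    """MDL = -log-likelihood + (number of free parameters / 2) * log(n)."""
    n, d = X.shape
    k = gmm.n_components
    n_params = (k - 1) + k * d + k * d * (d + 1) // 2   # weights + means + full covariances
    log_lik = gmm.score(X) * n                          # score() returns the mean log-likelihood
    return -log_lik + 0.5 * n_params * np.log(n)

def fit_individual(X, k, seed, em_steps=5):
    """Run a few EM iterations on a candidate mixture with k components."""
    gmm = GaussianMixture(n_components=k, covariance_type='full',
                          max_iter=em_steps, random_state=seed).fit(X)
    return gmm, mdl_score(gmm, X)

def ga_em(X, k_max=10, pop_size=8, generations=20, mutate_prob=0.3, rng=None):
    rng = np.random.default_rng(rng)
    # Each individual encodes a component count and an EM-initialization seed.
    population = [(int(rng.integers(1, k_max + 1)), int(rng.integers(10**6)))
                  for _ in range(pop_size)]
    best = None                                         # elitism: the best model found so far is never lost
    for _ in range(generations):
        scored = []
        for k, seed in population:
            gmm, score = fit_individual(X, k, seed)
            scored.append((score, k, seed, gmm))
        scored.sort(key=lambda t: t[0])                 # lower MDL is better
        if best is None or scored[0][0] < best[0]:
            best = (scored[0][0], scored[0][3])         # remember (mdl, fitted model)
        parents = scored[:max(2, pop_size // 2)]        # selection: keep the better half
        children = [(scored[0][1], scored[0][2])]       # elite (k, seed) carried over unchanged
        while len(children) < pop_size:
            pa = parents[int(rng.integers(len(parents)))]
            pb = parents[int(rng.integers(len(parents)))]
            k_child = int(round((pa[1] + pb[1]) / 2))   # crossover on the component count
            if rng.random() < mutate_prob:              # mutation: add or remove one component
                k_child += int(rng.choice([-1, 1]))
            k_child = int(np.clip(k_child, 1, k_max))
            children.append((k_child, int(rng.integers(10**6))))
        population = children
    # Warm-start a full EM run from the elite model's parameters to polish the final fit.
    elite = best[1]
    final = GaussianMixture(n_components=elite.n_components, covariance_type='full',
                            max_iter=200, weights_init=elite.weights_,
                            means_init=elite.means_, precisions_init=elite.precisions_)
    return final.fit(X)

On data drawn from a few well-separated Gaussians, one would expect ga_em(X) to recover the generating number of components more reliably than a single randomly initialized EM run, which is the kind of behaviour the abstract reports for the actual GA-EM algorithm; carrying the best individual across generations unchanged is what preserves the monotone improvement of the best MDL score.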
