Conference on Uncertainty in Artificial Intelligence

Importance Sampled Stochastic Optimization for Variational Inference


Abstract

Variational inference approximates the posterior distribution of a probabilistic model with a parameterized density by maximizing a lower bound for the model evidence. Modern solutions fit a flexible approximation with stochastic gradient descent, using Monte Carlo approximation for the gradients. This enables variational inference for arbitrary differentiable probabilistic models, and consequently makes variational inference feasible for probabilistic programming languages. In this work we develop more efficient inference algorithms for the task by considering importance sampling estimates for the gradients. We show how the gradient with respect to the approximation parameters can often be evaluated efficiently without needing to re-compute gradients of the model itself, and then proceed to derive practical algorithms that use importance sampled estimates to speed up computation. We present importance sampled stochastic gradient descent that outperforms standard stochastic gradient descent by a clear margin for a range of models, and provide a justifiable variant of stochastic average gradients for variational inference.
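For orientation, the lower bound being maximized is the standard evidence lower bound (ELBO), and the importance-sampling idea rests on a change of measure; the symbols below (q_lambda for the approximation, p(x, z) for the model, lambda_0 for the parameters at which samples were drawn) are generic notation, not necessarily the paper's:

\mathcal{L}(\lambda) = \mathbb{E}_{q_\lambda(z)}\left[\log p(x, z) - \log q_\lambda(z)\right] \le \log p(x)

\mathbb{E}_{q_\lambda}\left[\nabla_z \log p(x, z)\right] = \mathbb{E}_{q_{\lambda_0}}\left[\frac{q_\lambda(z)}{q_{\lambda_0}(z)}\, \nabla_z \log p(x, z)\right]

The right-hand expectation can be estimated with samples, and model gradients, already computed at lambda_0, so only the cheap importance weights change as lambda is updated. Below is a minimal runnable sketch of this reuse pattern in JAX, assuming a diagonal-Gaussian approximation and a toy log-joint; log_joint, log_q, and is_gradients are illustrative names, not the authors' implementation:

import jax
import jax.numpy as jnp

def log_joint(z):
    # Toy unnormalized log p(x, z); stands in for any differentiable model.
    return -0.5 * jnp.sum(z ** 2) - 0.5 * jnp.sum((z - 2.0) ** 2)

def log_q(z, mu, log_sigma):
    # Log-density of the diagonal-Gaussian approximation q(z; mu, sigma).
    sigma = jnp.exp(log_sigma)
    return jnp.sum(-0.5 * ((z - mu) / sigma) ** 2 - log_sigma
                   - 0.5 * jnp.log(2.0 * jnp.pi))

# Expensive step, done rarely: sample from q at the old parameters and
# cache the model gradients grad_z log p(x, z_s).
key = jax.random.PRNGKey(0)
mu0, log_sigma0 = jnp.zeros(2), jnp.zeros(2)
z = mu0 + jnp.exp(log_sigma0) * jax.random.normal(key, (64, 2))
g = jax.vmap(jax.grad(log_joint))(z)

def is_gradients(mu, log_sigma):
    # Cheap step, repeatable for new parameters: self-normalized importance
    # weights w_s proportional to q_new(z_s) / q_old(z_s) reweight the cached
    # model gradients instead of recomputing them through the model.
    logw = (jax.vmap(log_q, (0, None, None))(z, mu, log_sigma)
            - jax.vmap(log_q, (0, None, None))(z, mu0, log_sigma0))
    w = jax.nn.softmax(logw)
    eps = (z - mu) / jnp.exp(log_sigma)  # noise implied by the new parameters
    grad_mu = jnp.sum(w[:, None] * g, axis=0)
    # Reparameterization gradient for log_sigma, plus +1 per dimension
    # from the Gaussian entropy term of the ELBO.
    grad_log_sigma = jnp.sum(w[:, None] * g * eps, axis=0) * jnp.exp(log_sigma) + 1.0
    return grad_mu, grad_log_sigma

An optimizer can take several ascent steps with these cheap reweighted gradients before drawing fresh samples; once the weights degenerate, one would repeat the expensive step, which is the trade-off the paper's importance-sampled algorithms are designed to manage.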

