JMLR: Workshop and Conference Proceedings

Proximal Splitting Meets Variance Reduction


Abstract

Despite the rise to fame of stochastic variance-reduced methods like SAGA and ProxSVRG, their use in non-smooth optimization is still limited to a few simple cases. Existing methods require computing the proximal operator of the non-smooth term at each iteration, which, for complex penalties like the total variation, overlapping group lasso, or trend filtering, is itself an iterative process that becomes infeasible for moderately large problems. In this work we propose and analyze VRTOS, a variance-reduced method to solve problems with an arbitrary number of non-smooth terms. Like other variance-reduced methods, it requires evaluating only one gradient per iteration and converges with a constant step size, and so is ideally suited for large-scale applications. Unlike existing variance-reduced methods, it admits multiple non-smooth terms whose proximal operators need to be evaluated only once per iteration. We provide a convergence rate analysis for the proposed method that achieves the same asymptotic rate as its full-gradient variants, and we illustrate its computational advantage on four different large-scale datasets.
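The paper's VRTOS algorithm itself is not reproduced here, but the abstract's core idea, combining a variance-reduced stochastic gradient with a proximal step for a non-smooth term at a constant step size, can be illustrated with a minimal Prox-SAGA-style sketch. The sketch below is a generic illustration of that family of methods on an l1-regularized least-squares problem, not the paper's method: the function name `prox_saga`, the step size, and the problem setup are all assumptions for demonstration.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1 (illustrative non-smooth term).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def prox_saga(A, b, lam, step, n_iter=2000, seed=0):
    """SAGA-style variance-reduced proximal gradient sketch for
        min_x (1/2n) ||Ax - b||^2 + lam * ||x||_1.
    One per-sample gradient and one prox evaluation per iteration,
    with a constant step size."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    # Memory of the last gradient of each f_i(x) = (1/2)(a_i^T x - b_i)^2.
    grads = A * (A @ x - b)[:, None]      # shape (n, d)
    grad_avg = grads.mean(axis=0)
    for _ in range(n_iter):
        i = rng.integers(n)
        g_new = A[i] * (A[i] @ x - b[i])
        # Variance-reduced gradient estimate (unbiased for the full gradient).
        v = g_new - grads[i] + grad_avg
        # Update the running average and the stored gradient for sample i.
        grad_avg += (g_new - grads[i]) / n
        grads[i] = g_new
        # Single proximal step on the non-smooth term.
        x = soft_threshold(x - step * v, step * lam)
    return x
```

A common step-size choice for SAGA is on the order of 1/(3L), where L bounds the per-sample gradient Lipschitz constants; unlike plain SGD, no decaying schedule is needed, which is the "constant step size" property the abstract highlights.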
