IEEE International Conference on Machine Learning and Applications

GRAM: Gradient Rescaling Attention Model for Data Uncertainty Estimation in Single Image Super Resolution



Abstract

In this paper, a new learning method is proposed that quantifies data uncertainty without suffering performance degradation in Single Image Super Resolution (SISR). Our work is motivated by the fact that the loss designs for capturing uncertainty and for solving SISR are contradictory. To capture data uncertainty, the output of a network is often modeled with the negative log-likelihood (NLL) of a Gaussian distribution, a Euclidean distance divided by a predictive variance, so that images with high variance have less impact on training. In the SISR domain, on the other hand, recent works use attention models to give more weight to the loss on challenging images and thereby improve performance. This conflict must be resolved for a neural network to predict the uncertainty of a super-resolved image without suffering performance degradation. We therefore propose the Gradient Rescaling Attention Model (GRAM), which combines both approaches effectively. Since the variance may reflect the difficulty of an image, we rescale the gradient of the NLL by the degree of variance. The network can thus focus on challenging images, similarly to attention models. We evaluate performance on standard SISR benchmarks in terms of peak signal-to-noise ratio (PSNR) and structural similarity (SSIM). The experimental results show that the proposed gradient rescaling method causes negligible performance degradation compared to SISR outputs trained with the Euclidean loss, whereas NLL without attention degrades SR quality.
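The loss-design conflict the abstract describes can be made concrete with a small sketch. The NLL below is the standard heteroscedastic Gaussian formulation; the exact rescaling function GRAM uses is not specified in this abstract, so `gram_grad_mu` is only a hypothetical illustration of the idea of restoring gradient magnitude for high-variance pixels:

```python
import numpy as np

# Heteroscedastic Gaussian NLL: the network predicts a mean mu (the
# super-resolved pixel) and a log-variance log_var. Note: gram_grad_mu
# is a hypothetical illustration, not the paper's exact formulation.

def gaussian_nll(y, mu, log_var):
    """Per-pixel NLL of y under N(mu, exp(log_var)), up to a constant."""
    return 0.5 * np.exp(-log_var) * (y - mu) ** 2 + 0.5 * log_var

def nll_grad_mu(y, mu, log_var):
    """d(NLL)/d(mu): the residual is divided by the predictive variance,
    so high-variance (difficult) pixels contribute *less* to training --
    the opposite of what attention models do in SISR."""
    return -np.exp(-log_var) * (y - mu)

def gram_grad_mu(y, mu, log_var):
    """Hypothetical gradient rescaling: multiplying the NLL gradient by
    the variance restores the plain L2 gradient magnitude, so difficult
    pixels keep their influence while the variance still models
    uncertainty."""
    return np.exp(log_var) * nll_grad_mu(y, mu, log_var)
```

With this rescaling, `gram_grad_mu` reduces exactly to the L2 gradient `-(y - mu)`, which is why performance should match Euclidean-loss training while the variance head still provides an uncertainty estimate.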
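For reference, the PSNR metric used in the evaluation can be computed as follows (a minimal sketch; `max_val=1.0` assumes images normalized to [0, 1], and use 255.0 for 8-bit images — SSIM is more involved and is omitted here):

```python
import numpy as np

def psnr(ref, img, max_val=1.0):
    """Peak signal-to-noise ratio in dB between a reference image and a
    reconstruction: 10 * log10(max_val^2 / MSE)."""
    ref = np.asarray(ref, dtype=float)
    img = np.asarray(img, dtype=float)
    mse = np.mean((ref - img) ** 2)
    return 10.0 * np.log10(max_val ** 2 / mse)
```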
