...
IEEE Transactions on Signal Processing

MMSE Bounds for Additive Noise Channels Under Kullback–Leibler Divergence Constraints on the Input Distribution



Abstract

Upper and lower bounds on the minimum mean square error (MMSE) for additive noise channels are derived when the input distribution is constrained to be close to a Gaussian reference distribution in terms of the Kullback-Leibler divergence. The upper bound is tight and is attained by a Gaussian distribution whose mean is identical to that of the reference distribution and whose covariance matrix is defined implicitly via a system of nonlinear equations. The estimator that attains the upper bound is identified as a minimax optimal estimator that is robust against deviations from the assumed prior. The lower bound provides an alternative to well-known inequalities in estimation and information theory, such as the Cramér-Rao lower bound, Stam's inequality, or the entropy power inequality, that is potentially tighter and defined for a larger class of input distributions. Several examples of applications in signal processing and information theory illustrate the usefulness of the proposed bounds in practice.
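The setting described above can be illustrated with a minimal numerical sketch (not the paper's construction): for a scalar additive Gaussian channel Y = X + N with a Gaussian prior, the MMSE has a closed form, and the KL divergence between two scalar Gaussians quantifies how far a candidate prior lies from the reference distribution. All function names and parameter values below are illustrative assumptions.

```python
import math

def gaussian_mmse(sigma_x2, sigma_n2):
    # For Y = X + N with X ~ N(0, sigma_x2) and N ~ N(0, sigma_n2) independent,
    # the conditional-mean estimator is linear and the MMSE is
    # sigma_x2 * sigma_n2 / (sigma_x2 + sigma_n2).
    return sigma_x2 * sigma_n2 / (sigma_x2 + sigma_n2)

def kl_gaussians(mu0, var0, mu1, var1):
    # KL( N(mu0, var0) || N(mu1, var1) ) for scalar Gaussians.
    return 0.5 * (var0 / var1 + (mu1 - mu0) ** 2 / var1
                  - 1.0 + math.log(var1 / var0))

# Illustrative numbers: reference prior N(0, 1), noise variance 0.5.
ref_var, noise_var = 1.0, 0.5
print(gaussian_mmse(ref_var, noise_var))  # → 0.3333333333333333

# A Gaussian prior with inflated variance deviates from the reference by a
# small KL divergence while incurring a larger MMSE, in the spirit of the
# KL-constrained bounds discussed in the abstract.
alt_var = 1.2
eps = kl_gaussians(0.0, alt_var, 0.0, ref_var)
print(gaussian_mmse(alt_var, noise_var), eps)
```

The sketch only covers the Gaussian-prior case, where the MMSE is explicit; the paper's upper bound concerns the worst case over all priors within such a KL ball.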


