
Tight MMSE Bounds for the AGN Channel Under KL Divergence Constraints on the Input Distribution

Published 26 Apr 2018 in math.ST and stat.TH (arXiv:1804.10151v1)

Abstract: Tight bounds on the minimum mean square error (MMSE) for the additive Gaussian noise channel are derived when the input distribution is constrained to be epsilon-close to a Gaussian reference distribution in terms of the Kullback--Leibler divergence. The distributions that attain the bounds are shown to be Gaussian, with means identical to that of the reference distribution and covariance matrices defined implicitly via systems of matrix equations. The estimator that attains the upper bound is identified as a minimax optimal estimator that is robust against deviations from the assumed prior. The lower bound is shown to provide a potentially tighter alternative to the Cramer--Rao bound. Both properties are illustrated with numerical examples.
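As a point of reference for the setting above: when epsilon = 0 the KL ball collapses to the Gaussian reference itself, and the MMSE reduces to the classical closed-form value for a Gaussian input over the AGN channel. The sketch below (not from the paper; a Monte Carlo sanity check under assumed scalar parameters, with noise variance 1 and input variance `s2`) verifies that the linear conditional-mean estimator attains MMSE = s2/(s2 + 1) in this baseline case.

```python
import numpy as np

rng = np.random.default_rng(0)

# Scalar AGN channel Y = X + N with X ~ N(0, s2) and N ~ N(0, 1).
# For a Gaussian input the MMSE estimator is linear:
#   E[X | Y = y] = s2 / (s2 + 1) * y,  with MMSE = s2 / (s2 + 1).
s2 = 2.0
n = 200_000
x = rng.normal(0.0, np.sqrt(s2), n)
y = x + rng.normal(0.0, 1.0, n)

x_hat = s2 / (s2 + 1.0) * y            # conditional-mean estimator
mse_emp = np.mean((x - x_hat) ** 2)    # empirical mean square error
mmse = s2 / (s2 + 1.0)                 # closed-form MMSE for this baseline

print(mse_emp, mmse)
```

The paper's contribution is what happens when the input may deviate from this Gaussian reference by up to epsilon in KL divergence: the empirical value above then becomes one point inside the derived upper and lower bounds.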
