
Chernoff Information of Bottleneck Gaussian Trees

Published 26 Jan 2016 in cs.IT and math.IT (arXiv:1601.06873v1)

Abstract: In this paper, our objective is to identify the determining factors of the Chernoff information in distinguishing a set of Gaussian trees, in which each tree can be obtained from another via an edge removal and grafting operation. This is equivalent to asking for the Chernoff information between the most likely confused, i.e. "bottleneck", Gaussian trees, as was recently shown to be the case for ML-estimated Gaussian tree graphs. We prove that the Chernoff information between two Gaussian trees related through an edge removal and grafting operation equals that between two three-node Gaussian trees, whose topologies and edge weights are determined by the underlying graph operation. In addition, this Chernoff information is shown to be determined solely by the maximum generalized eigenvalue of the two Gaussian covariance matrices. The Chernoff information of the scalar Gaussian variables obtained by a linear transformation (LT) of the original Gaussian vectors is also uniquely determined by the same maximum generalized eigenvalue. More interestingly, after incorporating the cost of measurements into a normalized Chernoff information, the Gaussian variables from the LT attain a larger normalized Chernoff information than that of the original Gaussian vectors, as shown by our proved bounds.
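The quantities in the abstract can be illustrated numerically. The sketch below, which is a generic computation and not the paper's derivation, evaluates the Chernoff information between two zero-mean Gaussians by maximizing the exponent of the Chernoff coefficient over λ, and also reports the generalized eigenvalues of the two covariance matrices (the paper's claim is that, for the bottleneck trees, the Chernoff information depends only on the maximum such eigenvalue). The 3×3 matrices `S0` and `S1` are made-up examples standing in for the covariances of two three-node Gaussian trees.

```python
import numpy as np
from scipy.linalg import eigvalsh

def chernoff_exponent(S0, S1, lam):
    """-log of the Chernoff coefficient ∫ p0^lam p1^(1-lam) dx
    for zero-mean Gaussians N(0, S0) and N(0, S1)."""
    _, ld0 = np.linalg.slogdet(S0)
    _, ld1 = np.linalg.slogdet(S1)
    # λ-mixture of the precision matrices
    M = lam * np.linalg.inv(S0) + (1 - lam) * np.linalg.inv(S1)
    _, ldM = np.linalg.slogdet(M)
    return 0.5 * (lam * ld0 + (1 - lam) * ld1 + ldM)

def chernoff_information(S0, S1, grid=1000):
    """Chernoff information = max over λ in (0, 1) of the exponent,
    approximated here by a simple grid search."""
    lams = np.linspace(1e-3, 1 - 1e-3, grid)
    return max(chernoff_exponent(S0, S1, lam) for lam in lams)

# Hypothetical covariance matrices of two three-node Gaussian trees
S0 = np.array([[1.0, 0.6, 0.3],
               [0.6, 1.0, 0.5],
               [0.3, 0.5, 1.0]])
S1 = np.array([[1.0, 0.3, 0.6],
               [0.3, 1.0, 0.5],
               [0.6, 0.5, 1.0]])

# Generalized eigenvalues μ solving S1 v = μ S0 v
mu = eigvalsh(S1, S0)
ci = chernoff_information(S0, S1)
print("generalized eigenvalues:", mu)
print("Chernoff information:", ci)
```

Since the Chernoff information is symmetric in its two arguments, swapping `S0` and `S1` leaves the result unchanged; for identical covariances it is zero.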

Citations (6)
