Dispersion Bound for the Wyner-Ahlswede-Körner Network via Reverse Hypercontractivity on Types
Abstract: This paper introduces new converse machinery for a challenging class of distributed source-type problems (e.g., distributed source coding, common randomness generation, or hypothesis testing with communication constraints), through the example of the Wyner-Ahlswede-Körner network. Using functional-entropic duality and the reverse hypercontractivity of the transposition semigroup, we lower-bound the error probability for each joint type. Then, by averaging the error probability over types, we lower-bound the $c$-dispersion (which characterizes the second-order behavior of the weighted sum of the rates of the two compressors when the nonvanishing error probability is small) by the variance of the gradient of $\inf_{P_{U|X}}\{cH(Y|U)+I(U;X)\}$ with respect to $Q_{XY}$, the per-letter side-information and source distribution. In comparison, using standard achievability arguments based on the method of types, we upper-bound the $c$-dispersion by the variance of $c\,\imath_{Y|U}(Y|U)+\imath_{U;X}(U;X)$, which improves the existing upper bounds but leaves a gap to the aforementioned lower bound.
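In display form, the two bounds described in the abstract can be summarized as follows (a paraphrase of the stated results, not the paper's exact theorem statements; the variances are taken under the per-letter distribution $Q_{XY}$):

```latex
% Converse (lower) bound on the c-dispersion, via reverse hypercontractivity on types:
%   variance of the gradient of the single-letter optimization with respect to Q_{XY}
\underline{V}(c) = \operatorname{Var}\!\Bigl[\nabla_{Q_{XY}} \inf_{P_{U|X}} \bigl\{ c\,H(Y\mid U) + I(U;X) \bigr\}\Bigr]

% Achievability (upper) bound, via the method of types:
%   variance of the weighted information densities
\overline{V}(c) = \operatorname{Var}\!\bigl[ c\,\imath_{Y|U}(Y\mid U) + \imath_{U;X}(U;X) \bigr]
```

The gap noted in the abstract is then $\underline{V}(c) \le \overline{V}(c)$, with equality not established in general.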