Chain rule for conditional versions of Sibson’s α-mutual information

Determine whether either conditional Sibson α-mutual information introduced in Section 6—namely I^{Y|Z}_α(X,Y|Z) = min_{Q_{Y|Z}} D_α(P_{XYZ}||P_{X|Z}Q_{Y|Z}P_Z) or I^{Z}_α(X,Y|Z) = min_{Q_Z} D_α(P_{XYZ}||P_{X|Z}P_{Y|Z}Q_Z)—satisfies a mutual-information-like chain rule of the form I_α(X,(Y,Z)) ≤ I_α(X,Y) + I^{•}_α(X,Y|Z), where I^{•}_α denotes one of these conditional definitions. Either establish such a chain rule or exhibit a counterexample showing that it fails.

Background

The authors present multiple conditional generalizations of Sibson’s α-mutual information (e.g., minimizing the Rényi divergence over Q_{Y|Z} or over Q_Z) and derive closed-form expressions for them. They verify that these measures recover Shannon’s conditional mutual information in the limit α→1 and connect to conditional maximal leakage as α→∞. However, it remains unclear whether a chain rule analogous to Shannon’s—relating the joint dependence I_α(X,(Y,Z)) to the marginal and conditional terms—holds for any of the proposed definitions.
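To make the objects concrete, the unconditional Sibson measure admits a well-known closed form obtained from a Hölder-type minimization over Q_Y; applying the same argument to the Q_{Y|Z}-minimizing conditional variant yields the second expression below. The conditional formula is our own reconstruction from the definition, not a quotation of the paper:

```latex
I_\alpha(X;Y) \;=\; \frac{\alpha}{\alpha-1}\,
  \log \sum_{y}\Bigl(\sum_{x} P_X(x)\,P_{Y|X}(y|x)^{\alpha}\Bigr)^{1/\alpha},

I^{Y|Z}_\alpha(X,Y|Z) \;=\; \frac{1}{\alpha-1}\,
  \log \sum_{z} P_Z(z)
  \Bigl[\sum_{y}\Bigl(\sum_{x} P_{X|Z}(x|z)\,P_{Y|XZ}(y|x,z)^{\alpha}\Bigr)^{1/\alpha}\Bigr]^{\alpha}.
```

Both expressions reduce to the same formula when Z is trivial, which is a useful sanity check on the conditional reconstruction.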

This open question targets a fundamental structural property of conditional α-information measures and, if resolved, could clarify their suitability as operational analogues to mutual information in settings involving side information or conditioning.
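The question can at least be probed numerically. The sketch below—an illustration under stated assumptions, not the authors' code—implements the standard closed form of Sibson's I_α and a closed form for the Q_{Y|Z}-minimizing conditional variant derived by the same Hölder-type argument (the function names and the conditional formula are our own), then searches random full-support distributions for violations of the candidate chain rule:

```python
import numpy as np

def sibson_mi(p_xy, alpha):
    """Sibson's I_alpha(X,Y) via the closed form
    alpha/(alpha-1) * log sum_y (sum_x P(x) P(y|x)^alpha)^(1/alpha)."""
    p_x = p_xy.sum(axis=1)                                 # P(x)
    p_y_x = p_xy / p_x[:, None]                            # P(y|x)
    inner = (p_x[:, None] * p_y_x ** alpha).sum(axis=0)    # sum_x P(x) P(y|x)^alpha
    return alpha / (alpha - 1) * np.log((inner ** (1 / alpha)).sum())

def cond_sibson_mi(p_xyz, alpha):
    """Candidate I^{Y|Z}_alpha(X,Y|Z) = min_{Q_{Y|Z}} D_alpha(P_XYZ || P_{X|Z} Q_{Y|Z} P_Z).
    Closed form from the same Hoelder-type minimization as the unconditional
    case (our derivation, not quoted from the paper)."""
    p_z = p_xyz.sum(axis=(0, 1))                           # P(z)
    p_xz = p_xyz.sum(axis=1)                               # P(x,z)
    p_x_z = p_xz / p_z[None, :]                            # P(x|z)
    p_y_xz = p_xyz / p_xz[:, None, :]                      # P(y|x,z)
    f = (p_x_z[:, None, :] * p_y_xz ** alpha).sum(axis=0)  # f(y,z) = sum_x P(x|z) P(y|x,z)^alpha
    per_z = (f ** (1 / alpha)).sum(axis=0) ** alpha        # [sum_y f(y,z)^(1/alpha)]^alpha
    return np.log((p_z * per_z).sum()) / (alpha - 1)

# Random search for violations of I_a(X,(Y,Z)) <= I_a(X,Y) + I^{Y|Z}_a(X,Y|Z).
rng = np.random.default_rng(0)
alpha = 2.0
violations = 0
for _ in range(500):
    p = rng.random((3, 3, 3))
    p /= p.sum()                                           # random full-support P_XYZ
    lhs = sibson_mi(p.reshape(3, 9), alpha)                # I_alpha(X,(Y,Z))
    rhs = sibson_mi(p.sum(axis=2), alpha) + cond_sibson_mi(p, alpha)
    if lhs > rhs + 1e-10:
        violations += 1
print("violations found:", violations)
```

A run that reports violations would settle the question negatively for this particular definition; a run that finds none is only numerical evidence, not a proof.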

References

Moreover, we do not know whether any of these satisfies a mutual-information-like chain rule, i.e., I_α(X,(Y,Z)) ≤ I_α(X,Y) + I^{•}_α(X,Y|Z).

Sibson's α-Mutual Information and its Variational Representations (2405.08352 - Esposito et al., 2024), Section 6 (Extension to Conditional Sibson's α-Mutual Information)