A Class of Subadditive Information Measures and their Applications
Abstract: We introduce a two-parameter family of discrepancy measures, termed \emph{$(G,f)$-divergences}, obtained by applying a non-decreasing function $G$ to an $f$-divergence $D_f$. Building on Csiszár's formulation of mutual $f$-information, we define a corresponding $(G,f)$-information measure $I_{G,f}(X;Y)$. A central theme of the paper is subadditivity over product distributions and product channels. We develop reduction principles showing that, for broad classes of $G$, it suffices to verify divergence subadditivity on binary alphabets. Specializing to the functions $G(x)\in\{x,\;\log(1+x),\;-\log(1-x)\}$, we derive tractable sufficient conditions on $f$ that guarantee subadditivity, covering many standard $f$-divergences. Finally, we present applications to finite-blocklength converses for channel coding, bounds in binary hypothesis testing, and an extension of the Shannon--Gallager--Berlekamp sphere-packing exponent framework to subadditive $(G,f)$-divergences.
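For concreteness, the central objects suggested by the abstract's wording can be sketched as follows; this is a reading of the abstract only, and the precise regularity conditions on $G$ and $f$, as well as the exact form of $I_{G,f}$ built on Csiszár's mutual $f$-information, are those specified in the paper.
\[
D_{G,f}(P\,\|\,Q) \;:=\; G\bigl(D_f(P\,\|\,Q)\bigr),
\]
and subadditivity over product distributions is the statement
\[
D_{G,f}(P_1\times P_2\,\|\,Q_1\times Q_2)\;\le\; D_{G,f}(P_1\,\|\,Q_1) \;+\; D_{G,f}(P_2\,\|\,Q_2),
\]
with the analogous inequality for product channels. The reduction principles mentioned above assert that, for suitable $G$, it is enough to check this inequality when all distributions are supported on two-element alphabets.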