On the Finiteness of the Capacity of Continuous Channels
Abstract: Evaluating the channel capacity is one of the key problems in information theory. In this work we derive rather mild sufficient conditions under which the capacity is finite and achievable. These conditions are derived for generic, memoryless, and possibly non-linear additive noise channels. The results are based on a novel sufficient condition that guarantees the convergence of differential entropies under pointwise convergence of probability density functions. Perhaps surprisingly, the finiteness of channel capacity holds for the majority of setups, including those where inputs and outputs have possibly infinite second moments.
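As context for the quantities discussed in the abstract, a standard formulation of the memoryless additive noise channel and its capacity may be sketched as follows (this is the textbook definition, not a result specific to this paper):

```latex
% Additive noise channel: output Y is the (possibly non-linearly transformed)
% input X corrupted by noise Z independent of X.
Y = X + Z, \qquad Z \perp X.

% Capacity under an input constraint set \mathcal{P} (e.g., a moment constraint):
C = \sup_{P_X \in \mathcal{P}} I(X;Y)
  = \sup_{P_X \in \mathcal{P}} \bigl[ h(Y) - h(Y \mid X) \bigr]
  = \sup_{P_X \in \mathcal{P}} \bigl[ h(X+Z) - h(Z) \bigr],

% where h(\cdot) denotes differential entropy:
h(Y) = -\int f_Y(y) \log f_Y(y)\, \mathrm{d}y.
```

The paper's sufficient conditions concern when the differential entropies appearing above converge under pointwise convergence of the densities, which in turn yields finiteness and achievability of $C$.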