A Single-Letter Upper Bound on the Feedback Capacity of Unifilar Finite-State Channels

Published 7 Apr 2016 in cs.IT and math.IT | arXiv:1604.01878v1

Abstract: An upper bound on the feedback capacity of unifilar finite-state channels (FSCs) is derived. A new technique, called $Q$-contexts, is based on the construction of a directed graph that is used to recursively quantize the receiver's output sequences into a finite set of contexts. For any choice of $Q$-graph, the feedback capacity is bounded by a single-letter expression, $C_\text{fb}\leq \sup I(X,S;Y|Q)$, where the supremum is over $P_{X|S,Q}$ and the distribution of $(S,Q)$ is their stationary distribution. The bound is shown to be tight for all unifilar FSCs whose feedback capacity is known: channels where the state is a function of the outputs, the trapdoor channel, Ising channels, the no-consecutive-ones input-constrained erasure channel, and the memoryless channel. Its efficiency is also demonstrated by deriving a new capacity result for the dicode erasure channel (DEC): the upper bound is obtained directly from the above expression, and its tightness is established via a general sufficient condition for the optimality of the upper bound. This sufficient condition is based on a fixed-point principle for the BCJR equation and is, in fact, formulated as a simple lower bound on the feedback capacity of unifilar FSCs for arbitrary $Q$-graphs. The upper bound suggests that a single-letter expression might exist for the capacity of finite-state channels, with or without feedback, based on the construction of an auxiliary random variable with a specified structure, such as the $Q$-graph, rather than with an i.i.d. distribution. The upper bound also serves as a non-trivial bound on the capacity of channels without feedback, a problem that is still open.
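
To make the bound concrete, below is a minimal numerical sketch, not taken from the paper: a hypothetical binary unifilar FSC together with a toy $Q$-graph that remembers the most recent output. The script builds the joint $(S,Q)$ Markov chain induced by one fixed input policy $P_{X|S,Q}$, finds its stationary distribution, and evaluates the objective $I(X,S;Y|Q)$ at that policy. The channel law, state update, $Q$-graph, and policy are all illustrative assumptions; the upper bound itself is the supremum of this objective over all policies, so a real computation would wrap this evaluation in an optimizer.

```python
import numpy as np
from itertools import product

# --- Hypothetical toy setup (NOT a channel from the paper) -----------------
# Unifilar FSC: channel law P(y | x, s) plus a deterministic state update
# s' = f(s, x, y). Everything is binary to keep the sketch small.
X = [0, 1]   # channel inputs
S = [0, 1]   # channel states
Y = [0, 1]   # channel outputs
EPS = 0.2

def P_y_given_xs(y, x, s):
    # State-dependent BSC: crossover EPS in state 0, crossover 1-EPS in state 1.
    flip = EPS if s == 0 else 1.0 - EPS
    return flip if y != x else 1.0 - flip

def f(s, x, y):
    # Unifilar state evolution: here the next state is simply the current input.
    return x

# Q-graph: a deterministic walk on a finite node set, driven by the outputs.
# Toy choice: the node remembers the most recent output.
Qn = [0, 1]

def g(q, y):
    return y

# One arbitrary input policy P(x | s, q); an optimizer would search over these.
rng = np.random.default_rng(0)
policy = rng.dirichlet(np.ones(len(X)), size=(len(S), len(Qn)))  # policy[s, q, x]

# Transition matrix of the joint (S, Q) Markov chain induced by the policy.
n = len(S) * len(Qn)
idx = lambda s, q: s * len(Qn) + q
T = np.zeros((n, n))
for s, q, x, y in product(S, Qn, X, Y):
    p = policy[s, q, x] * P_y_given_xs(y, x, s)
    T[idx(s, q), idx(f(s, x, y), g(q, y))] += p

# Stationary distribution pi(s, q): left Perron eigenvector of T.
w, v = np.linalg.eig(T.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi /= pi.sum()

# Joint law P(q, s, x, y) under the stationary distribution and the policy.
joint = np.zeros((len(Qn), len(S), len(X), len(Y)))
for s, q, x, y in product(S, Qn, X, Y):
    joint[q, s, x, y] = pi[idx(s, q)] * policy[s, q, x] * P_y_given_xs(y, x, s)

def H(p):
    # Shannon entropy in bits of a probability array (zeros are skipped).
    p = np.asarray(p).ravel()
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log2(p)))

# I(X,S;Y|Q) = H(Y|Q) - H(Y|X,S,Q)
#            = H(Q,Y) - H(Q) - H(Q,S,X,Y) + H(Q,S,X)
I = (H(joint.sum(axis=(1, 2)))       # H(Q, Y)
     - H(joint.sum(axis=(1, 2, 3)))  # H(Q)
     - H(joint)                      # H(Q, S, X, Y)
     + H(joint.sum(axis=3)))         # H(Q, S, X)

print(f"I(X,S;Y|Q) at this policy: {I:.4f} bits")
```

In this sketch the supremum over $P_{X|S,Q}$ could be approximated by a grid search or alternating maximization over the policy simplex; the fixed-point sufficient condition from the paper is what certifies when the resulting value is also achievable, and hence equals the feedback capacity.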

Citations (28)
