Trading information complexity for error II: the case of a large error and external information complexity
Abstract: Two problems are studied in this paper. (1) How much external or internal information cost is required to compute a Boolean-valued function with error at most $1/2-\epsilon$ for a small $\epsilon$? It is shown that information cost of order $\epsilon^2$ is necessary and of order $\epsilon$ is sufficient. (2) How much external information cost can be saved when computing a function with a small error $\epsilon>0$, compared to the case when no error is allowed? It is shown that information cost of order at least $\epsilon$ and at most $h(\sqrt{\epsilon})$ can be saved, where $h$ is the binary entropy function. Except for the $O(h(\sqrt{\epsilon}))$ upper bound, the other three bounds are tight. For the distribution $\mu$ that is uniform on $(0,0)$ and $(1,1)$, it is shown that $IC^{ext}_\mu(\mathrm{XOR}, \epsilon)=1-2\epsilon$, where XOR is the two-bit xor function. This equality appears to be the first example of an exact information complexity when an error is allowed.
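The separation between the bounds in the abstract can be made concrete numerically. The sketch below, assuming $h$ denotes the binary entropy function as is standard in this literature, compares the orders $\epsilon^2$, $\epsilon$, and $h(\sqrt{\epsilon})$ for a few small values of $\epsilon$, and evaluates the exact value $1-2\epsilon$ for the two-bit XOR example:

```python
import math

def binary_entropy(p):
    """Binary entropy h(p) = -p*log2(p) - (1-p)*log2(1-p), with h(0)=h(1)=0."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def xor_ext_ic(eps):
    """Exact external information complexity of two-bit XOR under the
    distribution uniform on (0,0) and (1,1), per the abstract: 1 - 2*eps."""
    return 1 - 2 * eps

for eps in (0.1, 0.01, 0.001):
    print(f"eps={eps}:"
          f" eps^2={eps**2:.6f},"
          f" eps={eps:.4f},"
          f" h(sqrt(eps))={binary_entropy(math.sqrt(eps)):.4f},"
          f" IC_ext(XOR, eps)={xor_ext_ic(eps):.3f}")
```

For small $\epsilon$ the three quantities $\epsilon^2 \ll \epsilon \ll h(\sqrt{\epsilon})$ spread apart, which is the gap the paper's tightness results address.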