Random noise increases Kolmogorov complexity and Hausdorff dimension

Published 14 Aug 2018 in cs.IT and math.IT (arXiv:1808.04626v3)

Abstract: Consider a binary string $x$ of length $n$ whose Kolmogorov complexity equals $\alpha n$ for some $\alpha<1$. We want to increase the complexity of $x$ by changing a small fraction of its bits. This is always possible: Buhrman, Fortnow, Newman and Vereshchagin (2005) showed that the increase can be at least $\delta n$ for large $n$ (where $\delta$ is some positive number that depends on $\alpha$ and the allowed fraction of changed bits). We consider a related question: what happens to the complexity of $x$ when we randomly change a small fraction of its bits (changing each bit independently with some probability $\tau$)? It turns out that a linear increase in complexity happens with high probability, but this increase is smaller than in the case of arbitrary changes. We note that the amount of the increase depends on $x$ (strings of the same complexity may behave differently), and we give exact lower and upper bounds for this increase (with $o(n)$ precision). The proof uses a combinatorial and probabilistic technique that goes back to Ahlswede, G\'acs and K\"orner (1976). For the reader's convenience (and also because we need a slightly stronger statement) we provide a simplified exposition of this technique, so the paper is self-contained.
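Kolmogorov complexity is uncomputable, so the paper's quantities cannot be measured directly. Still, the noise process from the abstract (flipping each bit independently with probability $\tau$) is easy to simulate, and compressed size can serve as a crude, computable stand-in for complexity. The following sketch is purely illustrative and not from the paper; the function names and the use of zlib as a proxy are this example's own assumptions.

```python
import random
import zlib


def flip_bits(x: str, tau: float, rng: random.Random) -> str:
    """Flip each bit of the binary string x independently with probability tau."""
    return "".join(b if rng.random() >= tau else str(1 - int(b)) for b in x)


def complexity_proxy(x: str) -> int:
    """Compressed size in bits -- a crude computable proxy, NOT Kolmogorov complexity."""
    # Pack the bit string into bytes before compressing.
    data = int(x, 2).to_bytes((len(x) + 7) // 8, "big") if x else b""
    return 8 * len(zlib.compress(data, 9))


rng = random.Random(0)
n = 1 << 15
# A highly compressible string, i.e. complexity well below n.
x = ("0" * 7 + "1") * (n // 8)
# Random noise with tau = 0.05, as in the abstract's model.
y = flip_bits(x, tau=0.05, rng=rng)
print(complexity_proxy(x), complexity_proxy(y))
```

On compressible inputs like the periodic string above, the noisy version typically compresses noticeably worse, mirroring (in proxy form) the linear complexity increase the paper proves; the exact bounds of the paper concern true Kolmogorov complexity and are not reproduced by any compressor.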

Authors (2)
Citations (3)
