Two-Bit Mutual Information in AIT
- Two-bit mutual information is defined as I(x:y)=C(x)+C(y)-C(x,y), capturing the shared information between finite binary strings as the maximal secret key length.
- The approach leverages interactive protocols with shared randomness and a Kolmogorov–Slepian–Wolf compressor to enable nearly optimal secret key agreement.
- Tight lower and upper bounds illustrate that achieving optimal key rates necessitates communication near conditional complexity thresholds, with results precise up to logarithmic terms.
The two-bit operational characterization of mutual information in the context of algorithmic information theory (AIT) assigns a concrete, cryptographically meaningful interpretation to Kolmogorov’s plain mutual information for finite binary strings x and y. Up to logarithmic precision, the mutual information I(x:y) corresponds exactly to the maximal length of a shared secret key that two parties, each holding one string and the complexity profile of the pair, can agree upon when interacting over a public channel using probabilistic protocols. This result provides a quantitative bridge between AIT and the notion of privacy amplification as studied in cryptography, demonstrating that mutual information determines protocol limits for secret key agreement in a one-shot, non-stochastic setting (Romashchenko et al., 2017).
1. Algorithmic Mutual Information and Plain Kolmogorov Complexity
Let C(x) denote the plain Kolmogorov complexity of a binary string x and C(x,y) the joint complexity of the pair (x,y). The algorithmic mutual information is then given by
I(x:y) = C(x) + C(y) - C(x,y).
This definition satisfies I(x:y) = I(y:x) + O(log n) and, by the Kolmogorov chain rule C(x,y) = C(x) + C(y|x) + O(log n), also
I(x:y) = C(y) - C(y|x) + O(log n),
which ensures all precision errors are within O(log n) for strings of length n. The plain and prefix complexity (K) formulations are interchangeable up to O(log n) additive slack. This operational view of I(x:y) applies regardless of probability distributions; it characterizes information shared in individual strings (Romashchenko et al., 2017).
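As a rough computable illustration (not part of the paper), one can replace the uncomputable C(·) with the output length of a standard compressor; the resulting quantity mimics I(x:y) = C(x) + C(y) - C(x,y) on concrete strings. The function names and the use of zlib here are illustrative assumptions:

```python
import zlib

def C(s: bytes) -> int:
    # Compressed length as a crude, computable stand-in for the plain
    # Kolmogorov complexity C(s) (an upper bound, up to compressor overhead).
    return len(zlib.compress(s, 9))

def I(x: bytes, y: bytes) -> int:
    # Mirrors the definition I(x:y) = C(x) + C(y) - C(x,y),
    # with concatenation standing in for the pair (x,y).
    return C(x) + C(y) - C(x + y)

x = b"abracadabra" * 50
print(C(x), I(x, x))  # a highly redundant string shares roughly C(x) bits with itself
```

Such compressor proxies only upper-bound the true complexities, so the resulting "mutual information" is heuristic; it is useful here purely to make the definition tangible.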
2. Communication Model and Complexity Profile
In the proposed model, two parties engage in interactive communication to agree upon a secret key:
- Inputs: Alice is given x, Bob is given y, and the complexity profile (C(x), C(y), C(x,y)) is known to both.
- Protocol: The protocol proceeds in public rounds using a shared random seed r. In each round, Alice and Bob exchange messages (t_1, t_2, ..., t_k), where each message depends on the sender's input string, the shared randomness, and the previous transcript.
- Secret Key Requirement: Both compute an output z using their input, the seed r, and the full transcript t. The protocol is correct if:
- (i) With probability at least 1 - ε, both outputs match: z_A = z_B = z.
- (ii) The agreed key is almost incompressible given the public information:
C(z | t, r) ≥ |z| - δ,
with deficiency δ = O(log n). Thus, z is nearly uniformly random conditioned on the public transcript (Romashchenko et al., 2017).
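The round structure of this model can be sketched with a small driver; the function names and the trivial demo protocol below are hypothetical, meant only to make the information flow explicit (no secrecy is claimed):

```python
def run_protocol(alice_msg, bob_msg, alice_out, bob_out, x, y, seed, rounds):
    # Drive the interactive model: in each round Alice, then Bob, posts a
    # public message that may depend on their own input, the shared seed,
    # and the transcript so far.
    transcript = []
    for _ in range(rounds):
        transcript.append(alice_msg(x, seed, tuple(transcript)))
        transcript.append(bob_msg(y, seed, tuple(transcript)))
    t = tuple(transcript)
    # Each party's output depends only on its own input, the seed, and t.
    return alice_out(x, seed, t), bob_out(y, seed, t), t

# Toy one-round "protocol": Alice announces |x|; both derive the same
# (entirely non-secret) string from the seed and the transcript.
z_a, z_b, t = run_protocol(
    lambda x, s, t: str(len(x)),
    lambda y, s, t: "ack",
    lambda x, s, t: s + t[0],
    lambda y, s, t: s + t[0],
    "1010", "1100", "r0", 1)
assert z_a == z_b
```

The point of the skeleton is the dependency structure: messages and outputs never see the other party's input directly, only the public transcript and the shared seed.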
3. Optimal Secret Key Length: Achievability and Impossibility
The main result consists of tight lower and upper bounds for achievable key length under the above model.
Lower Bound (Achievability)
For every ε > 0, there exists a one-round randomized protocol with public randomness such that, for any inputs x, y of length at most n, with probability at least 1 - ε, Alice and Bob can agree on a key z with
|z| ≥ I(x:y) - O(log(n/ε))
and secrecy deficiency δ = O(log(n/ε)). The necessary public communication is
C(x|y) + O(log(n/ε)) bits.
The protocol relies on a Kolmogorov–Slepian–Wolf compressor: Alice sends a “fingerprint” of her input of roughly C(x|y) bits, enabling Bob to reconstruct x. Both parties then apply an extractor-based procedure to derive a key z of length near I(x:y). The output is, up to logarithmic slack, random conditioned on the public transcript and random seed (Romashchenko et al., 2017).
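A toy end-to-end version of this one-round scheme can be sketched as follows. All names are hypothetical: SHA-256 prefixes stand in for the random hash families, and a brute-force search over a small Hamming ball replaces the general Kolmogorov–Slepian–Wolf decoder (this only works when x and y are close in Hamming distance):

```python
import hashlib
import itertools

def hbits(tag, s, k):
    # First k bits of SHA-256(tag || s): a stand-in for a random hash family.
    h = int(hashlib.sha256((tag + s).encode()).hexdigest(), 16)
    return format(h, "0256b")[:k]

def alice_fingerprint(x, seed, k):
    # Alice's one-round message: a k-bit "fingerprint" (random binning),
    # where k plays the role of C(x|y) + O(log n).
    return hbits("fp" + seed, x, k)

def bob_reconstruct(y, fp, seed, k, max_dist):
    # Bob searches candidates close to y (here: small Hamming distance)
    # and keeps the one matching Alice's fingerprint.
    n = len(y)
    for d in range(max_dist + 1):
        for pos in itertools.combinations(range(n), d):
            cand = list(y)
            for p in pos:
                cand[p] = "1" if cand[p] == "0" else "0"
            cand = "".join(cand)
            if hbits("fp" + seed, cand, k) == fp:
                return cand
    return None

def extract_key(x, seed, key_len):
    # Extractor stand-in: hash x with the shared seed under a separate tag.
    return hbits("key" + seed, x, key_len)

x = "1011001110001011"
# y differs from x in exactly two positions
y = "".join(c if i not in (3, 10) else ("1" if c == "0" else "0")
            for i, c in enumerate(x))

fp = alice_fingerprint(x, "seed", 24)
x_rec = bob_reconstruct(y, fp, "seed", 24, max_dist=2)
assert x_rec == x
key = extract_key(x_rec, "seed", 40)
assert key == extract_key(x, "seed", 40)
```

With 24-bit fingerprints and only ~137 candidates in the Hamming ball, a spurious match is overwhelmingly unlikely; the real construction replaces both hash steps with explicit extractors and achieves the stated C(x|y)-scale communication.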
Upper Bound (Impossibility)
No r-round protocol with public randomness and error ε, for any fixed ε < 1/2, can succeed with |z| significantly exceeding I(x:y) + δ, where δ is the secrecy deficiency. This follows from an “information non-increase” argument: combinatorial-rectangle lemmas show that interaction over a public channel cannot create new mutual information, so that
C(z | t, r) ≤ I(x:y) + O(log n).
Thus, I(x:y) is a sharp threshold for the length of the shared secret (Romashchenko et al., 2017).
4. Communication Complexity Thresholds
If the protocol communicates fewer than C(x|y) - O(log n) bits, then for all inputs and any nontrivial success probability ε, the maximum secret key length
|z| ≈ I(x:y)
cannot be achieved. This statement is established using deep AIT lower bounds (Muchnik, Razenshteyn) on extractable common information. Therefore, communication below the Slepian–Wolf threshold C(x|y) forces the resulting key to be vanishingly short (Romashchenko et al., 2017).
5. Role of the Complexity Profile
The complexity profile for a pair (x, y) is
(C(x), C(y), C(x,y)),
or, equivalently (via the chain rule, up to logarithmic terms), the tuple (C(x|y), C(y|x), I(x:y)). Knowledge of the profile is essential for protocol optimality: it permits each party to determine suitable compression/fingerprint lengths, ensuring that minimal communication suffices for input reconstruction and maximal key agreement. Without the profile, protocol parameters cannot be optimally chosen (Romashchenko et al., 2017).
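A minimal sketch of how the profile could drive parameter choices; the helper name and the exact slack term are assumptions for illustration, not the paper's construction:

```python
import math

def protocol_parameters(C_x, C_y, C_xy, n):
    # Hypothetical helper: derive fingerprint and key lengths from the
    # complexity profile, with an O(log n) slack term.
    C_x_given_y = C_xy - C_y            # chain rule, up to O(log n)
    I_xy = C_x + C_y - C_xy             # mutual information
    slack = max(math.ceil(math.log2(n)), 1)
    return {"fingerprint_bits": C_x_given_y + slack,
            "key_bits": max(I_xy - slack, 0)}

print(protocol_parameters(100, 100, 150, 64))
```

For this sample profile (C(x) = C(y) = 100, C(x,y) = 150, n = 64), the helper budgets a 56-bit fingerprint and a 44-bit key: the fingerprint length tracks C(x|y), the key length tracks I(x:y), each within the logarithmic slack.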
6. Precision, Limitations, and Operational Significance
All foregoing bounds hold up to O(log n) additive terms, or O(log(n/ε)) when tracking statistical error probability ε. The achievable protocol is computable but not generally time-efficient, requiring brute-force decoding; in special cases (e.g., strings within bounded Hamming distance), polynomial time is attainable using error-correcting codes in place of Kolmogorov compressors. The upper bound on key length applies only to protocols with public randomness; the private-randomness scenario remains unresolved. In sum, the result provides an operational meaning to algorithmic mutual information: up to logarithmic precision,
I(x:y) = C(x) + C(y) - C(x,y)
characterizes the secret key rate for public-channel agreement protocols. This operational and cryptographic perspective on AIT unifies two fundamental notions of information and privacy in a protocol-independent, non-stochastic framework (Romashchenko et al., 2017).
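For the bounded-Hamming-distance special case mentioned above, syndrome-based coding yields the polynomial-time variant. A minimal sketch using the Hamming(7,4) code (chosen here for concreteness; the paper's construction is more general), correcting a single differing bit: Alice's entire message is the 3-bit syndrome of x, from which Bob recovers x given y:

```python
# Parity-check matrix of the Hamming(7,4) code: column j (1-indexed)
# is the 3-bit binary representation of j, most significant bit first.
H = [[(j >> i) & 1 for j in range(1, 8)] for i in (2, 1, 0)]

def syndrome(bits):
    # 3-bit syndrome H.bits mod 2 -- Alice's entire message about x.
    return tuple(sum(h * b for h, b in zip(row, bits)) % 2 for row in H)

def reconstruct(y, s_x):
    # Bob: the XOR of the two syndromes is the syndrome of the error
    # pattern, which for a single flipped bit reads off its 1-indexed
    # position directly.
    s_e = [a ^ b for a, b in zip(s_x, syndrome(y))]
    pos = int("".join(map(str, s_e)), 2)   # 0 means x == y
    x = list(y)
    if pos:
        x[pos - 1] ^= 1
    return x

x = [1, 0, 1, 1, 0, 0, 1]
y = [1, 0, 0, 1, 0, 0, 1]        # differs from x in exactly one position
assert reconstruct(y, syndrome(x)) == x
```

Bob learns x from only 3 communicated bits, illustrating how, for structured inputs, communication on the order of C(x|y) suffices without any brute-force decoding.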