
Noisy Torn Paper Channel (TPC)

Updated 23 January 2026
  • The Noisy Torn Paper Channel is an information-theoretic model that fragments messages into unordered, noisy pieces with possible deletions.
  • It applies probabilistic combinatorics and coding theory to analyze channel capacity and develop error-correcting codes for applications like DNA storage and forensic recovery.
  • Key decoding methods include marker-based schemes and hierarchical redundancy, which efficiently reconstruct original messages from the unordered multiset of fragments.

The noisy Torn Paper Channel (noisy TPC) encompasses a class of information-theoretic models in which a message is randomly fragmented into unordered pieces, potentially subject to deletions and various forms of noise. The decoding task is to reconstruct the original message from this unordered multiset of potentially noisy fragments. These models capture core features of storage and readout processes in molecular storage (e.g., DNA), forensics, and distributed data systems where fragmentation, shuffling, noise, and partial loss are simultaneously present. Theoretical analysis of the noisy TPC leverages probabilistic combinatorics, coding theory, and information theory to characterize channel capacity, error-correcting code constructions, and algorithmic trade-offs.

1. Formal Definition and Models

The noisy Torn Paper Channel is generally defined as follows: a codeword $x \in \Sigma^n$ is subjected to (1) a sequence of edit or substitution errors, (2) random or adversarial fragmentation into a multiset of substrings, (3) shuffling (unordered output), and optionally (4) deletion of fragments. The decoder receives this unordered, possibly incomplete, noisy set and must recover $x$.

Key variants include:

  • Probabilistic TPC: Fragmentation by placing independent random cuts (e.g., a $\mathrm{Geometric}(p_n)$ fragment-length distribution), yielding random-length fragments that are then presented in random order (Shomorony et al., 2020).
  • Adversarial TPC: Arbitrary placement of a bounded number of breaks and up to $t_e$ arbitrary edit errors (insertions, deletions, substitutions) before or after fragmentation; all choices are adversarial (Abu-Sini et al., 20 Jan 2026, Bar-Lev et al., 2022).
  • Noisy TPC with Substitutions: Each bit is flipped independently with probability $p_s$ before fragmentation; each fragment is thus corrupted by a Binary Symmetric Channel (BSC) (Walter et al., 16 Jan 2026, Ravi et al., 2024).
  • TPC with Fragment Deletions: After fragmentation, each fragment is erased with length-dependent or independent probability, modeling partial coverage or physical loss (Ravi et al., 2024).
  • Noisy DPC Model: In the Gaussian dirty-paper coding (DPC) setting, "torn paper" refers to additive state-dependent noise with side information available in noisy form at the transmitter/receiver (0901.2934).

The TPC generalizes both the string shuffling channel and multi-fragment molecular storage models.
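
The fragmentation-shuffle-delete pipeline above can be sketched as a small simulator. The function below is a minimal illustration only; the parameter names and the bit-list representation are assumptions, not taken from the cited papers:

```python
import random

def torn_paper_channel(x, p_cut, p_sub=0.0, p_del=0.0, seed=None):
    """Minimal sketch of a noisy TPC acting on a list of bits:
    (1) BSC substitutions, (2) geometric fragmentation,
    (3) fragment deletions, (4) shuffling into an unordered multiset."""
    rng = random.Random(seed)
    # (1) flip each bit independently with probability p_sub
    bits = [b ^ 1 if rng.random() < p_sub else b for b in x]
    # (2) cut after each position independently with probability p_cut,
    # giving Geometric(p_cut)-distributed fragment lengths
    fragments, start = [], 0
    for i in range(1, len(bits)):
        if rng.random() < p_cut:
            fragments.append(bits[start:i])
            start = i
    fragments.append(bits[start:])
    # (3) erase each fragment independently with probability p_del
    fragments = [f for f in fragments if rng.random() >= p_del]
    # (4) the decoder sees the surviving fragments in random order
    rng.shuffle(fragments)
    return fragments
```

With `p_sub = p_del = 0` the output is an exact unordered partition of the input, which is the noiseless TPC of Section 2.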

2. Capacity Results and Operational Characterization

The capacity of the noisy TPC depends crucially on the fragmentation distribution, fragment deletions, and substitution/edit error models. Several key closed-form and asymptotic results have been established:

Noiseless TPC (Random Fragmentation, Binary)

For i.i.d. $\mathrm{Geometric}(p_n)$ fragment lengths with $p_n \log n \rightarrow \alpha$, the asymptotic channel capacity is

$C = e^{-\alpha}$

bits per symbol, where $\alpha$ encodes the expected breakage rate (Shomorony et al., 2020). This is strictly greater than the capacity under deterministic fragmentation into equal-sized blocks, $C_\mathrm{det} = (1 - \alpha)^+$, so random-length fragments yield higher capacity, a consequence of occasional long fragments that increase decodability.
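
The gap between random and deterministic tearing can be checked numerically; the $\alpha$ values below are chosen arbitrarily for illustration:

```python
import math

# Compare C = e^{-alpha} (random geometric tearing) with
# C_det = (1 - alpha)^+ (deterministic equal-length tearing)
# at a few illustrative breakage rates alpha.
for alpha in (0.25, 0.5, 1.0, 2.0):
    c_random = math.exp(-alpha)
    c_det = max(1.0 - alpha, 0.0)
    print(f"alpha={alpha}: C={c_random:.3f}, C_det={c_det:.3f}")
```

For every $\alpha > 0$ the random-tearing capacity strictly dominates; past $\alpha = 1$ the deterministic capacity is zero while $e^{-\alpha}$ remains positive.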

Noisy TPC with Substitutions and Deletions

Let $p_s$ be the substitution probability and $N_i$ the fragment lengths; then, for sufficiently long fragments, the capacity admits a universal "coverage minus alignment" form (Ravi et al., 2024):

$C = (1 - H(p_s))\, F - A$

where $F$ is the limiting fraction of input bits covered by surviving, sufficiently long fragments, and $A$ is the alignment cost, i.e., the information lost to ambiguity in fragment order. For a per-fragment deletion probability $d$, $F$ decreases accordingly. When fragment lengths are large and deletions rare, $A$ can be made arbitrarily small and $C \approx 1 - H(p_s)$.
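
The rate expression is easy to evaluate directly; the helper names `h2` and `tpc_rate` below are my own, not from the paper:

```python
import math

def h2(p):
    """Binary entropy H(p) in bits, with H(0) = H(1) = 0 by convention."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def tpc_rate(p_s, F, A):
    """'Coverage minus alignment' rate C = (1 - H(p_s)) * F - A."""
    return (1 - h2(p_s)) * F - A

# Full coverage and negligible alignment cost: C approaches 1 - H(p_s).
print(tpc_rate(0.05, F=1.0, A=0.0))
```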

Specifically, for the noisy TPC with substitution probability $p_s$,

$C(p_s, \alpha) \leq (1 - H(p_s))\, e^{-\alpha}$

provides an upper bound (Walter et al., 16 Jan 2026), matching the form for $p_s = 0$.

Adversarial Models

In the deterministic adversarial TPC, where the message is broken into segments of length $\ell = \lceil a \log n \rceil$ and the segments are presented unordered, the optimal capacity is

$R^* = 1 - \frac{1}{a}$

for any $a > 1$ (Bar-Lev et al., 2022). This matches the intuitive packing limit dictated by the index length required per segment.
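
A quick sanity check of this packing limit, under the natural interpretation that each length-$\ell$ segment must spend roughly $\log_2 n$ bits on an index (numbers illustrative):

```python
import math

n, a = 2 ** 20, 2.0
seg_len = math.ceil(a * math.log2(n))    # segment length ~ a * log2(n)
index_bits = math.ceil(math.log2(n))     # index overhead per segment
rate = (seg_len - index_bits) / seg_len  # data fraction left over
print(rate)                              # equals 1 - 1/a here
```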

When up to $t_e$ edit errors and $t$ breaks can occur adversarially, the redundancy lower bound scales as $\Omega((t + t_e) \log(n/(t + t_e)))$, and explicit codes can achieve redundancy within a $\log \log n$ factor of this bound (Abu-Sini et al., 20 Jan 2026).

3. Code Constructions and Decoding Algorithms

Coding for the noisy TPC involves both error correction within fragments and robust reassembly from unordered, possibly lossy or noisy, multiset output. Two main families have emerged:

Marker-Based Schemes

  • Static indexed markers: Embed unique static patterns (e.g., short binary sequences plus de Bruijn indices) at regular intervals within the message. After fragmentation, these markers enable localization and identification of fragments (Walter et al., 16 Jan 2026, Bar-Lev et al., 2022).
  • Parity bits: Parity-checks derived from block data are attached to each fragment, used to resolve ambiguities when fragment matching is not unique.
  • Nested locality-sensitive hash (LSH) markers: Data-dependent, multi-layer parity-vote or hash bits are assigned to chunks and recursively grouped, facilitating both robust fragment matching and resistance to moderate substitution error (Walter et al., 16 Jan 2026).
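
A toy version of the static-marker idea: embed a fixed pattern plus a block index before each data block, then locate fragments by scanning for the pattern. The 5-bit marker and the widths below are made up for illustration; real constructions use de Bruijn or mutually uncorrelated patterns precisely to avoid false marker matches inside the data:

```python
def add_markers(data_blocks, marker=(1, 0, 1, 1, 0)):
    """Toy static-marker encoder: prefix each fixed-length data block with
    a static marker followed by the block index in binary (MSB first)."""
    idx_width = max(1, (len(data_blocks) - 1).bit_length())
    out = []
    for i, block in enumerate(data_blocks):
        index_bits = [(i >> k) & 1 for k in reversed(range(idx_width))]
        out.extend(marker)
        out.extend(index_bits)
        out.extend(block)
    return out

def locate_fragment(fragment, marker=(1, 0, 1, 1, 0), idx_width=2):
    """Scan a fragment for the first complete marker and read the index
    that follows it; return None if no full marker+index is present."""
    m = list(marker)
    for j in range(len(fragment) - len(m) - idx_width + 1):
        if fragment[j:j + len(m)] == m:
            bits = fragment[j + len(m):j + len(m) + idx_width]
            return sum(b << k for k, b in enumerate(reversed(bits)))
    return None
```

Any fragment long enough to contain one full marker-plus-index can thus be placed; shorter fragments must be resolved by parities or hashes as described above.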

Hierarchical Redundancy and Reed–Solomon Protection

Advanced constructions for the adversarial-noise and edit-error TPC utilize:

  • Mutually uncorrelated markers spread across the codeword to anchor fragments.
  • Adjacency matrices and hierarchical hash trees of the marker positions, each protected by systematic Reed–Solomon codes over extended alphabets for resilience to multiple errors and erasures.
  • Multi-stage decoding: (1) recover redundant marker/hierarchy layers via RS-decoding, (2) reconstruct the sequence of markers/anchors, (3) iteratively reconstruct blocks using hierarchical hashes, correcting erasures and errors at each level (Abu-Sini et al., 20 Jan 2026).

Decoding Techniques

  • Beam search: Both static-marker and nested-hash approaches typically employ beam search strategies over fragment placements/configurations, with width tuned to hardware limits and expected fragment distribution (Walter et al., 16 Jan 2026).
  • Greedy affixing: In substitution-only channels, greedily affix fragments in order using markers, then check/repair using block parities or hashes (Abu-Sini et al., 20 Jan 2026).
  • Permutation search: For edit errors, all $(t+1)!$ concatenations of the fragments are examined, leveraging redundancy to identify correct assemblies.
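
The permutation-search step can be sketched directly. Here the validity predicate is passed in as a black box; a real code would check embedded parities or hashes, whereas this toy example simply compares against the known message:

```python
from itertools import permutations

def permutation_decode(fragments, is_valid):
    """Try every ordering of the unordered fragments and return the first
    concatenation that the redundancy check `is_valid` accepts.
    Cost grows factorially in the number of fragments."""
    for perm in permutations(fragments):
        candidate = [b for frag in perm for b in frag]
        if is_valid(candidate):
            return candidate
    return None
```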

Complexity

  • Marker-based and hierarchical-decoding approaches achieve decoding in $O(n)$ or $O(n^3)$ time (with polynomial factors in $t$) in low-noise or low-break settings (Bar-Lev et al., 2022, Abu-Sini et al., 20 Jan 2026).
  • For adversarial edit channels, total decoding time is $O((t+1)!\, n^4)$.

4. Performance Benchmarks and Empirical Evaluation

Experimental and theoretical studies show that:

  • Both static-marker schemes and nested locality-sensitive-hash marker schemes reach practical reconstruction rates exceeding $99\%$ for sequence lengths up to $n \approx 1440$ and substitution rates $p_s \leq 1.8\%$ (Walter et al., 16 Jan 2026).
  • Data-dependent markers (nested LSH) yield higher rates at low noise (e.g., stride-2 LSH achieves $R \approx 0.78$ at $p_s = 0.4\%$), but static markers are more robust at higher substitution or adversarial error rates.
  • Beam-search failure, rather than code ambiguity, is the main cause of errors at high $\alpha$ (expected number of breaks) or high $p_s$ (Walter et al., 16 Jan 2026).
  • For adversarial edit/break channels, the constructed codes achieve redundancy within $O(\log \log n)$ of optimal, and list-decodability is polynomial in the break-overrun parameter $t' - t$ (Abu-Sini et al., 20 Jan 2026).

A summary of performance dependence on channel parameters:

| Scheme | Subst. tolerance | Redundancy | Decoding complexity |
|---|---|---|---|
| Static markers | Moderate–high | $O(\log n)$ | $O(k t^2 d)$ (beam search) |
| Nested LSH markers | Higher | $< O(\log n)$ | $O(B_{\max}\, t^2 d)$ |
| Hierarchical redundancy | High (edit, $t$) | $O((t+t_e) \log n \log\log n)$ | $O((t+1)!\, n^4)$ |

5. Extensions: Insertions, Deletions, and Generalized Noise

Practical variants incorporate:

  • Edit Errors: Insertions, deletions, and substitutions before fragmentation are handled via hierarchical code design, mutually uncorrelated markers, and successive RS-decoding (Abu-Sini et al., 20 Jan 2026).
  • Fragment Deletion: Lossy TPCs require fewer markers per block and higher redundancy to correct for missing fragments; the "coverage fraction" $F$ reflects the erasure probability in the aggregate rate (Ravi et al., 2024).
  • Multi-strand Settings: Codes extend directly to multi-string systems by parallelizing segmentations and indexings (Bar-Lev et al., 2022).
  • Noisy DPC Analogy: In the continuous Gaussian regime, the "noisy torn paper" model becomes a channel with simultaneous additive state (interference) and AWGN, where noisy side information at encoder/decoder is quantified through a residual noise parameter $\mu$ (0901.2934).

6. Applications and Open Problems

The noisy TPC and its generalizations model core phenomena in:

  • DNA Data Storage: Fragmentation, unordered molecule sequencing, and sequence decay map directly onto the TPC with substitution and deletion (Shomorony et al., 2020, Walter et al., 16 Jan 2026, Ravi et al., 2024).
  • Molecular Forensics: Partial recovery, short or noisy trace fragments, and unordered sample collection.
  • Robust Communication: Applicable to networks with packet fragment loss/reordering, or physical layer attacks that induce highly non-uniform loss.

Current open problems include: explicit high-rate, low-complexity code constructions matching the probabilistic TPC capacity $e^{-\alpha}$ for arbitrary alphabets (Shomorony et al., 2020), robust codes for insertion/deletion noise with optimal redundancy (Abu-Sini et al., 20 Jan 2026), and algorithmic advances to reduce beam-search complexity at scale (Walter et al., 16 Jan 2026). A plausible implication is that further integration of marker-based coding with advanced hash-based matching and erasure/error correction may be necessary to approach TPC theoretical limits at practical storage sizes and error rates.
