Locally List-Decodable Codes
- Locally List-Decodable Codes (LLDCs) are error-correcting codes that permit fast, local recovery of message symbols even when many errors are present.
- They integrate techniques like tensor composition, high-dimensional expander methods, and affine-invariant algorithms to optimize rate, query complexity, and list size.
- LLDCs have significant applications in distributed storage, PCP constructions, and cryptographic protocols by achieving near-capacity performance under adversarial noise.
A locally list-decodable code (LLDC) is an error-correcting code equipped with an efficient randomized algorithm that, given oracle access to a received word corrupted in a large fraction of coordinates, outputs a short list of local decoders such that, for every codeword sufficiently close to the received word, some decoder in the list recovers each message symbol of that codeword with high probability using only a few queries. This property is a local, sublinear-time analog of classical list decoding, and is central both to complexity theory (hardness amplification, pseudorandom-generator constructions) and to high-efficiency data transmission in the presence of large corruption rates. Locally list-decodable codes generalize locally decodable codes (LDCs) and locally correctable codes (LCCs) to the strong adversarial regime where unique decoding is impossible.
1. Formal Definitions and Problem Statement
Let $C \subseteq \Sigma^n$ denote a block code over alphabet $\Sigma$, with rate $R$ and minimum relative distance $\delta$. A code is $(q, \ell, \alpha)$-locally list-decodable if, for any received word $w$ agreeing with some codeword on at least an $\alpha$ fraction of coordinates, there exists a randomized procedure producing local algorithms $A_1, \dots, A_\ell$ such that for every message position $i$ and for some $j \in [\ell]$, $A_j$ recovers the $i$-th message symbol with probability at least $2/3$ using at most $q$ queries to $w$ (Hemenway et al., 2017). In the more general setting of locally list-recoverable codes, the input is a word of lists $S_1, \dots, S_n$, each $S_i \subseteq \Sigma$ of bounded size, and the decoder must recover the message symbols of every codeword $c$ satisfying $c_i \in S_i$ except at an $\alpha$ fraction of coordinates.
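In one common notation, writing $w$ for the received word, $A_1, \dots, A_\ell$ for the output local algorithms, and $m(c)$ for the message encoded by codeword $c$ (these symbol choices are ours, not fixed by the source), the local list-decoding guarantee can be stated compactly:

$$
\forall\, c \in C \text{ with } \mathrm{agree}(c, w) \ge \alpha n:\quad
\exists\, j \in [\ell]\ \ \forall\, i \in [k]:\quad
\Pr\bigl[A_j^{\,w}(i) = m(c)_i\bigr] \ \ge\ 2/3,
$$

where each $A_j$ makes at most $q$ oracle queries to $w$.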
Approximate LLDCs (aLLDCs) relax this requirement to recovery of the message on a $1 - \varepsilon$ fraction of coordinates, for some fixed $\varepsilon > 0$ (Dikstein et al., 30 Jan 2026). The critical trade-off parameters are the code rate as a function of error tolerance (ideally approaching information-theoretic capacity), the query complexity (preferably polylogarithmic or sublinear in the block length $n$), the list size $\ell$, and efficiency (decoding time and circuit depth).
2. Core Construction Paradigms
Multiple paradigms underpin the construction of LLDCs, each exploiting a distinct combinatorial, algebraic, or high-dimensional expansion property.
Tensor Power and Composition
The tensor power construction applies a high-rate globally list-recoverable code $C \subseteq \Sigma^n$ along each axis to produce its $t$-fold tensor power $C^{\otimes t} \subseteq \Sigma^{n^t}$. For fixed $t$, $C^{\otimes t}$ is globally list-recoverable, with rate, distance, and error tolerance inherited as tensor powers of the base parameters. Approximate locality is achieved by querying random low-dimensional slices; pre-encoding with a high-rate locally decodable code (LDC) corrects the remaining local errors, yielding truly local list-decoding (Hemenway et al., 2017). The resulting code attains sublinear query complexity while keeping its rate close to that of the base code.
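The axis-by-axis encoding can be made concrete with a toy sketch (not the construction from Hemenway et al.): a small binary $[7,4]$ code, chosen purely for illustration, is tensored with itself over GF(2). Every row and every column of the result is a base-code codeword, which is what axis-parallel local queries exploit.

```python
import random

# Generator matrix of a toy [n=7, k=4] binary code (illustrative choice).
G = [[1, 0, 0, 0, 0, 1, 1],
     [0, 1, 0, 0, 1, 0, 1],
     [0, 0, 1, 0, 1, 1, 0],
     [0, 0, 0, 1, 1, 1, 1]]
K, N = len(G), len(G[0])

def matmul_gf2(A, B):
    """Matrix product over GF(2)."""
    return [[sum(a * b for a, b in zip(row, col)) % 2
             for col in zip(*B)] for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

def tensor_encode(M):
    """Encode a k x k message M with C ⊗ C: encode every row with C,
    then every column of the intermediate matrix."""
    rows_encoded = matmul_gf2(M, G)                  # k x n
    return matmul_gf2(transpose(G), rows_encoded)    # n x n

M = [[random.randint(0, 1) for _ in range(K)] for _ in range(K)]
C2 = tensor_encode(M)

# Rate multiplies as (k/n)^2 = 16/49 for this toy instance.
assert len(C2) == N and len(C2[0]) == N
# Encoding columns first gives the same tensor codeword (associativity).
cols_first = matmul_gf2(matmul_gf2(transpose(G), M), G)
assert C2 == cols_first
```

The order-independence check at the end is the reason "the" tensor codeword is well defined: both encoding orders compute $G^{\top} M G$ over GF(2).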
High-Dimensional Expander Codes
Recent advances exploit high-dimensional expanders (HDXs), especially coset complexes, to build direct-product codes in which vertices correspond to group cosets and codewords are local restrictions consistent on overlapping neighborhoods. Polylog-round belief propagation, combined with explicit low-congestion routing on the HDX, supports local list-decoding in polylogarithmic time, achieving constant rate and error tolerance approaching the information-theoretic bound (Dikstein et al., 30 Jan 2026). The framework provides instantiations in both an inverse-polylogarithmic-rate regime and a constant-rate regime.
Affine-Invariant and Algebraic Geometric Codes
Affine-invariant families such as lifted Reed–Solomon codes admit local list-decoding by leveraging closure under affine maps and low-degree polynomial structure. By projecting multivariate codeword polynomials onto univariate polynomials over field extensions, algorithms can perform line- and plane-based local list-decoding, with error radius matching the Johnson bound and sublinear query complexity, at rates approaching $1$ (Guo et al., 2014).
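The line-restriction idea behind such decoders can be sketched in simplified unique-decoding form, with a majority vote over random lines standing in for full list-decoding; the prime field, degree bound, hidden polynomial, and noise rate below are all illustrative assumptions.

```python
import random
random.seed(7)

P = 97   # toy prime field
D = 3    # total-degree bound of the hidden polynomial

def codeword(x, y):
    """The 'codeword': evaluations of a hidden degree-3 bivariate polynomial."""
    return (2 * x**3 + 5 * x * y + 7 * y + 1) % P

def oracle(x, y):
    """Oracle access to a corrupted word: 5% of queries return garbage."""
    if random.random() < 0.05:
        return random.randrange(P)
    return codeword(x, y)

def lagrange_at(ts, vs, t0):
    """Evaluate the interpolant of the points (ts, vs) at t0, mod P."""
    acc = 0
    for i, (ti, vi) in enumerate(zip(ts, vs)):
        num = den = 1
        for j, tj in enumerate(ts):
            if j != i:
                num = num * (t0 - tj) % P
                den = den * (ti - tj) % P
        acc = (acc + vi * num * pow(den, P - 2, P)) % P
    return acc

def local_decode(x0, y0, trials=15):
    """Recover codeword(x0, y0): restrict to random lines through (x0, y0);
    each restriction is a univariate polynomial of degree <= D, so D + 1
    samples determine it. Vote across independent lines."""
    votes = {}
    for _ in range(trials):
        dx, dy = random.randrange(1, P), random.randrange(P)
        ts = random.sample(range(1, P), D + 1)   # t = 0 is the target point
        vs = [oracle((x0 + t * dx) % P, (y0 + t * dy) % P) for t in ts]
        guess = lagrange_at(ts, vs, 0)
        votes[guess] = votes.get(guess, 0) + 1
    return max(votes, key=votes.get)

assert local_decode(3, 5) == codeword(3, 5)
```

Each line is clean with probability about $0.95^{4} \approx 0.81$, so the majority vote succeeds with high probability; a genuine list-decoder would instead retain every candidate receiving substantial support.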
Locally Repairable and Homomorphism Codes
Locally repairable codes (LRCs) and homomorphism codes support list-decoding via their local structure. For example, for LRCs with locality $r$, combining local list-decoding on disjoint partitions with global reconciliation exceeds the Johnson bound for many parameter choices (Holzbaur et al., 2018). In the context of group homomorphism codes, certificate-based and combinatorial preprocessing techniques allow local list-decoding even up to the minimum distance for abelian or alternating group domains, with bounded list size and polylogarithmic query complexity (Babai et al., 2018).
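The locality being exploited can be illustrated with a minimal single-parity LRC sketch; the XOR parities and locality $r = 4$ are illustrative simplifications (real constructions such as Tamo–Barg use algebraic local groups, not plain parities).

```python
R = 4  # locality: each symbol is recoverable from R others in its group

def lrc_encode(msg):
    """Append one XOR parity per group of R message bits."""
    code = []
    for i in range(0, len(msg), R):
        group = msg[i:i + R]
        parity = 0
        for s in group:
            parity ^= s
        code.extend(group + [parity])
    return code

def local_repair(code, pos):
    """Recover an erased coordinate by querying only its local group."""
    g = pos // (R + 1)
    val = 0
    for j in range(g * (R + 1), (g + 1) * (R + 1)):
        if j != pos:
            val ^= code[j]
    return val

msg = [1, 0, 1, 1, 0, 1, 0, 0]
cw = lrc_encode(msg)
for pos in range(len(cw)):
    assert local_repair(cw, pos) == cw[pos]  # every coordinate repairs locally
```

The partition into disjoint local groups is exactly the structure that the partition-wise local list-decoding step in Holzbaur et al. operates on before global reconciliation.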
3. Main Theorems and Parameter Regimes
Key formal results on the existence and constructions of LLDCs include:
| Construction Paradigm | Rate | Query Complexity | List Size | Alphabet Size | Error Tolerance | Reference |
|---|---|---|---|---|---|---|
| Tensor + LDC (exp/const alpha) | — | — | — | — | — | (Hemenway et al., 2017) |
| HDX / coset complex | — | — | — | Binary | Approaches information-theoretic bound | (Dikstein et al., 30 Jan 2026) |
| Lifted Reed–Solomon | Approaches $1$ | — | — | — | Up to Johnson bound | (Guo et al., 2014) |
| LRC (Tamo–Barg construction) | Singleton-optimal | — | — | Large | Exceeds Johnson bound | (Holzbaur et al., 2018) |
| Homomorphism codes | Hadamard rate | Polylog | (group-dependent) | — | Up to minimum distance | (Babai et al., 2018) |
These results delineate known explicit and non-explicit codes matching or closely approaching the information-theoretic limits on rate versus error fraction for LLDCs, often within constant or polylog factors in query and list size.
4. Local List-Decoding Algorithms and Analysis
Standard LLDC frameworks rely on randomized sublinear sampling coupled with local consistency checks and iterative message-passing or belief propagation to resolve ambiguity. For tensor codes, random slicing and candidate propagation along the tensor axes yield approximate list-recovery, which is then boosted via LDC composition (Hemenway et al., 2017). On HDXs, polylog-round local sampling and explicit routing maintain candidate lists at each vertex, pruned by intersection and small-set samplers. Careful pruning and candidate-list management ensure that error does not accumulate multiplicatively across rounds; instead, it is controlled additively via local expansion and negative-correlation arguments, so that at least one candidate decoder agrees with the correct codeword on almost all positions (Dikstein et al., 30 Jan 2026).
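One standard ingredient of these analyses is amplification of the per-query success probability by independent repetition and majority vote; the stand-in weak decoder, binary alphabet, and repetition count below are illustrative.

```python
import random
from collections import Counter

random.seed(0)

def weak_local_decoder(true_symbol, p_correct=2/3):
    """Stand-in for a randomized local decoder correct with probability 2/3,
    as in the definitional guarantee."""
    if random.random() < p_correct:
        return true_symbol
    return 1 - true_symbol  # adversarially wrong binary answer

def amplified_decoder(true_symbol, reps=75):
    """Majority vote over independent runs; by a Chernoff bound the error
    probability decays exponentially in reps (at the cost of reps-fold
    query complexity)."""
    votes = Counter(weak_local_decoder(true_symbol) for _ in range(reps))
    return votes.most_common(1)[0][0]

hits = sum(amplified_decoder(1) == 1 for _ in range(200))
assert hits >= 195  # empirically near-certain success
```

This is why the constant $2/3$ in the definition is not special: any success probability bounded away from $1/2$ can be boosted at a polylogarithmic multiplicative cost in queries.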
In affine-invariant codes, the “line-and-plane” stitching algorithm reconstructs codeword values at a target index by repeated univariate and bivariate list-decodes on random lines and planes, exploiting the low-degree structure under field isomorphisms (Guo et al., 2014). For group codes, majority-vote sampling, subgroup decomposition, and combinatorial bucketing are employed to isolate candidate homomorphisms with high agreement with the received word (Babai et al., 2018).
5. Information-Theoretic Barriers and Open Problems
LLDCs, and their approximate variants, are subject to several lower bounds. Any locally list-decodable code must satisfy a trade-off relating its query complexity, rate, and list size (Dikstein et al., 30 Jan 2026). The standard rate–error trade-off is $R \le 1 - H_q(\rho)$, where $H_q$ is the $q$-ary entropy function and $\rho$ the error fraction; most capacity-achieving constructions approach this bound only with rapidly growing alphabet size or unbounded list size.
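The entropy bound is easy to evaluate numerically; a small sketch (the sample values of $\rho$ and $q$ are arbitrary):

```python
import math

def Hq(rho, q):
    """q-ary entropy function H_q(rho) for 0 <= rho < 1."""
    if rho == 0:
        return 0.0
    return (rho * math.log(q - 1, q)
            - rho * math.log(rho, q)
            - (1 - rho) * math.log(1 - rho, q))

def rate_capacity(rho, q):
    """List-decoding capacity: best achievable rate at error fraction rho."""
    return 1 - Hq(rho, q)

# Binary codes cannot list-decode beyond rho = 1/2 at positive rate:
assert abs(rate_capacity(0.5, 2)) < 1e-9
# Achievable rate shrinks as the error fraction grows:
assert rate_capacity(0.1, 2) > rate_capacity(0.2, 2)
# For large alphabets the bound tends toward the Singleton-style 1 - rho:
assert abs(rate_capacity(0.25, 2**20) - 0.75) < 0.05
```

The last assertion illustrates the remark above: approaching the clean $1 - \rho$ trade-off requires letting the alphabet size grow.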
Key open problems include eliminating the super-constant list-size growth in explicit tensor–LDC approaches, reducing query complexity to polylogarithmic in the block length, strengthening local correction to codeword symbols (not just message symbols), and achieving expander-based locality without derandomization or heavy precomputation (Hemenway et al., 2017).
6. Applications and Relation to Complexity Theory
LLDCs are central to several applications in probabilistically checkable proofs (PCPs), hardness amplification, pseudorandom generator design, and high-throughput distributed storage. The existence of high-rate, low-locality LLDCs enables capacity-achieving local coding in feasible parallel time (e.g., RNC circuits), optimal randomness-efficient PRGs under mild assumptions, and sub-polynomial-time distance and list-amplification (Dikstein et al., 30 Jan 2026). For instance, composition with good LDCs yields near-optimal parallel list-decodable codes, resolving a suite of classical open questions in coding and complexity theory. In practice, these methods yield efficient coding schemes for distributed storage with strong locality and worst-case resilience, as well as foundational primitives for average-case complexity and cryptographic constructions.
7. Connections, Generalizations, and Future Directions
LLDCs connect to broad areas of combinatorics, group theory, spectral expanders, and algebraic geometry. High-dimensional expander techniques have expanded the scope of list-decodable local codes from algebraic families to combinatorial ones, revealing new trade-offs in locality, rate, and error-tolerance. Affine-invariant and homomorphism-based codes broaden the universality of list-decoding up to the minimum distance, especially for group-theoretic structures.
Research directions include direct derandomized tensorization of high-rate list-recoverable codes, extending local list-decoding to more general algebraic or combinatorial frameworks, and systematic study of certificate list-decoding and domain relaxation in non-solvable groups (Babai et al., 2018). The universality of sublinear-time and parallel algorithms for LLDCs continues to be an area of active development, as does the explicit construction of capacity-achieving codes with optimal locality and efficient parallel decoding.