
Segmented GRAND: Complexity Reduction through Sub-Pattern Combination

Published 24 May 2023 in cs.IT, eess.SP, and math.IT | arXiv:2305.14892v2

Abstract: The ordered reliability bits (ORB) variant of guessing random additive noise decoding (GRAND), known as ORBGRAND, achieves remarkably low time complexity at high code rates compared to other GRAND variants. However, its computational complexity remains higher than that of other near-ML universal decoders such as ordered-statistics decoding (OSD). To address this, we propose segmented ORBGRAND, which partitions the error-pattern search space based on code properties, generates syndrome-consistent sub-patterns (reducing the number of invalid error patterns), and combines them in a near-ML order using sub-weights derived from two-level integer partitions of the logistic weight. Numerical results show that segmented ORBGRAND reduces the average number of queries by at least 66% across all SNRs and cuts the number of basic operations by more than an order of magnitude, depending on the segmentation and code rate. Further efficiency gains come from reusing pre-generated shared sub-patterns, which reduces average decoding time. Furthermore, with abandonment ($b=10^5$ or smaller), segmented ORBGRAND provides a 0.2 dB power gain over ORBGRAND. Finally, we give an analytical justification for why the logistic-weight-based ordering of error patterns in ORBGRAND closely approximates the ML order, and we discuss the underlying assumptions of ORBGRAND.
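The ordering the abstract refers to can be illustrated with a small sketch: plain (unsegmented) ORBGRAND enumerates error patterns in increasing logistic weight, where each pattern of logistic weight $w$ corresponds to a partition of $w$ into distinct parts, each part being a reliability rank (1 = least reliable bit). This is an illustrative reconstruction under those standard ORBGRAND conventions, not the paper's segmented variant; the function names and the recursive partition routine are our own.

```python
def distinct_partitions(w, max_part):
    """Yield partitions of w into distinct parts, each <= max_part,
    in decreasing-first-part order."""
    def helper(remaining, cap):
        if remaining == 0:
            yield []
            return
        # Largest admissible part first; parts strictly decrease,
        # which enforces distinctness.
        for part in range(min(remaining, cap), 0, -1):
            for rest in helper(remaining - part, part - 1):
                yield [part] + rest
    yield from helper(w, max_part)

def orbgrand_patterns(n, max_lw):
    """Enumerate candidate error patterns for a length-n block in
    increasing logistic weight, up to max_lw.

    Each pattern is a list of 1-indexed reliability ranks to flip
    (rank 1 = least reliable received bit).
    """
    for w in range(1, max_lw + 1):
        for parts in distinct_partitions(w, n):
            yield parts
```

For example, the first few patterns for `n = 8` are `[1], [2], [3], [2, 1], [4], [3, 1], ...` — the decoder would flip those rank positions in the hard-decision word and stop at the first valid codeword. Segmented ORBGRAND, per the abstract, instead generates syndrome-consistent sub-patterns per segment and merges them by sub-weights, so far fewer of these candidates are ever queried.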

Citations (2)


Authors (2)
