Data-aided Active User Detection with a User Activity Extraction Network for Grant-free SCMA Systems

Published 22 May 2022 in eess.SY, cs.IT, cs.LG, cs.SY, and math.IT | arXiv:2205.10780v2

Abstract: In grant-free sparse code multiple access (GF-SCMA) systems, active user detection (AUD) is a major performance bottleneck because it involves a complex combinatorial problem, which makes the joint design of contention resources for users and AUD at the receiver a crucial but challenging task. To this end, we propose autoencoder (AE)-based joint optimization of both the preamble generation networks (PGNs) on the encoder side and data-aided AUD on the decoder side. The core of the proposed AE is a novel user activity extraction network (UAEN) in the decoder, which extracts a priori user activity information from the SCMA codeword data for the data-aided AUD. End-to-end training of the proposed AE jointly optimizes the contention resources, i.e., the preamble sequences, each associated with one of the codebooks, and the extraction of user activity information from both the preamble and the SCMA-based data transmission. Furthermore, we propose a self-supervised pre-training scheme for the UAEN, applied before the end-to-end training, to ensure convergence of the UAEN, which lies deep inside the AE network. Simulation results demonstrate that the proposed AUD scheme achieves a 3 to 5 dB gain at a target activity detection error rate of $\mathbf{10^{-3}}$ compared with state-of-the-art DL-based AUD schemes.
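The data-aided idea in the abstract — fusing preamble-based detection with a priori activity information extracted from the data part — can be illustrated with a toy sketch. This is not the paper's architecture: the real PGN and UAEN are trained neural networks, whereas here the learned preambles are random unit-norm stand-ins and the UAEN output is replaced by a hand-set prior vector; all function names (`transmit`, `preamble_correlator`, `data_aided_aud`) and parameters (`w`, the fusion weight) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
K, L_PRE = 6, 8  # number of users and preamble length (illustrative)

# Stand-in for PGN outputs: one unit-norm complex preamble per user/codebook.
preambles = rng.standard_normal((K, L_PRE)) + 1j * rng.standard_normal((K, L_PRE))
preambles /= np.linalg.norm(preambles, axis=1, keepdims=True)

def transmit(active, snr_db=10.0):
    """Superimpose the preambles of active users and add AWGN (toy channel)."""
    y = active @ preambles  # sum of active users' preambles
    noise_std = 10 ** (-snr_db / 20)
    noise = rng.standard_normal(L_PRE) + 1j * rng.standard_normal(L_PRE)
    return y + noise_std * noise / np.sqrt(2)

def preamble_correlator(y):
    """Matched-filter statistic per user (preamble-only AUD baseline)."""
    return np.abs(preambles.conj() @ y)

def data_aided_aud(y, data_prior, w=0.5):
    """Fuse preamble correlations with a UAEN-style a priori activity vector.

    `data_prior` stands in for the UAEN output (per-user activity
    probabilities inferred from the SCMA data part); `w` weights the
    prior against the normalized preamble correlations.
    """
    corr = preamble_correlator(y)
    corr = corr / corr.max()
    score = (1 - w) * corr + w * data_prior
    return score > 0.5

# Example: users 1 and 4 are active; the stand-in "UAEN" gives a confident prior.
active = np.zeros(K)
active[[1, 4]] = 1.0
y = transmit(active)
prior = np.where(active == 1, 0.9, 0.1)
print(data_aided_aud(y, prior))
```

The fusion weight `w` interpolates between preamble-only detection (`w=0`) and a decision driven purely by the data-extracted prior (`w=1`); the paper's end-to-end training effectively learns this combination rather than fixing it by hand.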

Citations (4)
