Zero-Shot Knowledge Base Resizing for Rate-Adaptive Digital Semantic Communication

Published 2 Feb 2026 in cs.IT (arXiv:2602.01829v1)

Abstract: Digital semantic communication systems, which often build on the Vector Quantized Variational Autoencoder (VQ-VAE) framework, are pivotal for future wireless networks. In a VQ-VAE-based semantic communication system, the transmission rate is directly governed by the size of a discrete codebook known as the knowledge base (KB). However, the KB size is a fixed hyperparameter, so adapting the rate requires training and storing a separate model for each desired size, a practice whose computation and storage costs preclude truly granular rate control. To address this, we introduce a principled, zero-shot KB resizing method that enables on-the-fly rate adaptation without any retraining. Our approach establishes a global importance ranking over all vectors in a single, large parent KB by uncovering its inherent semantic hierarchy. This is achieved via a three-step framework: 1) embedding the KB vectors in hyperbolic space to reveal their hierarchical relationships; 2) constructing a master semantic tree with a minimum spanning tree algorithm; and 3) enabling instant resizing by iteratively pruning the least important leaf nodes. Extensive simulations show that our method achieves reconstruction quality nearly identical to that of dedicated KBs trained from scratch while requiring only a fraction of the computational budget. Moreover, it is markedly more robust at very low rates, where conventional KBs fail catastrophically. Our work resolves a fundamental limitation of VQ-VAE-based semantic communication systems, offering a practical and efficient path toward flexible, rate-adaptive semantic communication.
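The three-step framework in the abstract can be sketched in a few lines. This is a simplified illustration, not the authors' exact method: it uses Euclidean distances between codebook vectors in place of the paper's hyperbolic embedding, and takes the length of a leaf's tree edge as a stand-in importance score (a short edge suggests a near-duplicate vector). It also illustrates the rate link: with a codebook of size K, each transmitted index costs log2(K) bits, so pruning the KB directly lowers the rate.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

def prune_codebook(codebook, target_size):
    """Shrink a parent codebook to target_size by building a minimum
    spanning tree over the vectors and repeatedly deleting the leaf
    with the shortest edge to its neighbor (a stand-in importance
    score; the paper instead ranks vectors via a hyperbolic embedding).
    Assumes all codebook vectors are distinct."""
    K = codebook.shape[0]
    # Pairwise Euclidean distances between codebook vectors.
    d = np.linalg.norm(codebook[:, None, :] - codebook[None, :, :], axis=-1)
    # Master tree over the KB (scipy ignores zero entries, i.e. the diagonal).
    mst = minimum_spanning_tree(d).toarray()
    adj = (mst > 0) | (mst.T > 0)          # symmetric tree adjacency
    alive = np.ones(K, dtype=bool)
    while alive.sum() > target_size:
        # Leaves of the surviving tree: alive nodes with one alive neighbor.
        deg = (adj & alive[None, :]).sum(axis=1)
        leaves = np.where(alive & (deg == 1))[0]
        # Prune the leaf with the shortest tree edge (least important).
        costs = [d[l, np.where(adj[l] & alive)[0][0]] for l in leaves]
        alive[leaves[int(np.argmin(costs))]] = False
    return codebook[alive]

rng = np.random.default_rng(0)
parent = rng.normal(size=(16, 4))          # parent KB: 16 vectors, 4-dim
child = prune_codebook(parent, 8)          # instant resize, no retraining
print(child.shape)                         # (8, 4)
print(np.log2(len(parent)), "->", np.log2(len(child)))  # 4.0 -> 3.0 bits/index
```

Because removing a leaf never disconnects a tree, every intermediate size remains a valid tree, so the same parent KB yields a nested family of codebooks for granular rate adaptation.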
