
Stochastic Coded Caching with Optimized Shared-Cache Sizes and Reduced Subpacketization

Published 28 Dec 2021 in cs.IT and math.IT (arXiv:2112.14114v1)

Abstract: This work studies the $K$-user broadcast channel with $\Lambda$ caches, where the association between users and caches is random: each user appears within the coverage area of, and is subsequently assisted by, a specific cache according to a given probability distribution. The caches are subject to a cumulative memory constraint equal to $t$ times the size of the library. We provide a scheme consisting of three phases (storage allocation, content placement, and delivery) and show that an optimized storage allocation across the caches, together with a modified uncoded cache placement and delivery strategy, alleviates the adverse effect of cache-load imbalance by significantly reducing the multiplicative performance deterioration due to randomness. In a nutshell, our scheme substantially mitigates the impact of cache-load imbalance in stochastic networks, and it also mitigates the well-known subpacketization bottleneck relative to the best-known state of the art: in deterministic settings it achieves the same delivery time, which was proven to be close to optimal for bounded values of $t$, with an exponential reduction in subpacketization.
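As background for the subpacketization bottleneck the abstract refers to, the classical Maddah-Ali--Niesen coded caching scheme splits each library file into a number of subfiles that grows as a binomial coefficient in the number of users, which is exponential for proportional cache sizes. The sketch below only computes this well-known baseline quantity; it does not reproduce the proposed scheme's (smaller) subpacketization, whose exact expression is not given in the abstract.

```python
from math import comb

def mn_subpacketization(K: int, t: int) -> int:
    """Subfiles per file in the classical Maddah-Ali--Niesen scheme:
    each file is split into C(K, t) parts, where t = K * M / N is the
    cumulative (normalized) cache budget. This baseline growth is the
    bottleneck that reduced-subpacketization schemes target."""
    return comb(K, t)

# Illustrative growth for a fixed cache ratio t = K/5 (assumed values):
for K in (10, 20, 40):
    t = K // 5
    print(f"K={K}, t={t}: {mn_subpacketization(K, t)} subfiles per file")
```

Even at these modest network sizes the subfile count explodes, which is why an exponential reduction in subpacketization is practically significant.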

Citations (2)
