
Private Information Retrieval Over Gaussian MAC

Published 11 Jan 2020 in cs.IT and math.IT | (2001.03753v4)

Abstract: Consider the problem of Private Information Retrieval (PIR), where a user wishes to retrieve a single message from $N$ non-communicating and non-colluding databases (servers). All servers store the same set of $M$ messages and respond to the user over a block-fading Gaussian Multiple Access Channel (MAC). The goal in this setting is to keep the index of the required message private from the servers while minimizing the overall communication overhead. This work provides joint privacy and channel coding retrieval schemes for the Gaussian MAC with and without fading. The schemes exploit the linearity of the channel together with the Compute and Forward (CF) coding scheme. Consequently, single-user encoding and decoding suffice to retrieve the private message. In the case of a channel without fading, the achievable retrieval rate is shown to outperform a separation-based scheme, in which the retrieval and the channel coding are designed separately. Moreover, this rate is asymptotically optimal as the SNR grows, and is within a constant gap of $2$ bits per channel use of the channel capacity without privacy constraints, for all SNR values. When the channel suffers from fading, the asymmetry between the servers' channels forces a more complicated solution, which involves a hard optimization problem. Nevertheless, we provide a coding scheme and lower bounds on the expected achievable retrieval rate, which are shown to have the same scaling laws as the channel capacity, both in the number of servers and in the SNR.
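For intuition about the PIR setting described in the abstract, the following is a minimal sketch of the classic two-server, information-theoretic XOR-based PIR scheme (not the paper's Gaussian-MAC scheme; the function names are illustrative only). The user sends each server a query that looks uniformly random on its own, so neither server learns the desired index, yet the XOR of the two answers recovers exactly the requested message:

```python
import secrets

def xor_bytes(a, b):
    """Bitwise XOR of two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

def server_answer(messages, query):
    """Each server XORs together the messages whose indices are in its query set."""
    acc = bytes(len(messages[0]))
    for idx in query:
        acc = xor_bytes(acc, messages[idx])
    return acc

def retrieve(messages, i):
    """Privately retrieve messages[i] from two non-colluding replicated servers."""
    M = len(messages)
    # Query to server 1: a uniformly random subset of {0, ..., M-1}.
    q1 = {j for j in range(M) if secrets.randbits(1)}
    # Query to server 2: the same subset with index i toggled.
    # Individually, q1 and q2 are each uniform, hiding i from each server.
    q2 = q1 ^ {i}
    a1 = server_answer(messages, q1)
    a2 = server_answer(messages, q2)
    # All messages except messages[i] appear in both answers and cancel.
    return xor_bytes(a1, a2)

msgs = [secrets.token_bytes(8) for _ in range(4)]
assert retrieve(msgs, 2) == msgs[2]
```

The paper's contribution is, roughly, to merge this kind of linear retrieval step with channel coding for the Gaussian MAC, exploiting the channel's own linearity via Compute and Forward rather than decoding each server's answer separately.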

Citations (3)


Authors (2)
