
Privacy Guarantees in Posterior Sampling under Contamination

Published 12 Mar 2024 in math.ST and stat.TH (arXiv:2403.07772v2)

Abstract: In recent years, differential privacy has been adopted by tech companies and governmental agencies as the standard for measuring privacy in algorithms. In this article, we study differential privacy in Bayesian posterior sampling settings. We begin by considering the most common privatization setting, in which Laplace or Gaussian noise is injected into the output. To achieve better differential privacy, we then adopt {\em Huber's contamination model} for use within privacy settings: {\em instead} of injecting noise into the output, we replace randomly chosen data points with samples from a heavy-tailed distribution. We derive bounds on the differential privacy level $(\epsilon,\delta)$ of our approach without imposing the bounded observation and parameter spaces commonly required by existing approaches in the literature. We further study the effect of the sample size on the privacy level and on the convergence rate of $(\epsilon,\delta)$ to zero. Asymptotically, our contamination approach is fully private at no cost in information. Finally, we present examples of inference models to which our setup applies, together with theoretical estimates of the convergence rate and supporting simulations.
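The two privatization strategies the abstract contrasts can be sketched as follows. This is a minimal illustration, not the paper's method: the choice of a Student-t as the heavy-tailed contaminating distribution and the contamination rate `alpha` are assumptions made here for concreteness.

```python
import numpy as np

rng = np.random.default_rng(0)

def laplace_privatize(output, scale):
    """Standard output perturbation: add Laplace noise to the released
    statistic (the baseline setting the paper starts from)."""
    return output + rng.laplace(loc=0.0, scale=scale, size=np.shape(output))

def huber_contaminate(data, alpha, heavy_tail_df=1.0):
    """Huber-style contamination of the *input*: each data point is
    replaced, independently with probability alpha, by a draw from a
    heavy-tailed distribution (a Student-t here, standing in for the
    paper's contaminating law). The posterior is then computed on the
    contaminated sample rather than on noised output."""
    data = np.asarray(data, dtype=float)
    mask = rng.random(data.shape) < alpha
    replacements = rng.standard_t(heavy_tail_df, size=data.shape)
    return np.where(mask, replacements, data)

# Example: contaminate 5% of a sample before running posterior inference.
sample = rng.normal(loc=2.0, scale=1.0, size=1000)
private_sample = huber_contaminate(sample, alpha=0.05)
```

With `alpha = 0` the data are untouched, and larger `alpha` trades statistical efficiency for stronger privacy; the paper's results quantify the $(\epsilon,\delta)$ guarantee of this kind of mechanism as the sample size grows.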
