An unbiased estimate for the mean of a {0,1} random variable with relative error distribution independent of the mean

Published 20 Sep 2013 in math.ST, cs.CC, math.PR, and stat.TH (arXiv:1309.5413v2)

Abstract: Say $X_1,X_2,\ldots$ are independent identically distributed Bernoulli random variables with mean $p$. This paper builds a new estimate $\hat p$ of $p$ with the property that the distribution of the relative error, $\hat p/p - 1$, does not depend in any way on the value of $p$. This allows the construction of exact confidence intervals for $p$ of any desired level without needing any sort of limit or approximation. In addition, $\hat p$ is unbiased. For $\epsilon$ and $\delta$ in $(0,1)$, to obtain an estimate where $\mathbb{P}(|\hat p/p - 1| > \epsilon) \leq \delta$, the new algorithm takes on average at most $2\epsilon^{-2} p^{-1}\ln(2\delta^{-1})(1 - (14/3)\epsilon)^{-1}$ samples. It is also shown that any such algorithm that applies whenever $p \leq 1/2$ requires at least $0.2\epsilon^{-2} p^{-1}\ln((2-\delta)\delta^{-1})(1 + 2\epsilon)$ samples. The same algorithm can also be applied to estimate the mean of any random variable that falls in $[0,1]$.
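One way to realize an estimator with these properties is a Gamma Bernoulli scheme: draw Bernoulli samples until a fixed number $k$ of successes is seen, and for every trial (success or failure) add an independent Exp(1) draw to a running total $T$. By the thinning property of Poisson processes, $T \sim \mathrm{Gamma}(k, \text{rate } p)$, so $pT \sim \mathrm{Gamma}(k,1)$ and $\hat p = (k-1)/T$ has a relative-error distribution free of $p$, with $\mathbb{E}[\hat p] = p$. The sketch below is a minimal illustration of this construction, not a verbatim transcription of the paper; the function name `gbas` and its interface are this note's own.

```python
import random

def gbas(sample, k, rng=random):
    """Sketch of a Gamma Bernoulli approximation scheme.

    sample: zero-argument callable returning True with unknown probability p.
    k:      number of successes to wait for (controls accuracy).

    Draws Bernoulli samples until k successes have occurred, adding an
    Exp(1) variate to T for *every* trial along the way.  Then
    T ~ Gamma(k, rate p), so p * T ~ Gamma(k, 1) regardless of p, and
    (k - 1) / T is an unbiased estimate of p whose relative-error
    distribution, (k - 1) / Gamma(k, 1) - 1, does not depend on p.
    """
    successes = 0
    t = 0.0
    while successes < k:
        t += rng.expovariate(1.0)   # one Exp(1) per Bernoulli trial
        if sample():
            successes += 1
    return (k - 1) / t
```

Because $\hat p/p - 1$ has the known distribution $(k-1)/\mathrm{Gamma}(k,1) - 1$, an exact confidence interval for $p$ follows directly from Gamma quantiles, with no normal approximation; the relative standard deviation scales roughly like $1/\sqrt{k}$.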

Citations (2)


Authors (1)
