(Von Neumann gave an algorithm that simulates a fair coin given access to identical biased coins. The algorithm may require an unbounded number of coin tosses, although finitely many suffice in expectation. This question concerns the case where the number of coin tosses is bounded.)
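For reference, here is a minimal sketch of von Neumann's trick (assuming the coin is modelled as a bit with $\Pr[1] = (1+\delta)/2$): toss the coin in pairs and output the first bit of the first unequal pair.

```python
import random

def biased_coin(delta):
    """One toss of a coin with Pr[1] - Pr[0] = delta, i.e. Pr[1] = (1 + delta) / 2."""
    return 1 if random.random() < (1 + delta) / 2 else 0

def von_neumann(delta):
    """Von Neumann's trick: toss the coin in pairs and output the first bit of
    the first unequal pair.  The pairs (0, 1) and (1, 0) are equally likely,
    so the output is exactly unbiased, but the number of tosses used is
    unbounded (though finite in expectation)."""
    while True:
        a, b = biased_coin(delta), biased_coin(delta)
        if a != b:
            return a
```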
Suppose we have $n$ identical coins with bias $\delta = \Pr[\mathrm{Heads}] - \Pr[\mathrm{Tails}]$. The aim is to simulate a single coin toss while minimizing the bias.
The simulation must be efficient in the following sense: an algorithm $A$ running in polynomial time reads $n$ random bits and outputs a single bit. The bias of the algorithm is defined as $\mathrm{Bias}(A) = \left|\Pr[A(x)=0] - \Pr[A(x)=1]\right|$, where the probability is taken over $n$ i.i.d. bits $x = (x_1,\ldots,x_n)$ with $\Pr[x_i=1] - \Pr[x_i=0] = \delta$.
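To make the definition concrete, here is a sketch that computes $\mathrm{Bias}(A)$ exactly for a given Boolean function over the $\delta$-biased product distribution; the two candidate rules at the end (parity and majority) are only illustrations, not claimed to be optimal.

```python
from itertools import product

def exact_bias(f, n, delta):
    """Bias(A) = |Pr[f(x) = 0] - Pr[f(x) = 1]| when x_1, ..., x_n are i.i.d.
    bits with Pr[x_i = 1] - Pr[x_i = 0] = delta."""
    p1 = (1 + delta) / 2   # Pr[x_i = 1]
    p0 = (1 - delta) / 2   # Pr[x_i = 0]
    diff = 0.0
    for x in product((0, 1), repeat=n):
        weight = 1.0
        for bit in x:
            weight *= p1 if bit else p0
        diff += weight if f(x) == 0 else -weight
    return abs(diff)

# Illustrative candidate rules (not claimed optimal):
parity = lambda x: sum(x) % 2                   # XOR of the n bits
majority = lambda x: int(sum(x) > len(x) / 2)   # majority vote

print(exact_bias(parity, 5, 0.3))    # parity drives the bias down to delta^n
print(exact_bias(majority, 5, 0.3))  # majority retains a much larger bias
```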
Which polynomial-time algorithm $A$ has the least bias $\mathrm{Bias}(A)$?
This question seems very natural to me and it is very likely that it has been considered before.
What is known about this problem? Is anything known when a weaker class of algorithms ($AC^0$, etc.) is considered?
I don't have a proof of optimality (perhaps it is not optimal), but the reason I guessed so is that it would be true if the expression were instead $\sum_S \hat{f}(S)^2 \delta^{|S|}$, since that is then a convex combination.
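(To spell out the convex-combination point, under the standard Fourier conventions for $f:\{-1,1\}^n\to\{-1,1\}$, which I am assuming is the setup here, Parseval's identity gives
$$\sum_{S\subseteq[n]}\hat{f}(S)^2=\mathbb{E}\big[f(x)^2\big]=1,$$
so $\sum_S \hat{f}(S)^2\delta^{|S|}$ is a weighted average of the values $\delta^{|S|}$; for $0<\delta<1$ it is minimized by placing all the weight on $S=[n]$, i.e. by the parity function, which gives $\delta^n$.)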
– Ramprasad Sep 27 '10 at 05:37