
Many believe that derandomization with polynomial overhead is possible, i.e., $\mathsf{P} = \mathsf{BPP}$, because it follows from $2^{\Omega(n)}$ circuit lower bounds for $\mathsf{E}$ [IW97].
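For concreteness, the [IW97] implication can be stated as follows (here $\mathrm{SIZE}_L(n)$ is my shorthand, not notation from [IW97], for the minimum size of a circuit computing $L$ on inputs of length $n$):

```latex
% Impagliazzo--Wigderson (1997), stated as a conditional.
% SIZE_L(n): minimum circuit size for L on n-bit inputs (shorthand used here).
\[
  \exists L \in \mathsf{E} = \mathsf{DTIME}\!\left(2^{O(n)}\right),\ \exists \varepsilon > 0:
  \quad \mathrm{SIZE}_L(n) \ge 2^{\varepsilon n} \text{ for all sufficiently large } n
  \;\Longrightarrow\; \mathsf{P} = \mathsf{BPP}.
\]
```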

Do we have any evidence for or against even stronger (mildly average-case) derandomization? In particular, I'm interested in the question $\mathsf{BPTIME}(t) \subseteq \mathsf{ioHeur}_{\mathsf{o}(1)}\mathsf{DTIME}(t \cdot \log(t)^{\Theta(1)})/\log(t)$ for, say, a quasipolynomial time bound $t$, but any pointers would be helpful.

The question [Fine-grained complexity of BPP](https://cstheory.stackexchange.com/questions/39227/fine-grained-complexity-of-bpp) is related but unfortunately unanswered. The closest connection there is Dmytro's guess that $\mathrm{BPTime}(O(n^a)) \not\subset \mathrm{Time}(O(n^{a+1-\varepsilon}))$.

Nicholas Brandt

1 Answer


There are some recent works on this topic, for example [DMOZ20], [CT21a], and [CT21b].

For worst-case derandomization: following [DMOZ20], [CT21a] showed that under plausible hardness assumptions (something similar to "$\mathsf{E}$ requires $2^{\Omega(n)}$-size circuits" but quantitatively stronger, plus the existence of one-way functions), $\mathsf{BPTIME}[T]$ is contained in $\mathsf{TIME}[T \cdot n^{1+o(1)}]$, and this factor of $n$ is tight under an assumption called NSETH (the Nondeterministic Strong Exponential Time Hypothesis).

For average-case derandomization (say, over the uniform distribution) as you mentioned, [CT21a] also showed that under similar assumptions you can derandomize $\mathsf{BPTIME}[T]$ in $\mathsf{TIME}[T \cdot n^{o(1)}]$ on average. I believe that if you make the assumption significantly stronger, you can improve $T \cdot n^{o(1)}$ to $T \cdot \mathrm{polylog}(T)$, as you want. [CT21b] also studied average-case derandomization over any polynomial-time samplable distribution.

Lijie Chen