
This is a follow-up to "Nondeterministic speed-up of deterministic computation".

Is it plausible that nondeterminism (or, more generally, alternation) would allow a general quadratic speed-up of deterministic computation? Or are there known implausible consequences of something like $\mathsf{DTime}(n^2) \subseteq \mathsf{NTime}(n)$?

Kaveh
  • I'm not sure, but I think that by arguments similar to the ones you used in your previous question, $$\mathsf{TMSAT}=\{\langle a,x,1^n,1^t\rangle : \exists u\in \{0,1\}^n \text{ s.t. } M_a \text{ outputs } 1 \text{ on input } \langle x,u\rangle \text{ within } t \text{ steps}\}$$ is not in $\mathsf{DTIME}(n^2/\lg{n})$. Actually, $\mathsf{TMSAT}$ is not in $\mathsf{DTIME}(n)$, because $\mathsf{NTIME}(n)\neq\mathsf{DTIME}(n)$, but I don't know any better lower bounds. – Erfan Khaniki May 01 '16 at 18:24
  • @Erfan, my argument doesn't show that it is not, nor does it show that this is unlikely; it only shows that proving it is open and difficult for $\omega(n\lg n)^2$. – Kaveh May 01 '16 at 23:43
  • Yes, you are right. Actually this argument shows that it is hard to prove $\mathsf{DTIME}(n^2)\subseteq \mathsf{NTIME}(n)$. – Erfan Khaniki May 02 '16 at 06:52

2 Answers


Note that even a result along the lines of $\mathsf{DTime}(\tilde{O}(n^2)) \subseteq \mathsf{NTime}(n^{2-\epsilon})$ would violate NSETH, since univariate polynomial identity testing (as defined in Section 3.2) can be solved in $\tilde{O}(n^2)$ time deterministically, but there doesn't seem to be an obvious way to use nondeterminism to help prove that a polynomial is identically zero.
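
For intuition, here is a minimal Python sketch of the generic deterministic approach (evaluation at many points); the circuit encoding is a made-up toy format and this is not claimed to be the exact algorithm of the referenced Section 3.2. For a circuit of size $n$ and degree less than $n$ it uses $O(n^2)$ field operations: a nonzero polynomial of degree less than $n$ has fewer than $n$ roots, so the circuit computes the zero polynomial iff it evaluates to zero at $n$ distinct points.

```python
# Hedged sketch of deterministic univariate PIT by evaluation at many points.
# Assumptions: the circuit computes a polynomial over F_p of degree < n, and
# p >= n, so it is identically zero iff it vanishes at n distinct points.

def upit_is_zero(gates, p, n):
    """gates: topologically ordered list of
         ('const', c) | ('var',) | ('add', i, j) | ('mul', i, j),
    where i, j index earlier gates and the last gate is the output.
    Returns True iff the circuit computes the zero polynomial."""
    for point in range(n):              # n distinct points of F_p (needs p >= n)
        vals = []
        for g in gates:                 # evaluate the circuit bottom-up
            if g[0] == 'const':
                vals.append(g[1] % p)
            elif g[0] == 'var':
                vals.append(point)
            elif g[0] == 'add':
                vals.append((vals[g[1]] + vals[g[2]]) % p)
            else:                       # 'mul'
                vals.append((vals[g[1]] * vals[g[2]]) % p)
        if vals[-1] != 0:
            return False                # a nonroot witnesses a nonzero polynomial
    return True

# Example: x^2 - x over F_5 with degree bound n = 3 is nonzero (it is 2 at x = 2).
circuit = [('var',), ('mul', 0, 0), ('const', -1), ('mul', 2, 0), ('add', 1, 3)]
assert upit_is_zero(circuit, 5, 3) is False
```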

Joe Bebel

Consider computing the gcd using the primitive operations of addition, subtraction, division, and remainder. The best known deterministic algorithm is the Euclidean algorithm, which takes time logarithmic in the magnitude of the smaller input (where "time" here means the number of calls to the primitive operations), i.e., linear in the length of the input. This has been conjectured to be optimal.
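
As a concrete illustration, here is a minimal sketch that runs the Euclidean algorithm while counting calls to the remainder primitive; consecutive Fibonacci numbers realize the worst case.

```python
# Minimal sketch: the Euclidean algorithm, counting calls to the remainder
# primitive. The count is O(log min(a, b)), i.e. linear in the bit length
# of the smaller input, matching the bound discussed above.

def gcd_with_count(a, b):
    calls = 0
    while b != 0:
        a, b = b, a % b    # one call to the remainder primitive
        calls += 1
    return a, calls

# Consecutive Fibonacci numbers are the classic worst case: each step only
# peels off one Fibonacci index.
print(gcd_with_count(1134903170, 701408733))   # F_45, F_44  ->  (1, 43)
```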

In contrast, there is a nondeterministic algorithm (Pratt's algorithm) that takes time $\log \log n$, where $n$ is the magnitude of the smaller input. So, as far as anyone knows, there is an exponential gap between the best deterministic and nondeterministic complexities.

Of course, this is not time on a Turing machine, but it does show the possibility of more dramatic speed-ups under very natural notions of time.

Siddharth
  • This does not answer the question. There are, of course, many particular languages for which we conjecture that nondeterminism provides a (possibly even exponential) speed-up: for example, all NP-complete languages. However, that’s not the question. The question is whether it is possible that nondeterminism provides nontrivial speed-up for all languages. – Emil Jeřábek May 01 '23 at 15:44