
The Complexity Zoo points out in the entry on EXP that if L = P then PSPACE = EXP. Since NPSPACE = PSPACE by Savitch, as far as I can tell the underlying padding argument extends to show that $$(\text{NL} = \text{P}) \Rightarrow (\text{PSPACE} = \text{EXP}).$$ We also know that L $\subseteq$ NL $\subseteq$ NC $\subseteq$ P via Ruzzo's resource-bounded alternating hierarchy.
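For concreteness, here is the padding argument as I understand it (just a sketch; corrections welcome). Given $A \in \text{DTIME}(2^{n^k})$, let $A' = \{ x0^{2^{|x|^k}} : x \in A \}$. Then $A' \in \text{P} = \text{NL}$, i.e., $A'$ is accepted in nondeterministic space $O(\log N)$, where $N \approx 2^{n^k}$ is the padded length. Simulating that machine on the unpadded input (generating the padding positions on the fly) gives
$$A \in \text{NSPACE}(n^{k}) \subseteq \text{DSPACE}(n^{2k}) \subseteq \text{PSPACE},$$
the middle step being Savitch's theorem, so $\text{EXP} \subseteq \text{PSPACE}$.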

If NC = P, does it follow that PSPACE = EXP?

A different interpretation of the question, in the spirit of Richard Lipton: is it more likely that some problems in P cannot be parallelized than that no exponential-time procedure requires more than polynomial space?

I would also be interested in other "surprising" consequences of NC = P (the more unlikely the better).

Edit: Ryan's answer leads to a further question: what is the weakest hypothesis that is known to guarantee PSPACE = EXP?

  • W. Savitch. Relationships between nondeterministic and deterministic tape complexities, Journal of Computer and System Sciences 4(2):177-192, 1970.
  • W. L. Ruzzo. On uniform circuit complexity, Journal of Computer and System Sciences 22(3):365-383, 1981.

Edit (2014): updated old Zoo link and added links for all other classes.

András Salamon
  • As I'm sure I'm not the only one to not know what NC is, here's a link: http://en.wikipedia.org/wiki/NC_%28complexity%29 – Emil Sep 24 '10 at 22:24
  • @Andras: One other consequence which maybe you already know, but hasn't been mentioned yet, is that then the $\mathsf{NC}$ hierarchy would collapse, since $\mathsf{P}$ has complete problems under $\mathsf{L}$-reductions. – Joshua Grochow Dec 20 '17 at 20:43

3 Answers


Yes. $NC$ can be seen as the class of languages recognized by alternating Turing machines that use $O(\log n)$ space and $(\log n)^{O(1)}$ time. (This was first proved by Ruzzo.) $P$ is the class where alternating Turing machines use $O(\log n)$ space but can take up to $n^{O(1)}$ time. For brevity let's call these classes $ATISP[(\log n)^{O(1)},\log n] = NC$ and $ASPACE[O(\log n)] = P$.

Suppose the two classes are equal. Replacing the $n$ with $2^n$ in the above (i.e., applying standard translation lemmas), one obtains

$TIME[2^{O(n)}] = ASPACE[O(n)] = ATISP[n^{O(1)}, n] \subseteq ATIME[n^{O(1)}] = PSPACE$.

If $TIME[2^{O(n)}] \subseteq PSPACE$ then $EXP = PSPACE$ as well, since there are $EXP$-complete languages in $TIME[2^{O(n)}]$.

Edit: Although the above answer is perhaps more educational, here's a simpler argument: $EXP = PSPACE$ already follows from "$P$ is contained in polylog space" and standard translation. Note "$P$ is contained in polylog space" is a much weaker hypothesis than $NC = P$.

More details: Since $NC$ circuit families have depth $(\log n)^c$ for some constant $c$, every such circuit family can be evaluated in $O((\log n)^c)$ space. Hence $NC \subseteq \bigcup_{c > 0} SPACE[(\log n)^c]$. So $P = NC$ implies $P \subseteq \bigcup_{c > 0} SPACE[(\log n)^c]$. Applying translation (replacing $n$ with $2^n$) implies $TIME[2^{O(n)}] \subseteq PSPACE$. The existence of an $EXP$-complete language in $TIME[2^{O(n)}]$ finishes the argument.
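To spell the translation out (my paraphrase, so treat it as a sketch): for $A \in TIME[2^{O(n)}]$, the padded language $A' = \{x0^{2^{c|x|}} : x \in A\}$ (for a suitable constant $c$) lies in $P$, hence by hypothesis in $SPACE[(\log n)^{d}]$ for some $d$. An input of length $n$ for $A$ corresponds to a padded input of length $N = 2^{O(n)}$, and the space-bounded machine can be simulated while generating the padding on the fly, so

$$A \in SPACE[(\log N)^{d}] = SPACE[n^{O(1)}] \subseteq PSPACE.$$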

Update: Addressing András's additional question, I believe it should be possible to prove something like: $EXP=PSPACE$ iff for all $c$, every polynomially sparse language in $n^{O(\log^c n)}$ time is solvable in polylog space. (Being polynomially sparse means that there are at most $poly(n)$ strings of length $n$ in the language, for all $n$.) If true, the proof would probably go along the lines of Hartmanis, Immerman, and Sewelson's proof that $NE = E$ iff every polynomially sparse language in $NP$ is contained in $P$. (Note, $n^{O(\log^c n)}$ time in polylog space is still enough to imply $PSPACE=EXP$.)

Ryan Williams
  • Thanks for the nice answer. Dexter Kozen's Theory of Computation has a nice "uniform" notation for Ruzzo's classes on page 69: $STA(f,g,h)$ where $f$ bounds space, $g$ bounds time, and $h$ bounds alternations. Then $\text{NC} = STA(\log n, *, (\log n)^{O(1)})$ while $\text{P} = STA(\log n, *, *)$, which really highlights the construction. – András Salamon Sep 24 '10 at 18:24
  • Note that I am saying $NC = STA(\log n, (\log n)^{O(1)}, *)$ in the above. However I think these are the same. A machine which takes polynomial time and $O(\log n)$ space but makes only $(\log n)^{O(1)}$ alternations can be turned into another alternating machine which takes only $(\log n)^{O(1)}$ time and $O(\log n)$ space. (The other direction is obvious.) The idea is to insert more alternations so that each polynomial time existential phase and universal phase is "sped up" to run in only $(\log n)^{O(1)}$ time and $O(\log n)$ space, along the lines of Savitch's theorem. – Ryan Williams Sep 24 '10 at 19:37
  • what we need is some kind of greasemonkey script that automatically links something like "\NP" to the entry in the zoo. – Suresh Venkat Sep 25 '10 at 18:01
  • @RyanWilliams Can P be described by fan-in 2 circuits of poly(n) depth? – Turbo Aug 18 '23 at 20:58

(I've seen Ryan's answer, but I just wanted to provide another perspective, which was too long to fit into a comment.)

In the $L = P \Rightarrow PSPACE = EXP$ proof, all that you need to know about L, informally, is that when blown up by an exponential, L becomes PSPACE. The same proof goes through for NL, because NL blown up by an exponential also becomes PSPACE.

Similarly, when NC is blown up by an exponential, you do get PSPACE. I like to see this in terms of circuits: NC is the class of polynomial size circuits with polylog depth. When blown up, this becomes exponential size circuits with polynomial depth. One can show that this is exactly PSPACE, once the appropriate uniformity conditions are added in. I guess if NC is defined with L-uniformity, then this will get PSPACE-uniformity.

The proof should be easy. In one direction, take a PSPACE-complete problem like TQBF and express the quantifiers using AND and OR gates of exponential size. In the other direction, try traversing the polynomial depth circuit recursively. The stack size will be polynomial, so this can be done in PSPACE.
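For the PSPACE upper bound, here is a minimal sketch of the recursive traversal in code (the gate encoding is hypothetical and the circuit is written out explicitly; in the actual argument the exponential-size circuit is never materialized, since its gates can be produced on demand from the uniformity condition, but the point about stack depth is the same):

```python
# Sketch: evaluate a bounded fan-in Boolean circuit by depth-first recursion.
# Hypothetical gate encoding: ('INPUT', i), ('NOT', g), ('AND', g1, g2), ('OR', g1, g2).
# The recursion depth equals the circuit depth, so a polynomial-depth circuit
# needs only a polynomial-size stack -- the PSPACE bound in the argument above.

def evaluate(gate, x):
    kind = gate[0]
    if kind == 'INPUT':
        return x[gate[1]]
    if kind == 'NOT':
        return not evaluate(gate[1], x)
    if kind == 'AND':
        return evaluate(gate[1], x) and evaluate(gate[2], x)
    if kind == 'OR':
        return evaluate(gate[1], x) or evaluate(gate[2], x)
    raise ValueError("unknown gate type")

# Example: (x0 AND x1) OR (NOT x2)
circuit = ('OR', ('AND', ('INPUT', 0), ('INPUT', 1)), ('NOT', ('INPUT', 2)))
print(evaluate(circuit, [True, False, True]))  # prints False
```

Each stack frame only records which gate we are at and which child we are exploring, so the total space is the depth times a polynomial, i.e., polynomial space for a polynomial-depth circuit.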

Finally, I came up with this argument when I saw the question (and before reading Ryan's answer), so there might be bugs. Please point them out.

Robin Kothari

Here's a little more detail from the perspective of simulating time- and space-bounded alternating Turing machines.

Suppose that $P = NC$.

Since $NC = ATISP((\log(n))^{O(1)}, O(\log(n)))$, we get $$P = ATISP((\log(n))^{O(1)}, O(\log(n))).$$

Now, consider the linear time universal simulation problem $LinU$ where we are given an encoding of a Turing machine $M$ and an input string $x$ of length $n$, and we want to know if $M$ accepts $x$ in at most $n$ steps.

We know that $LinU \in P$. Therefore, there exists a constant $c$ (sufficiently large) such that $$(*) \; LinU \in ATISP(\log^c(n), c\log(n)).$$

As a result of a padding argument (a little tricky; see the comments), we have $$(1) \; DTIME(n) \subseteq ATISP(\log^c(n), c\log(n)).$$

Extending the padding argument, we get $$(2) \; DTIME(n^k) \subseteq ATISP(k^c\log^c(n), kc\log(n)).$$ $$(3) \; DTIME(2^{n^k}) \subseteq ATISP(k^cn^{kc}, kcn^{k}).$$
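To make the padding behind $(3)$ explicit (my reading; the delicate simulation step here is exactly what the comments below discuss): given $A \in DTIME(2^{n^k})$, consider the padded language $A' = \{x0^{2^{|x|^k}-|x|} : x \in A\}$ and write $N = 2^{n^k}$ for the padded length. Then $A' \in DTIME(N)$, so by $(1)$ we have $A' \in ATISP(\log^c(N), c\log(N))$, and simulating this machine on the unpadded input (generating the padding on the fly) gives, up to constant factors, $$A \in ATISP(n^{kc}, c\,n^{k}),$$ which is $(3)$.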

Further, there are known results about the simulation of Alternating time-space bounded Turing machines. In particular, we know that $$ATISP(\log^c(n), c\log(n)) \subseteq DSPACE(O(\log^{c+1}(n))).$$

Therefore, we (essentially) have the following for all natural numbers $k$:

$$(2^{*}) \; DTIME(n^k) \subseteq DSPACE(k^{c+1}\log^{c+1}(n))$$ $$(3^{*}) \; DTIME(2^{n^k}) \subseteq DSPACE(n^{k(c+1)}).$$

From $(3^{*})$, we would get that $EXP = PSPACE$.

====================After Thought===================

It is important to notice that $P = NC$ implies $$ATISP((\log(n))^{O(1)}, O(\log(n))) = ATISP(\log^c(n), O(\log(n)))$$ for some constant $c$.

Any comments or corrections are welcomed. :)

Michael Wehar
  • The mistake is in the step that you move from LinU to LinTime: you ignore the overhead caused by reductions to LinU from LinTime machines. Just because a universal machine for the class C is in class C' doesn't mean C is a subset of C'. – Kaveh Nov 19 '14 at 05:05
  • Hi Kaveh, thank you for the response. First let me describe the issue, then I'll try to clear it up. Now, say we are given a $2n$ time bounded TM $M$. Let's reduce $L(M)$ to $LinU$. Let $x$ be given. We can't just check if $(M,x)$ is in $LinU$. We have to modify $M$ slightly to obtain $M'$ and pad $x$ with $n$ characters to obtain $x'$ which has $2n$ characters in total. Then, we check if $(M',x')$ is in $LinU$. For the reduction, we map $x$ to $(M',x')$. We should be able to do this in linear time, but wait we need it to be done in $\log^c(n)$ time. – Michael Wehar Nov 19 '14 at 06:42
  • Now, I'll propose a way to clear it up, but honestly, your input is appreciated because there is still more for me to learn on models of alternating machines with random access tapes. – Michael Wehar Nov 19 '14 at 06:47
  • Some alternating machine $A$ can solve $LinU$ in $\log^c(n)$ time. Rather than computing a many one reduction like I described, let's just build a machine $A'$ that simulates $A$ on input $(M',x')$. Since $M'$ is fixed, it can be hard coded into $A'$ with no problems. The input for $A'$ will be $x$, but we need quick access to $x'$ which is $x$ with a length $n$ padding of say 0's. – Michael Wehar Nov 19 '14 at 06:55
  • So let's use random access to compute $n$ in binary using a form of binary search. This should take $\log^2(n)$ time. Now, $A'$ runs similar to $A$, but anytime that $A$ would use random access, $A'$ first compares the look-up address to $n$ (which was written in binary) to decide whether the character at that address is in $x$, the padding, or a blank symbol. This should blow-up the runtime by a $c\log(n)$ factor. Generalizing this construction appears to make (2) still work out (at least with just an extra log factor and larger constants). I'll review it again soon to make sure. – Michael Wehar Nov 19 '14 at 06:56
  • Let me simplify the argument: CV is complete for P w.r.t. AC^0 reductions. If P=NC then NC collapses to some NC^i since they are closed under AC^0 reductions. – Kaveh Nov 19 '14 at 17:28
  • Hi Kaveh, did I clear up the 'mistake'? It appears to work fine to me, but you expressed some doubt and your opinion is valued. – Michael Wehar Nov 22 '14 at 22:10
  • I didn't finish reading your comments, I started but then I felt it is unnecessarily complicated and stopped reading. It can be fixed probably, but it seems like a long detour. As I wrote, the "after thought", which most of your post is devoted to proving if I understand correctly, is kind of obvious if you notice that the right-hand side is just NC^i: it is like PH=PSpace implying PH collapses. (I kind of fail to see how your answer helps in understanding what Ryan and Robin wrote.) – Kaveh Nov 23 '14 at 01:01
  • I intended to provide some more detail on what happens if NC=P. I've been thinking a bit about time-space trade-offs for alternating Turing machines so that influenced my perspective and was why I wrote the after thought. The comments were intended to give supplemental detail on the padding and help remedy any mistake. Thank you again, I appreciate the reply. – Michael Wehar Nov 23 '14 at 06:24
  • @MichaelWehar Do we know $NC^k\subsetneq PSPACE$ at any fixed $k$? In particular do we know $NC^2\subsetneq PSPACE$ and therefore $NC\neq PSPACE$? – Turbo Apr 29 '18 at 14:15
  • @Turbo Thank you for the comment!! Correct me if I'm wrong, but it seems that we have $$NC = ATISP((\log(n))^{O(1)}, O(\log(n)))$$ and by applying a variation of the time hierarchy theorem along with the relationship between alternating time and deterministic space we have, $$ATIME(n) \subsetneq PSPACE.$$ By combining these two statements, we get $$NC = ATISP((\log(n))^{O(1)}, O(\log(n))) \subseteq ATIME(n) \subsetneq PSPACE.$$ – Michael Wehar Apr 29 '18 at 22:08
  • @MichaelWehar I do not know but I have never seen anywhere that $NC\neq PSPACE$. In fact a comment in https://cstheory.stackexchange.com/questions/39046/complexity-class-equalities-on-the-edge-of-inconsistency says $P$-uniform $NC^1=PSPACE$ is possible. I have posted a clarification query in https://cstheory.stackexchange.com/questions/40689/from-p-ch-to-p-pspace. Do you think you can take a look? – Turbo Apr 29 '18 at 23:49
  • @Turbo Thank you very much for the kind reply!! It may depend on the kind of uniform. For example, $NC = ATISP((\log(n))^{O(1)}, O(\log(n)))$ might only hold for Logspace-uniform NC. Let me think about it and get back to you. :) – Michael Wehar Apr 30 '18 at 16:46
  • @Turbo I took a closer look at this paper: https://www.sciencedirect.com/science/article/pii/0022000081900386 – Michael Wehar May 01 '18 at 15:44
  • @Turbo Their first notion of uniform requires that the space used for constructing the circuits be bounded. For their notion of uniform $NC^k$, the space would be bounded by $log^k(n)$. Using this notion of uniform, they prove that uniform $NC = ATISP((\log(n))^{O(1)}, O(\log(n)))$. – Michael Wehar May 01 '18 at 15:48
  • @Turbo As a result of my argument above, we have that their uniform $NC \subsetneq PSPACE$. However, $P$-uniform $NC$ is different from their uniform $NC$. As a result, it could still be the case that $P$-uniform $NC = PSPACE$. – Michael Wehar May 01 '18 at 15:49
  • @MichaelWehar Sorry what is the difference between uniform $NC$ and $P$-uniform $NC$ and why is $PSPACE\neq NC$ absent in https://en.wikipedia.org/wiki/NC_(complexity)? So we know $NC\subsetneq PSPACE\subseteq EXP$? – Turbo May 02 '18 at 01:42
  • @Turbo Thank you for the follow-up!! I really think you should read the definition at the bottom of page 370 from: https://www.sciencedirect.com/science/article/pii/0022000081900386 – Michael Wehar May 02 '18 at 02:07
  • @Turbo This definition describes the notion of $U_B$-uniform circuits. We have $U_B$-uniform $NC \subseteq ATISP((\log(n))^{O(1)}, O(\log(n)))$, but we don't know if $P$-uniform $NC \subseteq ATISP((\log(n))^{O(1)}, O(\log(n)))$. As a result, we have $U_B$-uniform $NC$ is a strict subset of $PSPACE$, but it is still not known if $P$-uniform $NC$ is a strict subset of $PSPACE$. Does that make sense? Please let me know if you have any further questions. Hope that you have a nice evening!! :) – Michael Wehar May 02 '18 at 03:18
  • @MichaelWehar 1. Why does your proof show that $U_B$-uniform $NC$ is a proper subset of $PSPACE$, and what would your proof need in order to show that $U_{BC}$-uniform and $P$-uniform $NC$ are proper subsets of $PSPACE$? – Turbo May 02 '18 at 06:26
  • Where is $U_{BC}$-uniform? Is it also not known to be a proper subset of $PSPACE$, like $P$-uniform? Is $U_B$-uniform contained in $U_{BC}$-uniform, which is contained in $P$-uniform? – Turbo May 02 '18 at 06:29
  • When the wiki or we ordinarily talk about $NC$, do we mean $P$-, $U_B$-, or $U_{BC}$-uniform? – Turbo May 02 '18 at 06:29
  • What is the largest known $X$ such that $X$-uniform $NC$ is a proper subset of $PSPACE$? – Turbo May 02 '18 at 06:30