
EDIT at 2011/02/08: After some searching and reading of references, I decided to split the original question into two separate ones. Here is the part concerning UP vs NP; for the part on syntactic and semantic classes, please see Benefits for syntactic and semantic classes.


$\mathsf{UP}$ (unambiguous polynomial time; see the wiki and the Complexity Zoo for references) is defined as the class of languages decided by $\mathsf{NP}$-machines with the additional constraint that

  • There is at most one accepting computation path on any input.
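
The unambiguity promise can be illustrated with a brute-force simulation, treating each witness string as one computation path of the nondeterministic machine. This is a toy sketch (the verifier `complement_verifier` and all names are invented for illustration, not from any source); it only shows what "at most one accepting path" means operationally:

```python
from itertools import product

def accepting_witnesses(x, verifier, wlen):
    """Enumerate every computation path (witness string of length wlen)
    and collect the ones the verifier accepts."""
    return [
        "".join(bits)
        for bits in product("01", repeat=wlen)
        if verifier(x, "".join(bits))
    ]

# Toy verifier: a witness accepts iff it is the bitwise complement of x.
# Every input then has exactly one accepting path, so the simulated
# machine satisfies the UP (unambiguity) promise.
def complement_verifier(x, w):
    return len(w) == len(x) and all(a != b for a, b in zip(x, w))

paths = accepting_witnesses("0110", complement_verifier, 4)
assert len(paths) <= 1  # the UP constraint: at most one accepting path
```

An NP machine in general may have many accepting paths per input; UP forbids more than one.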

The precise relations between $\mathsf{P}$ vs $\mathsf{UP}$ and $\mathsf{UP}$ vs $\mathsf{NP}$ are still open. We know that worst-case one-way functions exist if and only if $\mathsf{P} \neq \mathsf{UP}$, and there are oracles relative to which every possible pattern of strictness among the inclusions $\mathsf{P} \subseteq \mathsf{UP} \subseteq \mathsf{NP}$ holds.

I'm interested in why $\mathsf{UP}$ vs $\mathsf{NP}$ is an important question. People tend to believe (at least in the literature) that these two classes are different, and my question is:

If $\mathsf{UP} = \mathsf{NP}$, do any "bad" consequences follow?

There is a related post on the complexity blog from 2003. If my understanding is correct, the result by Hemaspaandra, Naik, Ogihara, and Selman shows that if

  • There is an $\mathsf{NP}$ language $L$ such that for each satisfiable formula $\phi$ there is a unique satisfying assignment $x$ with $(\phi,x)$ in $L$,

then the polynomial hierarchy collapses to the second level. No such implication is known if $\mathsf{UP} = \mathsf{NP}$ holds.

  • (1) It is easy to see (almost by definition) that UP and BPP have complete problems if “problems” can refer to promise problems. They are not known to have complete languages. (2) I do not know the precise definition of syntactic classes. Is PH syntactic? It does not have a complete problem (even with promise) unless the polynomial hierarchy collapses. (3) I do not know your usage of the notation “PromiseUP.” If NP means the class of languages recognized by an NP machine and PromiseUP means the class of promise problems recognized by a UP machine, then clearly they cannot be equal. – Tsuyoshi Ito Dec 20 '10 at 12:17
  • @Tsuyoshi: Thank you for the questions. (1) By problems I do mean languages; it's my fault that I did not write this clearly. (2) We define syntactic classes as those having leaf language characterizations on poly-time machines. PH is special, since no poly-time leaf language characterization (which would guarantee natural complete languages) is known; but PH does have a logspace leaf language characterization. (more) – Hsien-Chih Chang 張顯之 Dec 20 '10 at 12:50
  • (cont.) (3) Maybe the use of PromiseUP is not correct. Here by PromiseUP I mean a class of languages such that for yes instances the machine has a unique accepting path, and for no instances the machine has zero or at least two accepting paths. – Hsien-Chih Chang 張顯之 Dec 20 '10 at 12:51
  • Thanks for the reply. As for (3), from a cursory look at the paper by Hemaspaandra, Naik, Ogihara and Selman, I cannot find a way to state the result in terms of decision problems. BTW, the link to the paper is broken. Here is a link to the journal version. – Tsuyoshi Ito Dec 20 '10 at 12:59
  • @Tsuyoshi: Is the following reasoning correct: If NP = PromiseUP, then all problems in NP reduce to USAT, which is the problem containing all formulas with a unique assignment. Then, given a formula as input, we can output a unique satisfying assignment in NP by guessing, which satisfies the requirement of the paper [HNOS96]. – Hsien-Chih Chang 張顯之 Dec 20 '10 at 13:31
  • First, the “class of languages, such that for yes instances the machine has a unique accepting path, and for no instances the machine have zero or at least two accepting paths” is called US. See my answer to another question. (more) – Tsuyoshi Ito Dec 20 '10 at 13:55
  • (cont’d) Second, I cannot see how the condition NP=US implies Hypothesis 1.1 in [HNOS96]. What I cannot understand is that your function (required for Hypothesis 1.1) seems to be changing the formula, but it is not allowed. But I must be missing something simple because the introduction of [HNOS96] states that NP=coNP “clearly” implies Hypothesis 1.1 and I cannot see the reason for this either. – Tsuyoshi Ito Dec 20 '10 at 14:04
  • Just to make sure, PromiseUP is completely different from what you described. As I wrote, PromiseUP is the promise-problem version of UP; that is, it is the class of promise problems having a nondeterministic polynomial-time Turing machine M such that for yes-instances M has exactly one accepting path and for no-instances M does not have accepting paths. Although I believe that PromiseUP is the traditional name for this class, some people (including me) write this class simply as UP. – Tsuyoshi Ito Dec 20 '10 at 14:16
  • @Tsuyoshi: Since US contains coNP, if NP = coNP then the above reasoning may work (we only use the condition that NP $\subseteq$ US). Also, the function here is the NP-machine which guesses the assignments, and the value of the function on x is the set of outputs from accepting paths of the machine. (See defn 2.2 in [HNOS96].) So as the machine does not depend on the input x, the function does not depend on the input formula. – Hsien-Chih Chang 張顯之 Dec 20 '10 at 14:26
  • You are right about the definitions of PromiseUP and US. I'll correct them in the next revision, maybe together with something else, since constantly revising the same question would disturb other users on TCS SE. – Hsien-Chih Chang 張顯之 Dec 20 '10 at 14:32
  • Well, your explanation of how NP=US implies Hypothesis 1.1 does not resolve my concern (I know the definition of NPSV). I cannot see how to output a satisfying assignment for the given formula instead of outputting some different formula with a unique satisfying assignment. But I will think a little more about it. (Also I guess that this is kind of off topic here.) – Tsuyoshi Ito Dec 20 '10 at 15:11
  • @Tsuyoshi Ito: One can prove $NP=coNP$ implies Hyp. 1.1 directly, without recourse to $US$: Let $M$ be a machine for the $coNP$ language consisting of pairs $(\varphi,w)$ such that $w$ is the lex. least satisfying assignment for $\varphi$. The following $NPSV$ machine outputs satisfying assignments: on input $\varphi$, nondet. guess a witness $w$; apply $M(\varphi,w)$ and output $w$ only on accepting paths of $M$. I believe that if Hsien-Chih's argument worked, the same argument, combined with the HNOS result, would show that $NP=UP$ implies $PH$ collapses, which is not known. – Joshua Grochow Jan 09 '11 at 22:45
  • @Joshua: You are right, and I found the mistake in my reasoning, which is precisely what Tsuyoshi said. I am still reading some related material; I hope that after a more complete understanding of the problem I can revise the question properly. Thank you for the proof! – Hsien-Chih Chang 張顯之 Jan 10 '11 at 04:58
  • @Tayfun: sorry for asking, what is a C.C.? complexity class? – Hsien-Chih Chang 張顯之 Feb 10 '11 at 12:01
  • @Tayfun: Since P $\subseteq$ UP and P $\subseteq$ coUP, does any complexity class separated from P work? I guess you're interested in a somewhat larger class? – Hsien-Chih Chang 張顯之 Feb 11 '11 at 13:42

1 Answer

It is known that $\mathsf{UP = NP}$ implies $\mathsf{SpanP = \#P}$: Köbler, Schöning, and Torán proved that $\mathsf{UP = NP}$ if and only if $\mathsf{SpanP = \#P}$. It is easy to see that $\mathsf{\#P}$ is contained in $\mathsf{SpanP}$.

A function $f : \Sigma^* \to \mathbb{N}$ is in $\mathsf{SpanP}$ if there is a nondeterministic polynomial-time transducer $M$ such that, for all $x$, $f(x)$ is the number of distinct outputs produced by $M$ on the accepting paths of its computation on input $x$.
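
The difference between counting accepting paths ($\mathsf{\#P}$) and counting distinct outputs ($\mathsf{SpanP}$) can be made concrete with a brute-force simulation of a transducer. This is only an illustrative sketch with an invented toy machine (`toy_machine` and the helper names are assumptions, not from the paper):

```python
from itertools import product

def npt_outputs(x, machine, wlen):
    """Simulate a nondeterministic transducer by enumerating every branch
    (witness string of length wlen). machine(x, w) returns an output string
    on accepting branches and None on rejecting ones."""
    outs = [machine(x, "".join(bits)) for bits in product("01", repeat=wlen)]
    return [o for o in outs if o is not None]  # outputs of accepting paths

# Toy transducer: every branch accepts and outputs the OR of the guessed bits.
def toy_machine(x, w):
    return "1" if "1" in w else "0"

branches = npt_outputs("xx", toy_machine, 2)
num_accepting_paths = len(branches)        # the #P-style count
num_distinct_outputs = len(set(branches))  # the SpanP-style count
```

Here the four branches produce outputs `"0", "1", "1", "1"`, so the path count is 4 while the span (distinct-output) count is 2; collapsing these two ways of counting for every machine is exactly what $\mathsf{SpanP = \#P}$ asserts.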

J. Köbler, U. Schöning, and J. Torán. On counting and approximation. Acta Informatica, 26:363–379, 1989.

Mohammad Al-Turkistany (edited by Niel de Beaudrap)