
We have a number of theorems about the density of prime numbers; the most famous is probably the prime number theorem.

My question is about the density of primes when we choose numbers from different distributions. The main example I have is numbers of the form $2^a3^b5^c+1$. Are there theorems similar to the prime number theorem when we restrict attention to numbers of this form?
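To make the example concrete, here is a small experiment (an illustrative sketch only; the helper names are mine) that lists the numbers of the form $2^a3^b5^c+1$ below a bound and checks how many are prime. The printed ratio can be compared with the $\approx 1/\ln N$ density the prime number theorem predicts for all numbers of comparable size:

```python
def is_prime(n):
    """Deterministic Miller-Rabin; these witness bases are known to be
    correct for all n below ~3.3 * 10^24."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for a in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = x * x % n
            if x == n - 1:
                break
        else:
            return False
    return True

def smooth_plus_one(bound):
    """All numbers 2^a * 3^b * 5^c + 1 with 2^a * 3^b * 5^c < bound."""
    out = set()
    x = 1
    while x < bound:          # powers of 2
        y = x
        while y < bound:      # times powers of 3
            z = y
            while z < bound:  # times powers of 5
                out.add(z + 1)
                z *= 5
            y *= 3
        x *= 2
    return sorted(out)

nums = smooth_plus_one(10**6)
primes = [n for n in nums if is_prime(n)]
print(len(primes), "/", len(nums), "are prime")
```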


More generally, let's call a set $A \subseteq \mathbb{N}$ *efficiently enumerable* if there is a deterministic polytime enumerator for $A$, i.e. a polytime algorithm whose range is exactly $A$. (For simplicity, let's also assume that the enumeration is injective, so every member of $A$ appears exactly once.) Is there a common name for this in the literature?
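For instance, the main example above gives such an enumerator (a sketch under one reading of the definition; treating the input $(a, b, c)$ as given in unary is my assumption, so that the output has polynomially many bits in the input size):

```python
def enumerator(a: int, b: int, c: int) -> int:
    """Maps the triple (a, b, c) to 2^a * 3^b * 5^c + 1. Injective by
    unique factorization, and the output has O(a + b + c) bits, so for
    unary inputs this runs in polynomial time."""
    return 2**a * 3**b * 5**c + 1
```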

Are there efficiently enumerable sets such that a theorem similar to the prime number theorem or something stronger holds for the distribution of primes in them?

Note that if we could find prime numbers in deterministic polynomial time, then the density for the corresponding set would be one, but that is open for now. So: can we get closer to density 1 by replacing $\mathbb{N}$ with another efficiently enumerable set?


This came out of a discussion a few days ago about better ways of finding primes in practice. If there were a deterministic polytime algorithm to find primes, then the question for the corresponding enumeration would be trivial; since that is open, one may want to see whether there are other efficiently enumerable sets that can be used to find primes in practice. This might also be related to the number of random bits used by a probabilistic polytime algorithm to generate primes.
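For reference, the naive randomized procedure I have in mind is the following sketch (my own illustration; the trial-division check is a stand-in for a real primality test, which would be Miller-Rabin or AKS in practice):

```python
import random

def is_prime(n):
    """Trial division: exponential in the bit length, but fine for a
    small demo; a real implementation would use Miller-Rabin here."""
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    i = 3
    while i * i <= n:
        if n % i == 0:
            return False
        i += 2
    return True

def random_prime(nbits):
    """Rejection sampling: draw uniform nbits-bit odd numbers until one
    is prime. By the prime number theorem a random nbits-bit number is
    prime with probability about 1/(nbits * ln 2), so this consumes an
    expected O(nbits^2) random bits -- the naive upper bound that the
    question asks about improving."""
    while True:
        # force the top bit (so the number has exactly nbits bits) and
        # the bottom bit (so it is odd)
        n = random.getrandbits(nbits) | (1 << (nbits - 1)) | 1
        if is_prime(n):
            return n

print(random_prime(20))
```

Replacing $\mathbb{N}$ with a denser-in-primes efficiently enumerable $A$ would reduce the expected number of draws, and hence the number of random bits, in this scheme.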

What upper bounds are known on the number of random bits used by a probabilistic polytime algorithm for generating primes?

Kaveh
  • I am not sure what you can hope for in the second question about primes in an “efficiently enumerable” set. Although the definition of “efficiently enumerable” is unclear to me (you say the algorithm A is polynomial-time, but polynomial in what?), the set 2ℕ+4 should be efficiently enumerable, and it does not contain any primes. – Tsuyoshi Ito Mar 10 '11 at 12:14
  • Since primality is in P, isn't the algorithm that enumerates all primes efficient by your definition? Or do you want an algorithm that generates numbers of length $\ell$ in time polylog($\ell$)? You should be clearer in this part. – Peter Shor Mar 10 '11 at 15:29
  • Could you be more specific about what you mean by generating primes? Presumably you mean sampling the prime numbers in a certain range according to a certain distribution, but it would be helpful (to me at least) if you could give both the range you consider (as a function of $n$) and the probability distribution you need (does it have to be uniformly at random, or can the probability of generating a particular prime vary as a function of its distance from other primes according to some metric?). – Joe Fitzsimons Mar 10 '11 at 15:45
  • @Tsuyoshi, @Peter: I assume what Kaveh meant is that there is a polynomial-time function $f$ (polynomial in the size of its input, as usual) such that its range is exactly $A$, i.e. $f(\Sigma^*) = A$. This notion is related to, but not the same as, P-printability (see e.g. http://ftp.cs.rutgers.edu/pub/allender/pprint.pdf). – Joshua Grochow Mar 10 '11 at 17:23
  • The question was originally for the $2^a3^b5^c+1$ case; I tried to generalize it, but I am OK with modifying it to make the question more reasonable. @Tsuyoshi, yes, but that is an uninteresting case; the interesting case is when there are "enough" primes in $A$. @Peter, it seems to me that deciding whether something is in a set in poly-time is not equivalent to enumerating/generating its members in polynomial time (with the restriction that each member appears exactly once in the enumeration; without this restriction they would be equivalent), are they? (more) – Kaveh Mar 10 '11 at 18:20
  • (continued) The range is exactly $A$ and each element of $A$ appears exactly once; I have not put a restriction on the size, but I am OK with assuming that the enumeration is in increasing order. @Joe, the inputs to the enumerator are uniformly chosen, i.e. the distribution is an element of $A$ chosen uniformly at random. The original thing we were interested in was some kind of theorem that would say that it is "better" to choose a number from $A$ than a number from $\mathbb{N}$ to find primes of the required length, where $A$ is the set of numbers of the form $2^a3^b5^c+1$. – Kaveh Mar 10 '11 at 18:25
  • @Joshua, thanks for the link, that is interesting. The definition of $\mathbf{P}$-printable seems similar, but it requires printing all members of $A$ up to size $n$ in poly-time, and there might be exponentially many of them. I think they would be equivalent if there are only polynomially many members of $A$ of size (at most) $n$. – Kaveh Mar 10 '11 at 18:30
  • @Kaveh: Even if $A$ were sparse ($poly(n)$ many members of size at most $n$) I think your definition implies P-printable (esp. in the variant where the outputs come in increasing order) but I don't see the other direction. In particular, your definition implies that the $n$-th member of $A$ has bit-size at most $(\log n)^k$ (or $n^k$ if you only want your enumerator to work on unary inputs $1^n$). Interesting question! – Joshua Grochow Mar 10 '11 at 18:50
  • I still do not get the second question. “The intresting case is when there are "enough" primes in A.” If you restrict to the case where there are many primes in A, what do you want to prove about the distribution of primes in A? It seems to me that you are assuming what you want as a conclusion. – Tsuyoshi Ito Mar 10 '11 at 22:20
  • @Tsuyoshi, let me rephrase it: are there $A$s which satisfy a theorem similar to the prime number theorem or something stronger? How large can the density of primes be when the numbers are chosen from such an $A$? If we can generate primes in polytime then the density would be 1, but that is open, so how close can we get to 1 by choosing numbers from an $A$? It is a relaxation of finding primes in polytime. – Kaveh Mar 10 '11 at 22:56
  • @Kaveh: Now I understand what the second question is. Maybe you want to update the wording of the question? – Tsuyoshi Ito Mar 10 '11 at 23:01
  • This may be an interesting theoretical question (I don't know), but how would it help in practice? As @Peter Shor notes, primality testing is in P, and the density of primes is known to decrease as $1/n$ for numbers of length $n$, so one can find and verify a prime of length $n$ in randomized polynomial time just by checking $N, N+1, N+2, ...$ for a randomly chosen $N$ of length $n$. What qualitative improvement would having a polynomially denser set of candidates produce? – mjqxxxx Mar 11 '11 at 19:20
  • @mjqxxxx, e.g. number of random bits needed, or checking smaller number of candidates. – Kaveh Mar 11 '11 at 19:21
  • @Kaveh: if you just require the primes in A to have the same density as the primes in N, then the quantitative version of Dirichlet's theorem says that all arithmetic progressions which aren't obviously composite have the correct density of primes. If you require that the set A have an asymptotically higher density of primes, i.e., $\omega(1/n)$ for numbers of length n, I suspect this is an interesting number theory question. – Peter Shor Mar 11 '11 at 20:13
  • Thanks @Peter. (I will repost it on MO after a few days.) – Kaveh Mar 14 '11 at 14:54

0 Answers