
The RSA cryptosystem makes use of $n=pq$ where $p, q$ are large prime numbers.

With quantum computing, factorization might become easier, so it will probably be useful to use much larger $p$, $q$ in the future.

If I remember correctly, the source of the large prime numbers $p$, $q$ in today's most common RSA implementations is probabilistic primality tests.

Concretely, what could be a practical source of much larger prime numbers (several orders of magnitude bigger than the $p$, $q$ used today)?
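To make the current approach concrete, here is a minimal sketch (my own illustration, not taken from any particular library) of how such primes are typically generated: draw random odd candidates of the desired bit length and keep trying until one passes a Miller-Rabin probabilistic primality test. Real implementations add sieving and side-channel precautions; this only shows the principle.

```python
import random

def is_probable_prime(n, rounds=40):
    """Miller-Rabin probabilistic primality test."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % p == 0:
            return n == p
    # Write n - 1 as d * 2^s with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # a witnesses that n is composite
    return True

def random_prime(bits):
    """Draw random odd candidates of the given size until one passes the test."""
    while True:
        # Force the top bit (exact size) and the low bit (oddness).
        n = random.getrandbits(bits) | (1 << (bits - 1)) | 1
        if is_probable_prime(n):
            return n
```

By the prime number theorem, roughly one in $\ln(2^k)/2$ odd candidates of $k$ bits is prime, so the expected number of trials grows only linearly in the bit size; it is the per-test modular exponentiations that dominate the cost for very large primes.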

Basj
  • RSA died years ago; long live ECC. At several orders of magnitude larger, no one is going to use such a power-hungry and slow operation. – kelalaka Oct 05 '20 at 10:26
  • The numbers we use right now are chosen at random and then incremented till prime; I see no reason why we cannot still do that, it will just be slower. – Richie Frame Oct 05 '20 at 10:51
  • However, the actual use of such large primes would be... undesirable from a performance standpoint. – Richie Frame Oct 05 '20 at 10:52
  • @DannyNiu pqRSA seems to use many somewhat smaller primes. P and Q here are for post-quantum, not prime $p$ and $q$. But besides that, after generating the primes, the rest of the key pair generation and of course modular exponentiation seem to be a doddle when compared to generating the primes. It depends on what kind of quantum computer we're protecting against, of course, and whether this is feasible at all. – Maarten Bodewes Oct 05 '20 at 13:03
  • @kelalaka: if the threat we're worried about is quantum computers, ECC (except for isogeny-based schemes) isn't the answer - it's just as vulnerable... – poncho Oct 05 '20 at 13:42
  • @RichieFrame: now, I have heard Shamir suggest unbalanced RSA to be somewhat postquantum; you have two primes $p, q$, with $p$ being much larger than $q$ (e.g. 16000 bit vs 2048 bit). For encryption, you'd pad the message to be a value $< q$ (e.g. 1024 bit), and use $e = 65537$. For decryption (here's the trick), you'd work modulo $q$ (and ignore the $p$ side of things). If you do this, then "how to efficiently find moderately large primes $p$" is a relevant question. – poncho Oct 05 '20 at 13:54
  • @kelalaka: you didn't, but the original question did. As for proof, well, do you have a proof that solving ECDLog is difficult on a conventional computer? – poncho Oct 05 '20 at 14:42
  • I meant that you put it well - sorry for the miswording; I deleted it. Anyway: bounds are known for generic groups, and since these are not generic groups there is no proof as far as I know. Binary-extension-field ECC has been crippled recently, as you know. I was asking for a proof, or actual bounds, of isogeny-based crypto's post-quantum security. – kelalaka Oct 05 '20 at 14:52
  • @poncho any pointer to this suggestion by Shamir? How old is it? The numbers seem super small. Is there any reason why this unbalanced RSA would resist Shor's algorithm? It seems really unlikely; otherwise we would see tons of research on post-quantum crypto from this variant, and I have never heard anything about that. – Geoffroy Couteau Oct 05 '20 at 16:39
  • @GeoffroyCouteau: here is the original article: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.154.5763&rep=rep1&type=pdf . Now, the numbers I listed were what I could recall from a talk he gave a few years back (and are likely off); the numbers from the original article are much smaller (but that article is from 25 years ago). – poncho Oct 05 '20 at 16:55
  • I see, but this article never mentions anything related to quantum computation (actually, the word "quantum" never appears). It is only about increasing RSA security to paranoid security levels without blowing up the cost completely. But these "paranoid" parameters are only discussed w.r.t. classical computers; they would still likely be in the range of what is completely destroyed by a quantum computer (and in fact, pqRSA uses much, much larger moduli). – Geoffroy Couteau Oct 05 '20 at 18:02
  • @GeoffroyCouteau: the article doesn't mention postquantum; however, Shamir did propose it to try to frustrate quantum attacks in a recent talk. The idea was that, if the size of quantum computers followed something similar to Moore's law, it should be enough for the foreseeable future. Some people (e.g. me) disagreed that depending on the slowness of scaling was a good security assumption; however, the idea is there... – poncho Oct 05 '20 at 19:27
  • ok, thanks for the info! – Geoffroy Couteau Oct 05 '20 at 20:10
  • @poncho the question mentioned several orders of magnitude, so I am imagining a prime size of around 409600 bits or larger; I do not think there will ever be an efficient implementation for that. – Richie Frame Oct 05 '20 at 21:15
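The unbalanced-RSA trick described by poncho in the comments above can be sketched as a toy example. The parameter sizes below are deliberately tiny and insecure, chosen only so the arithmetic is visible; the real proposal uses, e.g., a ~16000-bit $p$ and a ~2048-bit $q$. The key point is that encryption is ordinary RSA modulo $n = pq$, but decryption works modulo the small prime $q$ only, which is valid because the padded message is smaller than $q$.

```python
# Toy unbalanced RSA, insecure parameter sizes for illustration only.
p = 104729            # the "large" prime (~16000 bits in the actual proposal)
q = 101               # the "small" prime (~2048 bits in the actual proposal)
n = p * q
e = 65537
m = 42                # message, padded so that m < q

# Ordinary RSA encryption modulo n = p*q.
c = pow(m, e, n)

# Decryption ignores p entirely: since m < q, recovering m mod q recovers m.
d_q = pow(e, -1, q - 1)     # e^{-1} mod (q-1), Python 3.8+ modular inverse
m_recovered = pow(c, d_q, q)
assert m_recovered == m
```

Decryption cost then depends only on the size of $q$, while an attacker faces the full modulus $n$; which is exactly why "how to efficiently find moderately large primes $p$" becomes the relevant question.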

0 Answers