
Riemann Hypothesis - what happens if it's true?

  • 15-10-2004 03:49PM
    #1
    Registered Users, Registered Users 2 Posts: 7,581 ✭✭✭


    http://www.guardian.co.uk/uk_news/story/0,,1298728,00.html

    I've a basic enough understanding of cryptography, would this really have that massive an impact?

    What are the options/alternatives to the current tech?


Comments

  • Registered Users, Registered Users 2 Posts: 1,636 ✭✭✭henbane


    Discussion here. Consensus seems to be that it shouldn't give cryptanalysts the ability to factor large numbers fast unless the proof has come up with something along those lines along the way. It's an issue but not as big an issue as some of the articles about it would have you believe.


  • Moderators, Recreation & Hobbies Moderators, Science, Health & Environment Moderators, Technology & Internet Moderators Posts: 94,366 Mod ✭✭✭✭Capt'n Midnight


    Reminds me of the film Sneakers when the Russian guy tells him "It wasn't us, we don't use prime numbers", i.e. it won't affect other methods of cryptography, like using matrices.

    Also, if you are cracking a code by finding primes you can cheat - you don't actually need to prove a number is prime, especially if it's a lot quicker to show that it is probably prime. One way was to test large numbers against as few as 30 factors; any that still looked prime could be checked as the solution. Overall you save time, because checking probable primes against hashes can be faster than finding numbers that are definitely prime.

    Also you can simply increase the key length - 128 bits --> 256 bits or more. This should be done anyway, since processing power doubles roughly every two years.
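The screening trick described in the post above - cheap trial division first, then a probable-prime test - can be sketched in Python. The small-prime list, the plain Fermat test and the round count are illustrative choices, not details from the post:

```python
import random

SMALL_PRIMES = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]  # cheap screen: a handful of small factors

def survives_trial_division(n):
    """Reject candidates that have an obvious small factor."""
    return all(n % p != 0 for p in SMALL_PRIMES if p < n)

def is_probable_prime(n, rounds=20):
    """Fermat test: for prime n, a**(n-1) == 1 (mod n) for every base a.
    Some composites can slip through, hence only 'probable'."""
    if n < 4:
        return n in (2, 3)
    return all(pow(random.randrange(2, n - 1), n - 1, n) == 1
               for _ in range(rounds))

def next_probable_prime(start):
    """Walk odd numbers, screening cheaply before the costlier test."""
    n = start | 1
    while not (survives_trial_division(n) and is_probable_prime(n)):
        n += 2
    return n

print(next_probable_prime(90))  # 91, 93 and 95 fall to trial division; 97 survives
```

A real key generator would use Miller-Rabin rather than the bare Fermat test, since Carmichael numbers fool the latter for every base coprime to them.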


  • Moderators, Social & Fun Moderators Posts: 10,501 Mod ✭✭✭✭ecksor


    uberwolf wrote:
    http://www.guardian.co.uk/uk_news/story/0,,1298728,00.html

    I've a basic enough understanding of cryptography, would this really have that massive an impact?

    Proofs of the Riemann Hypothesis pop up every day if you look in the right places. None have been accepted so far, so they're treated with a lot of scepticism for obvious reasons.

    To answer your question in the thread title: for the purposes of the numbers you might use in a pair of RSA keys, I think the hypothesis has been verified up to a bound large enough that we can assume it holds for the primes we use in such computations (look for Andrew M. Odlyzko and zeros of the zeta function on Google for the latest bound), but I'm not quite sure how to verify that at the minute.

    Anyway, nobody knows what impact it will have. That will depend on what new insights or techniques arise from the proof itself.


  • Moderators, Social & Fun Moderators Posts: 10,501 Mod ✭✭✭✭ecksor


    Just to nitpick a couple of points here.
    Also, if you are cracking a code by finding primes you can cheat - you don't actually need to prove a number is prime, especially if it's a lot quicker to show that it is probably prime. One way was to test large numbers against as few as 30 factors; any that still looked prime could be checked as the solution. Overall you save time, because checking probable primes against hashes can be faster than finding numbers that are definitely prime.

    I think Fermat's primality test is more applicable to key generation. Perhaps you mean Fermat's factorisation algorithm and applications to modern sieves.
    Also you can simply increase the key length - 128 bits --> 256 bits or more. This should be done anyway, since processing power doubles roughly every two years.

    Careful!! 128 and 256 are demonstrably insecure in the context of RSA. 1024 seems standard with more and more recommending 2048. 128 and 256 are more routine for symmetric ciphers such as AES where the security should be dependent only on the keysize, but strictly speaking such things are generally algorithm dependent.
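The point that small RSA moduli are demonstrably insecure can be shown directly: Pollard's rho (a standard factoring method, not one mentioned in the thread) splits a toy modulus built from two well-known primes almost instantly, while the same attack on a 1024-bit modulus is hopeless:

```python
import math
import random

def pollard_rho(n):
    """Pollard's rho: returns a nontrivial factor of composite n."""
    if n % 2 == 0:
        return 2
    while True:
        x = y = random.randrange(2, n)
        c = random.randrange(1, n)
        d = 1
        while d == 1:
            x = (x * x + c) % n          # tortoise: one step
            y = (y * y + c) % n          # hare: two steps
            y = (y * y + c) % n
            d = math.gcd(abs(x - y), n)
        if d != n:                       # d == n means an unlucky cycle; retry
            return d

# Toy "RSA modulus": product of two well-known primes (illustrative only).
p, q = 104729, 1299709
n = p * q
f = pollard_rho(n)
print(sorted([f, n // f]))  # the two primes, recovered almost instantly
```

Its expected running time grows with the square root of the smallest prime factor, which is one reason real RSA moduli use primes of hundreds of bits.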


  • Moderators, Recreation & Hobbies Moderators, Science, Health & Environment Moderators, Technology & Internet Moderators Posts: 94,366 Mod ✭✭✭✭Capt'n Midnight


    ecksor wrote:
    Careful!! 128 and 256 are demonstrably insecure in the context of RSA. 1024 seems standard with more and more recommending 2048. 128 and 256 are more routine for symmetric ciphers such as AES where the security should be dependent only on the keysize, but strictly speaking such things are generally algorithm dependent.
    It's not so long ago that anything better than 64-bit encryption was subject to an export ban in the US, and now 128 bits is considered OK on most secure sites.

    It depends on how long the data needs to remain secret. If you have live stock market quotes, they are worthless after 15 minutes, but many of your personal details would need to be hidden for a lot longer. So you would also have to factor in the speed of future computers when deciding on key length.

    The point was increasing the key length is one way to help counter the future growth in speed of computers.
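That trade-off can be written down as a deliberately naive model using only the thread's own "doubles every two years" rule of thumb - each doubling of attacker speed halves the brute-force work, i.e. costs one bit of effective key length:

```python
def effective_bits(initial_bits, years, doubling_period=2.0):
    """Effective symmetric key strength after some years, assuming attacker
    compute doubles once per period (one doubling = one bit of strength lost)."""
    return initial_bits - years / doubling_period

# A 128-bit symmetric key, if the secret must survive 20 years:
print(effective_bits(128, 20))  # 118.0 effective bits under this naive model
```

The model ignores algorithmic breakthroughs entirely, which, as the rest of the thread notes, are usually the bigger risk.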


  • Moderators, Social & Fun Moderators Posts: 10,501 Mod ✭✭✭✭ecksor


    You're missing the point. RSA isn't a symmetric cipher. When was the last time you saw a 128 or 256 bit RSA key? Even 512 and 768 bit have disappeared. RSA's security isn't only dependent upon the keysize in the way that a good symmetric cipher's security is. Comparing symmetric and asymmetric keysize security in that way is fundamentally meaningless.


  • Moderators, Recreation & Hobbies Moderators, Science, Health & Environment Moderators, Technology & Internet Moderators Posts: 94,366 Mod ✭✭✭✭Capt'n Midnight


    http://mathworld.wolfram.com/news/2002-08-07/primetest/
    Faster primality testing does not pose any immediate risk to the security of electronic communication. However, it does open up the possibility for greatly speeding up mathematical computation in many areas of number theory. It is too early to say whether the new algorithm can be implemented in such a way as to be competitive with fast probabilistic methods. But as a method of authoritatively distinguishing probable primes (primes that appear to be prime based on some test or sets of tests but for which this cannot be rigorously established) from actual primes, it may already be one of the fastest algorithms in town!

    http://www.woodmann.com/crackz/Tutorials/Rsa.htm
    (I've seen the maths behind this and it's not really for me :-) ). Essentially, in sieving you add the log of the prime at every such regular distance through initially zeroed memory, for each prime in the factor base. When that's done you pick out places where the accumulation exceeds a threshold; that's where you find quadratic residues divisible by lots of small primes, hence these are more likely to be double-partial, partial or full relations.
    I won't pretend to understand that fully, but finding lots of candidate primes would be useful in factoring?
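The log-accumulation trick in the quoted passage can be illustrated on plain integers. A real quadratic sieve works on quadratic residues and also credits prime powers; this sketch (interval, factor base and threshold all made up for illustration) only shows how adding log p at every multiple of p flags numbers built from small primes:

```python
import math

def smooth_candidates(start, length, factor_base, threshold_frac=0.7):
    """For each prime p, add log(p) at every position divisible by p in an
    initially zeroed array; positions whose accumulated logs come close to
    log(value) are likely smooth over the factor base."""
    acc = [0.0] * length
    for p in factor_base:
        i = (-start) % p                 # first index in the window divisible by p
        while i < length:
            acc[i] += math.log(p)
            i += p
    return [start + i for i, a in enumerate(acc)
            if a >= threshold_frac * math.log(start + i)]

print(smooth_candidates(100, 30, [2, 3, 5, 7]))  # picks out e.g. 105 = 3*5*7
```

Note the sieve never divides anything: one pass of cheap additions replaces trial division across the whole interval, which is the point of the technique.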


  • Moderators, Social & Fun Moderators Posts: 10,501 Mod ✭✭✭✭ecksor


    I'm not well up enough on factoring to say how much, if any, it would help, but it would depend on the algorithm being considered. In any case, you quote yourself that "Faster primality testing does not pose any immediate risk to the security of electronic communication." As the factoring algorithms get better we increase the bitsize of the keys, as you mentioned.


  • Registered Users, Registered Users 2 Posts: 7,581 ✭✭✭uberwolf


    so the crux is that most cryptanalysis already presumes Riemann to be true, so its being proved would have no significant implications.

    I'll come clean: my interest is because I'm looking for a masters thesis topic, and thought 'twould be interesting to do one on the implications of cryptology suddenly becoming defunct. E-Commerce masters, btw. No meltdown then - good I guess, but no thesis topic yet :(;)

    cheers


  • Moderators, Recreation & Hobbies Moderators, Science, Health & Environment Moderators, Technology & Internet Moderators Posts: 94,366 Mod ✭✭✭✭Capt'n Midnight


    Could you do one on what the probable effect would be?
    Or would it be worth proposing a computer virus that would work something like SETI@home, but using even more zombied PCs than the US government could, and instead of looking for one key, look for any match on any of the zombied PCs (or does this only work with hashes)?

    You could use virus infection rates to show it could happen - or propose that perhaps more than one of the 5,000 people working on Windows 2000 (CIA/mob/Israel/CIS etc.) put in a cryptographic easter egg. How strong would encryption have to be to keep that lot away for the lifetime of the data? e.g. my credit card number is still the same as in 2000 - millions of years of computer time...


  • Moderators, Social & Fun Moderators Posts: 10,501 Mod ✭✭✭✭ecksor


    uberwolf wrote:
    I'll come clean: my interest is because I'm looking for a masters thesis topic, and thought 'twould be interesting to do one on the implications of cryptology suddenly becoming defunct. E-Commerce masters, btw. No meltdown then - good I guess, but no thesis topic yet :(;)

    I think you could adapt that idea slightly and perhaps do an interesting study.

    Lots of legacy equipment (such as banking systems, ATMs etc.) in the US was built around DES and wasn't so easily changed after DES was no longer considered adequate. Nowadays we have protocols such as SSL/TLS which are algorithm-agnostic to a certain extent, but if I'm not mistaken there are many implementations of such protocols which don't have a fallback to another algorithm in the case that the primary algorithm (by algorithm here I mean a primitive such as a hash function, or a symmetric or asymmetric cipher used for encryption of traffic data or for authentication/key exchange respectively) is suddenly and drastically broken. Perhaps it is feasible to analyse how well (or not) the e-commerce world is prepared for the sudden breaking of given algorithms in the case of such protocols. I can't find a good reference for the notion that protocols are sometimes implemented in a shortsighted fashion like that, so I can't guarantee that I didn't dream it up somewhere.


  • Closed Accounts Posts: 3,354 ✭✭✭secret_squirrel


    It's an interesting point Cap makes - how does the available processing power from something like SETI@home or Folding@home compare to the direct computing power that someone like the US would be able to bring to bear on the decryption of a single message? Are they even in the same ballpark? Assuming that the US has something equal to, or 5-10 times better than, the current supercomputer top 10.


  • Closed Accounts Posts: 59 ✭✭crashedmind


    uberwolf wrote:
    so the crux is that most cryptanalysis already presumes Riemann to be true, so its being proved would have no significant implications.

    I'll come clean: my interest is because I'm looking for a masters thesis topic, and thought 'twould be interesting to do one on the implications of cryptology suddenly becoming defunct. E-Commerce masters, btw. No meltdown then - good I guess, but no thesis topic yet :(;)

    cheers
    Another such "cryptographic foundation rocking" candidate that's being mooted is Quantum computing.
    The Riemann hypothesis (if proven) would allow primes/keys to be found faster for a constant amount of computing power.
    Quantum computing offers significantly increased computing power regardless of the method used.

    But then, if quantum computing becomes ubiquitous, the encryption algorithms and key sizes will probably be increased such that the speedup from quantum computing is negated.


  • Registered Users, Registered Users 2 Posts: 7,581 ✭✭✭uberwolf


    crashedmind wrote:
    Another such "cryptographic foundation rocking" candidate that's being mooted is Quantum computing.
    The Riemann hypothesis (if proven) would allow primes/keys to be found faster for a constant amount of computing power.
    Quantum computing offers significantly increased computing power regardless of the method used.

    But then, if quantum computing becomes ubiquitous, the encryption algorithms and key sizes will probably be increased such that the speedup from quantum computing is negated.

    What I was 'hoping' for was a "current crypto goes **** up, how long till we have quantum on board" kind of spiel - implications for all transactions and the destruction of trust...

    A little simplistic on reflection. What the lads have said, I think, is that Riemann is assumed to be true for all cryptanalysis anyway, so its being proved will be of no consequence.


  • Moderators, Social & Fun Moderators Posts: 10,501 Mod ✭✭✭✭ecksor


    I've seen it written in places that Quantum Computing would render Public Key Cryptography useless, but I don't know enough about it to spot if that's exaggeration or guess work (like above) or an accurate assessment.


  • Moderators, Recreation & Hobbies Moderators, Science, Health & Environment Moderators, Technology & Internet Moderators Posts: 94,366 Mod ✭✭✭✭Capt'n Midnight


    ecksor wrote:
    I've seen it written in places that Quantum Computing would render Public Key Cryptography useless, but I don't know enough about it to spot if that's exaggeration or guess work (like above) or an accurate assessment.
    Maybe I'm getting mixed up, but my understanding was that quantum could be used to create a link that you could not eavesdrop on undetected (something about not being able to split photons), so you would have a true private network. So maybe it would make public keys obsolete rather than easily breakable.

    Anyway, I'm not too sure I'd trust quantum computers, due to the probability of background radiation affecting the results, unless they use lots of forward error correction or at least checksums on every process.


  • Closed Accounts Posts: 59 ✭✭crashedmind


    Quantum computing holds the promise of faster computing and more storage, therefore making existing algorithms easier to crack.

    Quantum cryptography is used to secure communications across a link, as you've described.


  • Registered Users, Registered Users 2 Posts: 849 ✭✭✭jwt


    Quantum computing allows a bit to be in both states at once, i.e. both a 1 and a 0 at the same time.

    Where it gets interesting is that a multi-bit quantum computer can be in all possible states at once.

    A 4-bit quantum computer could process all combinations from 0000 to 1111 in one step (clock cycle, if you will). It wouldn't quite in reality, but it would be as near as makes no difference.

    In theory a 128 bit quantum computer will "crack" a simplistic 128 bit encryption in one step.

    Once quantum computers exist current cryptography methods are defunct.

    The worrying bit is the amount of money America is pumping into research in this area. Who's to say that quantum computers don't already exist, albeit in a limited form?


    John


  • Moderators, Recreation & Hobbies Moderators, Science, Health & Environment Moderators, Technology & Internet Moderators Posts: 94,366 Mod ✭✭✭✭Capt'n Midnight


    Looks like we will have to go back to using code words or one time pads then..


  • Registered Users, Registered Users 2 Posts: 4,660 ✭✭✭Gavin


    Quote from hego on a previous thread on the same topic:
    quantum computing is not the "Holy Grail" of cryptanalysis. It's true that quantum computing can undertake massively parallel operations, but the problem lies in extracting the correct information. There are two main types of cryptographic algorithms, symmetric and asymmetric. If a quantum computer were built tomorrow, all asymmetric algorithms based on the difficulty of factoring a composite number composed of two large prime numbers (aka RSA) would be broken. This is because of Shor's algorithm, which can break the above in polynomial time. Symmetric algorithms, such as 3-DES, are invulnerable to this approach. Grover's algorithm can search an unstructured database with a quadratic speedup, but this would probably be too slow to break a symmetric algorithm with a large key-space. It could only be used to speed up the search for keys.

    Gav
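For context on why Shor's algorithm singles out RSA-style systems: factoring reduces classically to finding the period of a^x mod n, and only that period-finding step needs a quantum computer. A toy version, with the period found by brute force (exactly the part Shor's algorithm would accelerate; the moduli below are illustrative):

```python
import math

def find_period(a, n):
    """Smallest r > 0 with a**r == 1 (mod n), by brute force.
    This is the step a quantum computer performs in polynomial time."""
    x, r = a % n, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n, a):
    """Classical reduction from factoring n to period finding."""
    g = math.gcd(a, n)
    if g > 1:
        return g                  # lucky guess: a already shares a factor
    r = find_period(a, n)
    if r % 2 == 1:
        return None               # odd period: try another a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None               # trivial square root: try another a
    return math.gcd(y - 1, n)

print(shor_factor(15, 7))  # period of 7 mod 15 is 4, yielding the factor 3
```

Symmetric ciphers have no comparable periodic structure to exploit, which is why Grover's square-root speedup is the best known generic quantum attack on them.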

