Podcast
Root Causes 06: Quantum-Resistant Cryptography


Hosted by
Tim Callan
Chief Compliance Officer
Jason Soroko
Fellow
Original broadcast date
February 19, 2019
The pending cryptographic Quantum Apocalypse requires that we replace the hashing and encryption algorithms used throughout the internet, enterprise networks, mobile services, and popular devices. Join our experts to learn more about the requirements for quantum-resistant algorithms to survive the Quantum Apocalypse.
Podcast Transcript
Lightly edited for flow and brevity.
With Elliptic Curve Cryptography (ECC) and the math underlying it, it’s the same thing. There is no known way to quickly solve the discrete logarithm problem on the curve, and therefore ECC is something we use today. But perhaps even more interesting than that, Tim, is that one of the reasons we use ECC is that it has even smaller key sizes and is extremely efficient, which makes it viable for a classical computer to run. We have the exact same things going on with post-quantum algorithms.
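(Illustration, not from the episode: the key-size difference Jason mentions is easy to see in code. Below is a minimal Python sketch using the `cryptography` package, assuming the common pairing of a 3072-bit RSA key with the 256-bit P-256 curve as roughly comparable security levels.)

```python
# Compare key sizes at roughly equivalent security levels: a 3072-bit
# RSA key is commonly treated as comparable to a 256-bit ECC key.
from cryptography.hazmat.primitives.asymmetric import ec, rsa

rsa_key = rsa.generate_private_key(public_exponent=65537, key_size=3072)
ecc_key = ec.generate_private_key(ec.SECP256R1())

print("RSA modulus size:", rsa_key.key_size, "bits")    # 3072
print("ECC key size:", ecc_key.curve.key_size, "bits")  # 256
```

The much smaller key is a big part of why ECC is practical on constrained classical hardware, and the same efficiency question is now being asked of the post-quantum candidates.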
When considering a particular lattice-based algorithm, we can explain why it’s on the candidate list: mostly because it is very, very efficient, and key sizes and so on are fairly reasonable. Also interesting is that it can run on classical computers, and that’s an important point. I’ve heard a lot of people assume that the encryption itself will also need a quantum computer. That’s exactly what we’re trying to avoid.
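(Illustration, not from the episode: to make “runs on a classical computer” concrete, here is a toy sketch of Regev-style learning-with-errors encryption, the kind of hard problem behind several lattice-based candidates. The parameters are deliberately tiny and offer no real security; the point is that key generation, encryption, and decryption are all ordinary modular arithmetic.)

```python
# Toy Regev-style LWE encryption: everything here is plain modular
# arithmetic, so it runs fine on any classical computer.
import numpy as np

rng = np.random.default_rng(0)
n, m, q = 8, 20, 97                    # tiny demo parameters, not secure

# Key generation: b = A*s + e (mod q) with small error e.
s = rng.integers(0, q, n)              # secret key
A = rng.integers(0, q, (m, n))         # public random matrix
e = rng.integers(-1, 2, m)             # small noise in {-1, 0, 1}
b = (A @ s + e) % q                    # public key is (A, b)

def encrypt(bit):
    """Encrypt one bit by summing a random subset of the LWE samples."""
    r = rng.integers(0, 2, m)          # random 0/1 row selector
    u = (r @ A) % q
    v = (r @ b + bit * (q // 2)) % q
    return u, v

def decrypt(u, v):
    """The leftover noise r.e is small, so round to 0 or q/2."""
    d = (v - u @ s) % q
    return int(min(d, q - d) > q // 4)

for bit in (0, 1):
    u, v = encrypt(bit)
    assert decrypt(u, v) == bit
print("toy LWE round-trip OK")
```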
But I really do want to get through some of the other properties. It’s not just resistance against quantum computing and quantum algorithms.
Randomness is still part of the equation here. Randomness necessarily means that you’re going to have a whole lot of numbers. If you think about prime numbers, at what size does a classical computer start to have difficulty factoring? Well, you have to get into quite large numbers. For the smaller prime numbers there actually is a list of them, and if you just run through that list, you could probably crack an RSA key fairly rapidly. So security is going to be derived from a mild worst-case complexity assumption.
In other words, you need the cryptography to be hard to break for all of those random choices of keys. Those are also properties of the new candidate quantum-resistant algorithms that have not been fully vetted yet, because nobody has run through all the different mathematical possibilities.
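(Illustration, not from the episode: the small-primes point above is easy to demonstrate. A toy RSA modulus built from primes small enough to appear on a list falls immediately to trial division; the primes here are hypothetical demo values. Real RSA moduli use primes of roughly 1024 bits each, far beyond any feasible list.)

```python
# Cracking a toy RSA modulus by running through a list of small primes.
def small_primes(limit):
    """Build the 'list of small primes' by trial division."""
    primes = []
    for candidate in range(2, limit):
        if all(candidate % p for p in primes if p * p <= candidate):
            primes.append(candidate)
    return primes

n = 2017 * 2027                        # toy modulus from two small primes
for p in small_primes(3000):
    if n % p == 0:                     # found a factor: the key is broken
        print(f"cracked: {n} = {p} * {n // p}")
        break
```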
There’s also resistance to side-channel attacks: some sort of built-in countermeasure that makes it inherently difficult for attackers to use timing, power, or similar leakage as an attack vector. Does it have the capability of perfect forward secrecy? What’s its resistance to misuse?
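(Illustration, not from the episode: timing is the classic side channel. A naive equality check returns as soon as it hits a mismatched byte, so its running time leaks how much of a secret an attacker has guessed correctly; constant-time comparison is the standard countermeasure. A minimal Python sketch.)

```python
# A naive byte-by-byte compare exits early on the first mismatch,
# leaking information through timing. hmac.compare_digest examines
# every byte regardless, removing that timing signal.
import hmac

def naive_equal(a: bytes, b: bytes) -> bool:
    if len(a) != len(b):
        return False
    for x, y in zip(a, b):
        if x != y:                     # early exit = timing leak
            return False
    return True

secret = b"correct horse battery staple"
guess = b"correct horse battery stapl!"

print(naive_equal(secret, guess))          # False, but timing leaks
print(hmac.compare_digest(secret, guess))  # False, constant time
```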
You know, misuse resistance is one that came up with ECC. Those of you listening probably remember that one of the first sets of published curve parameters, the “good curves” published by NIST, were actually thrown away because they were found to be weak. ECC needed to be very carefully studied because of its potential for misuse in terms of the parameters that were chosen. And therefore, post-quantum candidates will be selected based on: “Is it resistant to that kind of misuse, or are there very few mathematical possibilities of that actually happening?”
It will take a really long time for each of these to be worked through, and it won’t necessarily be NIST itself. It’s like any scientific community coming together where there’s going to be a lead organization with leading research and then there will be peer review of this research to work through all the different mathematical proofs that will be necessary.
An important point that NIST has made is that in the end there will probably not be a single winner. This isn’t an NBA bracket. At the end of the day we’re probably going to end up with two or three or perhaps even more candidates that come out on top, and probably multiple algorithms in use. It’ll perhaps be similar to today, where we have RSA living shoulder to shoulder with ECC.
Wow. This is exciting. What would you say the time frame would be, if you had to guess, to get to the point where we’re feeling pretty good about what those go-forward algorithms are and they’re actually being implemented?
If you take a look at the sheer amount of work it’s going to take to determine what these properties actually are, and especially all the different mathematical proofs that have to be devised, we’re looking at probably a decade. That may line up fairly well with the point at which quantum computers sustain a number of qubits that threatens ECC and RSA.
I'm going to ask you a completely unfair question. What is your level of confidence that we will work out these questions – have one or more viable algorithmic choices and have them in place and in production – before we do have a quantum computing crisis?
I think this is maybe where we can end it, Tim: with kind of an exciting kernel. Things like fully homomorphic encryption, where Alice and Bob’s communications never leave an encrypted state, give you the ability to put Alice and Bob’s communication up into the cloud and take measurements off of it without ever having to fully decrypt it. That kind of property will potentially be possible down the road, because some of these candidate algorithms, including lattice-based algorithms, have properties that enable fully homomorphic encryption, which things like RSA and ECC do not.
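(Illustration, not from the episode: fully homomorphic encryption remains the frontier, but the basic idea of computing on ciphertexts without decrypting can be shown with the much older Paillier scheme, which is homomorphic for addition only. A toy Python sketch with insecurely small demo primes.)

```python
# Toy Paillier encryption: multiplying two ciphertexts mod n^2 yields
# an encryption of the SUM of the plaintexts, without ever decrypting.
import math
import random

p, q = 2017, 2027           # demo primes; real keys use ~1024-bit primes
n, n2 = p * q, (p * q) ** 2
lam = (p - 1) * (q - 1)     # a multiple of Carmichael's lambda(n)
g = n + 1                   # standard simple choice of generator
mu = pow(lam, -1, n)        # precomputed decryption constant

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    x = pow(c, lam, n2)
    return ((x - 1) // n * mu) % n

c1, c2 = encrypt(20), encrypt(19)
c_sum = (c1 * c2) % n2      # ciphertext "addition" in the cloud
print(decrypt(c_sum))       # 39, computed without decrypting c1 or c2
```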

