Podcast
Root Causes 36: Quantum Apocalypse - The Search for Quantum Resistant Crypto


Hosted by Tim Callan, Chief Compliance Officer, and Jason Soroko, Fellow
Original broadcast date: September 3, 2019
Finding the new quantum-resistant cryptography we will need to replace RSA and ECC is a difficult task requiring the coordinated effort of academics, industry, and government. NIST has stepped in to lead this volunteer community. Join us to learn about this project to discover and vet going-forward crypto candidates, where we stand in the process, and where we go from here.
Podcast Transcript
Lightly edited for flow and brevity.
Here’s the good news. There is work going on right now to get us toward a quantum-resistant algorithm or algorithms that will be the future algorithms for what we do today, which is to issue certificates for SSL/TLS authentication and encryption, all those good things we take for granted right now. RSA-2048 and Elliptic Curve Cryptography (ECC), for example, are very well-known, very well-used, very mature cryptographic algorithms. Unfortunately, we’re not at that maturity level yet with quantum-resistant algorithms. That’s what NIST is currently helping with.
But a year later (it took a year, Tim), in December 2017, sixty-nine of those submissions were declared complete and proper for submission into what we’ll call Round One.
I think we’re down now to twenty-six algorithms currently being evaluated in Round Two. They’ve advanced to what NIST is calling the post-quantum crypto semifinals. Seventeen of those are public-key encryption and key-agreement algorithms, and nine are digital signature algorithms.
Now Round Two is going to be a lot more about implementation capability, so I’m going to rhyme off three things here.
Number one, obviously, is the size of the encryption keys and signatures. If they’re just too huge, they’re difficult to implement. So even if an algorithm is strong and resistant to attack, its keys and signatures still have to be manageable.
Number two is the time required to encrypt and decrypt on each end of a communication channel, to sign messages, to verify signatures, and so on.
And number three is the total amount of traffic sent over the wire to complete the encryption or decryption.
Those are the three most important things in Round Two, but we’re also going to continue to kick the tires as hard as possible and allow people to work out mathematical proofs of attack against these algorithms. Algorithms that are successfully attacked are obviously going to be moved to the back.
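Those three criteria are easy to make concrete. Below is a minimal sketch in Python that measures the same properties, key and signature size, sign/verify time, and a rough count of bytes that would cross the wire, for today’s RSA-2048 and ECDSA P-256 as a baseline. It assumes the third-party "cryptography" package is installed, and it is not how NIST benchmarks the candidates; it just shows the kind of numbers the evaluators compare.

# Rough sketch of the three Round Two criteria, measured against today's
# RSA-2048 and ECDSA P-256 as a baseline. Not a NIST benchmark.
# Assumes the third-party "cryptography" package is installed.
import time

from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import ec, padding, rsa

MESSAGE = b"example TLS handshake transcript"

def measure(name, private_key, sign, verify):
    public_bytes = private_key.public_key().public_bytes(
        serialization.Encoding.DER,
        serialization.PublicFormat.SubjectPublicKeyInfo,
    )
    start = time.perf_counter()
    signature = sign(private_key)
    sign_ms = (time.perf_counter() - start) * 1000
    start = time.perf_counter()
    verify(private_key.public_key(), signature)
    verify_ms = (time.perf_counter() - start) * 1000
    # Criterion 1: key/signature size. Criterion 2: time to sign and verify.
    # Criterion 3: traffic on the wire, roughly public key + signature bytes.
    print(f"{name}: pubkey {len(public_bytes)} B, signature {len(signature)} B, "
          f"sign {sign_ms:.2f} ms, verify {verify_ms:.2f} ms, "
          f"on-the-wire {len(public_bytes) + len(signature)} B")

rsa_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
measure(
    "RSA-2048",
    rsa_key,
    lambda k: k.sign(MESSAGE, padding.PKCS1v15(), hashes.SHA256()),
    lambda pub, sig: pub.verify(sig, MESSAGE, padding.PKCS1v15(), hashes.SHA256()),
)

ec_key = ec.generate_private_key(ec.SECP256R1())
measure(
    "ECDSA P-256",
    ec_key,
    lambda k: k.sign(MESSAGE, ec.ECDSA(hashes.SHA256())),
    lambda pub, sig: pub.verify(sig, MESSAGE, ec.ECDSA(hashes.SHA256())),
)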
So I don’t think NIST has ever said, “Look, if some new idea gets dreamt up, we won’t consider it in a third round.” That’s just not the case. In fact, what you’re seeing, and I think this is what NIST is encouraging, is that a lot of people who were in the first round had similar ideas, but one of them may have had, you know, a more efficient key length and another might have had stronger security overall, and they’ve been asked to merge. Which is an interesting concept.
Especially because maybe there are a handful left at the end of the day that have all those properties we want and look implementable. Those are the ones that end up very, very shortlisted.
Now all of a sudden we’ve got this highly visible, very important need for a new set of algorithms where whatever we start implementing is going to have crosshairs on it from the day it shows up in production systems, and it feels like all of this has to happen lickety-split.
Dustin Moody over at NIST has said that a very wide range of mathematical ideas are represented by what he’s calling three different families of algorithms: lattice-based, code-based, and multivariate-based. There are also a few miscellaneous types.
I think what we’re going to see, Tim, is this: you just said RSA took a long time, ECC took a long time, and each of them would be considered its own class of algorithm. NIST is keeping at least three distinct groups, or more, like three plus a miscellaneous list. In his words, I think that’s the hedge against the possibility that if someone breaks one, we could still use another.
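To give one flavor of what “different mathematical ideas” means here: a representative hard problem behind the lattice-based family (it isn’t named in the episode, but it underlies many of the Round Two lattice candidates) is Learning With Errors, where an attacker sees noisy linear equations and has to recover the secret. A toy sketch in Python, with parameters far too small to be secure:

# Toy Learning With Errors (LWE) instance -- the flavor of problem behind the
# lattice-based family. Parameters are deliberately tiny and NOT secure.
import random

q = 97          # small modulus (real schemes use much larger parameters)
n = 4           # length of the secret vector
m = 8           # number of noisy equations published

secret = [random.randrange(q) for _ in range(n)]

# Public data: random matrix A and b = A*s + e (mod q), with small noise e.
A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]
noise = [random.choice([-1, 0, 1]) for _ in range(m)]
b = [(sum(a_ij * s_j for a_ij, s_j in zip(row, secret)) + e) % q
     for row, e in zip(A, noise)]

# Recovering `secret` from (A, b) is easy at this size but believed hard at
# cryptographic sizes, even for a quantum computer -- that hardness is what
# the lattice candidates build their encryption and signatures on.
print("public A:", A)
print("public b:", b)
print("secret s (kept private):", secret)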
I don’t think the mathematical proof of that came about until the solving of Fermat’s Last Theorem, believe it or not. It took a mathematical eureka moment before ECC was really considered rock solid. And even when it was considered rock solid, what we learned, because of some politics that happened towards the end of that process, is that you have to choose your parameters incredibly carefully or you can unravel it.
It’s the idea that X plus Y must be less than Z, or bad things happen. So how much time are we going to need for Y, and how much time can we allow for X? Because it seems like the clock is ticking.
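For anyone who wants that inequality written down: it’s the formulation usually credited to Michele Mosca at the University of Waterloo (the “whole concept” referenced a little later), and the readings of the variables below are the standard ones, since the excerpt doesn’t spell them out.

% Mosca's inequality: we stay safe only while x + y < z.
%   x = how long data encrypted today must remain secure (its shelf life)
%   y = how long it will take to migrate our systems to quantum-resistant crypto
%   z = how long until a quantum computer that breaks RSA/ECC exists
\[
  x + y < z
\]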
But gosh. As I’ve mentioned before, we have much more complexity in our systems. We have more devices. We have more kinds of data. We are rapidly moving into a world where everything needs and has crypto. When your refrigerator has to have crypto, that’s a very different world than it was even when ECC came in.
That whole concept comes out of the University of Waterloo, and whenever he’s presenting, one of his concluding slides is quite often an appeal to the commercial side of the world. Because obviously there are a ton of academic researchers working on this feverishly right now. The commercial side of the world can’t wait until NIST has finished its process completely.
We need to figure out what kind of tooling we need in order to spit out an SSL certificate that can do this. That’s absolutely the case. And that’s the reason why we’re talking about this, Tim. That’s the reason why, behind the scenes, we are heavily involved in understanding this, researching it, and getting ready for the retooling ourselves, so that none of this will come as a surprise when our commercial customers need it.
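To make the retooling concrete, here is a minimal sketch, assuming the third-party Python "cryptography" package, of how a self-signed certificate gets produced today with an ECDSA P-256 key. The hostname is hypothetical, and every place the key type and signature hash appear is a place that issuance tooling, and everything downstream that parses the certificate, would have to change to accommodate whichever algorithms come out of the NIST process.

# Minimal self-signed certificate using today's ECDSA P-256 -- a sketch of the
# kind of issuance tooling that would need retooling for a post-quantum
# signature algorithm. Assumes the third-party "cryptography" package.
import datetime

from cryptography import x509
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.x509.oid import NameOID

# The key type chosen here is exactly what a PQC migration would replace.
key = ec.generate_private_key(ec.SECP256R1())

subject = issuer = x509.Name(
    [x509.NameAttribute(NameOID.COMMON_NAME, "example.test")]  # hypothetical host
)
not_before = datetime.datetime.utcnow()

cert = (
    x509.CertificateBuilder()
    .subject_name(subject)
    .issuer_name(issuer)
    .public_key(key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(not_before)
    .not_valid_after(not_before + datetime.timedelta(days=90))
    .sign(key, hashes.SHA256())  # the signature algorithm a PQC winner would replace
)

print(cert.public_bytes(serialization.Encoding.PEM).decode())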

