PQC Algorithm Selection
The first post-quantum cryptographic (PQC) algorithms have been selected to provide the protection society will need in a post-quantum world. While the current state of the art in quantum computing doesn’t pose an immediate threat to existing cryptographic technologies – IBM’s Eagle 127-qubit quantum processor still falls far short of what’s needed to break the popular RSA-3072 public-key cryptosystem – this is expected to change in the coming decades. The task of guiding us to the optimal algorithms has been taken up by the National Institute of Standards and Technology (NIST), an agency of the U.S. Department of Commerce. NIST has run previous cryptographic standardization efforts, including the selection of the symmetric-key standard AES and the SHA-3 cryptographic hash functions, which are now used as worldwide standards.
NIST PQC Algorithm Winners Announced
So far, four quantum-resistant algorithms have been selected in this group of winners: one algorithm for key establishment and three for secure digital signatures. Further algorithms considered for standardization are expected to be announced after another selection round. This staged process recognizes the wide range of use cases and situations where cryptography is used and provides alternatives should one algorithm prove vulnerable. The sole winner for post-quantum secure key encapsulation is CRYSTALS-Kyber, an algorithm developed by Charter of Trust members NXP and IBM, along with others, such as Arm. This algorithm builds its security foundation on problems from the mathematical domain of structured lattices: no efficient quantum algorithm is known for solving these problems (in contrast to the integer factorization problem, which secures digital signature schemes such as RSA today).
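At the protocol level, a key-encapsulation mechanism (KEM) has three operations: key generation, encapsulation (the sender derives a shared secret plus a ciphertext from the public key) and decapsulation (the receiver recovers the same secret with its private key). The toy Python sketch below illustrates this flow with a drastically simplified learning-with-errors scheme; the parameters, noise distribution and bit-by-bit encoding are illustrative assumptions only and bear no relation to the real, secure CRYSTALS-Kyber construction.

```python
import hashlib
import random

q, n = 3329, 8                      # toy parameters; real Kyber works with degree-256 polynomials

def small():
    return random.randint(-1, 1)    # tiny "noise" values keep decoding unambiguous

def keygen():
    A = [[random.randrange(q) for _ in range(n)] for _ in range(n)]
    s = [small() for _ in range(n)]                      # small secret vector
    e = [small() for _ in range(n)]
    b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q for i in range(n)]
    return (A, b), s                                     # public key, secret key

def encaps(pk, k=16):
    """Encapsulate k random bits; return ciphertexts and the derived shared key."""
    A, b = pk
    bits, cts = [], []
    for _ in range(k):
        m = random.randrange(2)                          # one shared-secret bit
        r = [small() for _ in range(n)]
        u = [(sum(A[i][j] * r[i] for i in range(n)) + small()) % q for j in range(n)]
        v = (sum(b[i] * r[i] for i in range(n)) + small() + m * (q // 2)) % q
        bits.append(m)
        cts.append((u, v))
    return cts, hashlib.sha256(bytes(bits)).hexdigest()

def decaps(sk, cts):
    """Recover each bit: v - <s, u> lands close to 0 (bit 0) or q/2 (bit 1)."""
    bits = []
    for u, v in cts:
        d = (v - sum(sk[j] * u[j] for j in range(n))) % q
        bits.append(1 if q // 4 < d < 3 * q // 4 else 0)
    return hashlib.sha256(bytes(bits)).hexdigest()

pk, sk = keygen()
ciphertexts, sender_key = encaps(pk)
assert decaps(sk, ciphertexts) == sender_key             # both parties share the same key
```

With ternary noise and n = 8, the accumulated error is at most 17, far below q/4 ≈ 832, so decoding always succeeds; real schemes balance this noise margin against security.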
The Risks of Quantum Computing
For as long as there has been cryptography, there have been those studying cryptanalysis: the science of breaking it. Since the late 1970s, encryption has relied on peer-reviewed and widely understood algorithms that are mathematically exceptionally challenging to break with “classical” computing power, with key sizes and algorithms evolving as computing capabilities advanced. In 2001, for example, AES was selected in a previous NIST competition after it became clear that specific attack strategies could compromise DES in a matter of days using relatively inexpensive equipment.
As research institutes and industry have invested in quantum computing to tackle combinatorial problems in chemical and biological engineering, quantum algorithms have been proposed that could break current cryptography. Therefore, we find ourselves in a situation similar to the early 2000s, where it is a question of when, not if, this newfound computing capability will break our current public-key infrastructure.
The difference is that today there is significantly more at stake. Encryption is everywhere, protecting national secrets, banking transactions, website visits, automotive software updates and Internet of Things (IoT) sensor communication. Some applications will require data to be re-encrypted under a new standard. Others, such as IoT devices, may, in the worst case, need to be replaced with new PQC-capable hardware. And this is only part of the challenge: these applications almost always depend on key exchange, confidentiality, integrity and peer authentication.
The Post-Quantum Cryptography Landscape
The selection of PQC algorithms is the culmination of decades of considerable effort by the world’s leading cryptographers to devise and vet digital signature, key exchange, and encryption methods resistant to attacks by future quantum computers. This is an incredibly challenging task since quantum computing and quantum algorithms are still in their infancy, so it is not entirely clear how they will evolve.
The first task has been to find and optimize mathematical problems from new domains that remain intractable even for quantum computers. Another has been to consider the practical aspects of using the algorithms in the massive array of applications where cryptography is deployed. Algorithms with giant keys are one option, but these are unsuited to the simplest electronic systems, which have just a few kilobytes of memory. Then there are the practicalities of performing the necessary calculations: secure links on battery-operated devices need algorithms that can be implemented and executed using low-power (microwatt) approaches.
While working on the CRYSTALS-Kyber algorithm, the NXP and IBM teams found a solution that is operationally fast and uses comparatively small cryptographic keys, making it easy for two parties to exchange them. The algorithm is a key-encapsulation mechanism based on hard mathematical problems over so-called “module lattices”. When performing the necessary calculations in software, processors can rely primarily on additions, multiplications and number-theoretic transforms (a modular-arithmetic analogue of the Fourier transform), all of which are common in today’s applications. It can also be shown that some of the existing deployed cryptographic hardware accelerators can be reused to aid the computation of this new post-quantum secure algorithm.
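To illustrate why those transforms matter, the sketch below multiplies two polynomials with coefficients modulo a small prime, once by schoolbook convolution and once via a number-theoretic transform (NTT), and checks that the results agree. The parameters (q = 17, n = 8, root of unity 2) are toy values chosen for readability, not Kyber’s actual parameters, and the O(n²) transform stands in for the fast butterfly implementations used in practice.

```python
import random

q, n, w = 17, 8, 2                  # toy ring; 2 is a primitive 8th root of unity mod 17

def ntt(a, root):
    """Naive O(n^2) number-theoretic transform: a DFT over integers mod q."""
    return [sum(a[j] * pow(root, i * j, q) for j in range(n)) % q for i in range(n)]

def mul_ntt(a, b):
    """Multiply polynomials mod (x^n - 1): transform, multiply pointwise, invert."""
    fc = [(x * y) % q for x, y in zip(ntt(a, w), ntt(b, w))]
    n_inv, w_inv = pow(n, -1, q), pow(w, -1, q)
    return [(n_inv * c) % q for c in ntt(fc, w_inv)]

def mul_schoolbook(a, b):
    """Reference cyclic convolution mod (x^n - 1), O(n^2) multiplications."""
    c = [0] * n
    for i in range(n):
        for j in range(n):
            c[(i + j) % n] = (c[(i + j) % n] + a[i] * b[j]) % q
    return c

a = [random.randrange(q) for _ in range(n)]
b = [random.randrange(q) for _ in range(n)]
assert mul_ntt(a, b) == mul_schoolbook(a, b)
```

The payoff in practice is that the transform reduces polynomial multiplication from O(n²) to O(n log n) coefficient operations, all of them plain modular additions and multiplications that commodity processors, and existing accelerators, handle well.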
Building PQC Solutions
It has already been shown that today’s microcontrollers have the performance to support CRYSTALS-Kyber. This is a critical first step, since microcontrollers have significantly less performance and memory than the processors used in servers, laptops and smartphones, yet will remain the mainstay of all other Internet-connected electronic systems, from vehicles and sensors to bank cards and industrial equipment. Semiconductor vendors, such as NXP, are determining how much existing cryptographic technology, such as the accelerators deployed in today’s secure elements (SE) and trusted platform modules (TPM), can be reused.
Beyond implementing the cryptographic algorithms, there is also the issue of standardizing how systems communicate with one another when using them. This ranges from certificates to key exchange and defining secure methods that don’t leak information or provide backdoors for adversaries across a vast number of use cases.
Transitioning to PQC Algorithms
With the competition winners announced, many security experts will be keen to prepare for a PQC world – and this is definitely the correct attitude. Large-scale quantum computers, once online, will leave us with broken legacy keys that cannot be patched. Initially, it makes sense to audit where legacy keys and cryptographic algorithms are in use while the chosen algorithms become formal standards over the next two years. During that time, teams can start to explore the new algorithms, determine which are most appropriate to their use case, begin testing them and plan the appropriate migration path. Teams will also need to wait for PQC-capable hardware security devices to be developed and deployed into servers, laptops, smartphones and smartcards.
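As a starting point, such an audit can be as simple as mapping each deployed algorithm to its quantum risk. The sketch below is a minimal illustration: the risk labels follow commonly cited guidance (Shor’s algorithm breaks RSA and elliptic-curve schemes, while Grover’s algorithm roughly halves symmetric security margins), and the inventory entries and system names are purely hypothetical examples.

```python
# Commonly cited quantum impact per algorithm family (illustrative, not exhaustive).
QUANTUM_RISK = {
    "RSA-2048":   "replace: broken by Shor's algorithm",
    "RSA-3072":   "replace: broken by Shor's algorithm",
    "ECDSA-P256": "replace: broken by Shor's algorithm",
    "ECDH-P256":  "replace: broken by Shor's algorithm",
    "AES-128":    "review: Grover's algorithm halves effective key strength",
    "AES-256":    "keep: sufficient margin against known quantum attacks",
    "SHA-384":    "keep: sufficient margin against known quantum attacks",
}

def audit(inventory):
    """Map (system, algorithm) pairs to a migration recommendation."""
    return {system: QUANTUM_RISK.get(alg, "unknown: investigate")
            for system, alg in inventory}

# Hypothetical inventory of deployed systems.
report = audit([("vpn-gateway", "RSA-2048"),
                ("firmware-signing", "ECDSA-P256"),
                ("disk-encryption", "AES-256")])
for system, advice in report.items():
    print(f"{system}: {advice}")
```

A real audit would of course also capture key lifetimes, data sensitivity and protocol dependencies, but even a flat inventory like this makes the scale of the migration visible early.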
It is then the responsibility of Charter of Trust members, such as NXP and IBM, to keep the security community, engineering organizations and the wider public informed of the pending cryptography transition. Together, as an expert community, we are exceptionally well placed to spearhead the necessary campaigns and share our knowledge as we enter this exciting world of PQC.
If you would like to learn more about NXP’s and IBM’s development in post-quantum cryptography, please access the following links:
NXP The Emergence of Post-Quantum Cryptography
NXP Post-Quantum Cryptography
IBM Quantum Safe
IBM’s latest mainframe, z16