FREQUENTLY ASKED QUESTIONS
Post-Quantum Cryptography
and NIST PQC Standards
Here, we try to answer the most commonly asked questions on PQC and related topics.
FAQs on PQC
What are quantum computers?
Quantum computers are based on physical principles that are fundamentally different from those underlying today’s classical computing. At a large scale, they would be capable of performing certain tasks much faster than today’s computers. While prototypes of small quantum computers, often referred to as NISQ (Noisy Intermediate-Scale Quantum) computers, already exist, building large-scale reprogrammable ones is still at a very early research stage. There are many lines of research in quantum physics aimed at creating such quantum computers, but none of them is certain to succeed. Though their potential benefits are not fully known, they could clearly be significant. Thus, industry, governments, and academia around the world are devoting significant resources to research in quantum computing. For example, in 2021, the French government announced an investment of more than 1 billion euros in quantum technologies, including quantum computing [8]. A complete survey on the status of quantum computer development has been published by the BSI, the German counterpart of ANSSI [7].
Source: https://cyber.gouv.fr/en/publications/anssi-views-post-quantum-cryptography-transition
Quantum threat: what is the impact on our current digital infrastructures?
The security of the majority of digital infrastructures relies on public key cryptography (PKC), a technology that enables secure communications between entities, typically users or servers, that do not share any pre-established secret. More precisely, PKC serves two main functions: the establishment of protected channels (key establishment) and the authentication of digital information (including the authentication of parties inside a communication protocol through digital signatures). Today, these techniques are essentially based on two mathematical problems: the factorization of large numbers and the computation of discrete logarithms. Both are dimensioned to be virtually impossible to solve with our current computing resources and mathematical knowledge. For example, the well-known RSA public key algorithm relies on the factorization of large numbers.

These two fundamental problems will no longer be intractable if a large-scale quantum computer is built; thus, the security of currently deployed public key cryptography could collapse. Indeed, in 1994, P. Shor introduced a quantum algorithm [15] able to solve these problems. This algorithm cannot be performed on classical computers, but it could be run on large-scale quantum computers. To avoid any confusion with the existing prototypes of small-scale quantum computers, the NCSC, the British counterpart of ANSSI, introduced the term Cryptographically Relevant Quantum Computer, or CRQC (see the NCSC’s whitepaper on quantum-safe cryptography [12]). In other words, a CRQC is a quantum computer able to execute relevant instances of Shor’s algorithm and thus threaten today’s public key cryptography. The prototypes of quantum computers that exist today are far from the scale and stability required of a CRQC, and therefore they are currently not a threat to public key cryptography. Many research challenges in physics, engineering, and computer science must be overcome before scaling up to large quantum computers that can solve the factorization and discrete logarithm problems on which current PKC is based.

However, the threat of retroactive attacks cannot be ruled out. A family of attacks, called “store now, decrypt later” attacks, consists in storing today the data exchanged during a secure channel establishment, together with the messages encrypted over that channel, with the purpose of eventually decrypting them once a CRQC becomes available. This threat could be relevant in certain scenarios involving very sensitive information, e.g., classified information.

Furthermore, the quantum threat could also impact digital signatures: a CRQC could be used to forge signatures and mount impersonation attacks. Unlike the threat of retroactive “store now, decrypt later” attacks, this threat would only be effective if the signatures are generated at a time when a CRQC exists. Thus, signatures that are verified on the fly, as in authenticated channels, cannot be directly impacted before the existence of a CRQC. However, in the context of document signing, long-term validity is sometimes required for certain specific use cases, and such signatures could be compromised by the existence of a CRQC. The transition from pre-quantum signatures to post-quantum ones should therefore be tackled before the existence of any CRQC to avoid any a posteriori impersonation attack.
Source: https://cyber.gouv.fr/en/publications/anssi-views-post-quantum-cryptography-transition
Quantum threat: what about symmetric cryptography?
Symmetric cryptography, a different and complementary branch of cryptographic algorithms, could also be targeted by potential large-scale quantum computers. A generic quantum algorithm introduced by Grover in 1998 [11] quadratically speeds up the exhaustive search for secret keys in symmetric algorithms. Grover’s algorithm can also speed up certain attacks on hash functions called collision-finding attacks. These attacks also require the use of CRQCs, but for numerous algorithms, one can reasonably assume that they could be mitigated by adjusting the sizes of the hash outputs and keys (for instance, using 256-bit keys instead of 128-bit keys for the AES symmetric encryption mechanism, and 384-bit hash outputs for SHA-2 and SHA-3). Thus, the generic impact of Grover’s algorithm on symmetric cryptography is far more limited than the impact of Shor’s algorithm on PKC.
Source: https://cyber.gouv.fr/en/publications/anssi-views-post-quantum-cryptography-transition
Why should the quantum threat be taken into account today?
Because of the “store now, decrypt later” attack, the quantum threat should be taken into account before the question of whether a CRQC will ever be built is settled. Thus, a profound change in today’s public key cryptography towards quantum-resistant algorithms should be initiated globally to anticipate a possible collapse of our current cryptographic infrastructure. Even though protecting our current public key cryptography against this distant threat has a cost, researching alternative cryptographic solutions can be beneficial from another perspective. Indeed, beyond the quantum threat, cryptography is never infallible; weaknesses are found from time to time, even in cryptographic mechanisms implemented on classical computers. Thus, the discovery of a weakness impacting the security of hardware and/or software and requiring the replacement of algorithms cannot be totally ruled out. Nowadays, the public key cryptography used worldwide is close to a monoculture and would strongly benefit from the introduction of new alternative algorithms.
Source: https://cyber.gouv.fr/en/publications/anssi-views-post-quantum-cryptography-transition
Is quantum key distribution a solution?
Quantum Key Distribution (QKD), sometimes called quantum cryptography, is a way of enabling secure communications that is not vulnerable to classical or quantum computers, by using quantum channels. Nevertheless, this technique does not provide a complete functional equivalent to public key cryptography and has limited applications due to the need for a dedicated communication infrastructure and the lack of real routing capabilities. More information on the ANSSI position can be found under reference [5]. As such, except for niche applications where QKD provides some extra physical security on top of algorithmic cryptography (and not as a replacement), ANSSI does not consider it a suitable countermeasure to mitigate the quantum threat.
Source: https://cyber.gouv.fr/en/publications/anssi-views-post-quantum-cryptography-transition
What is post-quantum cryptography?
Post-quantum cryptography (PQC) is a family of cryptographic algorithms, including key establishment and digital signatures, that offer conjectured security even against an attacker equipped with a quantum computer. Post-quantum algorithms can be executed on classical computers with classical communication channels and thus can be deployed in existing infrastructures, unlike QKD. Besides, these algorithms are not only for use after a CRQC is built; they can also be deployed in anticipation. For ANSSI, PQC represents the most promising avenue to thwart the quantum threat. The international research effort on post-quantum cryptography accelerated in 2015 after the NSA’s release advising to “shift to quantum-resistant cryptography in the near future” [9]. In 2017, the National Institute of Standards and Technology (NIST) started a standardization campaign to define standard post-quantum public key algorithms (key establishment and signatures). At the time of writing, this process is still ongoing [13] and is currently in its third round. Standards are expected to be published within the next one to four years. Unlike other standardization campaigns that ended with a single winner, the post-quantum campaign will result in several standards for different applications. This standardization process has acted as a catalyst, allowing a strong involvement of the international cryptography research community and focusing analysis efforts on a restricted number of candidate algorithms while preserving the diversity of the underlying problems. This process has also broadened the analysis to various implementation use cases such as embedded components.
Source: https://cyber.gouv.fr/en/publications/anssi-views-post-quantum-cryptography-transition
What are the different post-quantum algorithms?
The different families of post-quantum public key candidate algorithms are defined by the mathematical structures on which they are built. Nowadays, post-quantum public key algorithms are mostly based on structured or unstructured Euclidean lattices, error-correcting codes, isogenies between elliptic curves, multivariate systems, and hash trees. While the underlying mathematical problems were introduced in the last century, the PQC algorithms themselves are relatively recent. They offer various trade-offs between key size, signature or ciphertext size, computational complexity, and security assurance. A technical survey of the algorithms and the underlying mathematical problems can be found in [10].
Source: https://cyber.gouv.fr/en/publications/anssi-views-post-quantum-cryptography-transition
Are the future NIST standards mature enough to be implemented in security products?
Beyond the NIST objective of producing standards, the past three rounds of the NIST standardization campaign have provided a variety of algorithms along with security analyses. Although this new post-quantum toolbox may seem handy for developers, the maturity level of the post-quantum algorithms submitted to the NIST process should not be overestimated. Many aspects lack cryptanalytic hindsight or are still research topics, e.g., the analysis of the difficulty of the underlying problems in the classical and quantum computation models, dimensioning, the integration of schemes into protocols, and, more importantly, the design of secure implementations. This situation will probably persist for some time after the publication of the NIST standards.
Acknowledging the immaturity of PQC is important: ANSSI will not endorse any direct drop-in replacement of currently used algorithms in the short/medium term. However, this immaturity should not serve as an argument for postponing deployments. ANSSI encourages all industries to initiate, in the coming months, a gradual overlap transition in order to progressively build trust in the post-quantum algorithms and their implementations while ensuring no security regression as far as classical (pre-quantum) security is concerned.
Source: https://cyber.gouv.fr/en/publications/anssi-views-post-quantum-cryptography-transition
How to transition smoothly from pre-quantum to post-quantum algorithms?
A hybrid mechanism (key establishment or signature) combines the computations of a recognized pre-quantum public key algorithm and an additional algorithm conjectured to be post-quantum secure. The mechanism thus benefits both from the strong assurance of the first algorithm’s resistance against classical attackers and from the conjectured resistance of the second algorithm against quantum attackers. Certain hybrid protocols are in standardization processes, such as [17] for TLS 1.3 or [18] for IKEv2. More generally, for key establishment, one can perform both a pre-quantum and a post-quantum key establishment and then combine the two results, e.g., using a Key Derivation Function (KDF). Alternatively, for some specific applications, one may use a KDF on a pre-shared key and a shared key obtained from a pre-quantum scheme. For signature schemes, hybrid signatures can be achieved by concatenating the signatures issued by a pre-quantum and a post-quantum scheme and requiring that both signatures be valid in order for the hybrid signature to be valid. Even though hybridization is a relatively simple construction, ANSSI emphasizes that its role in cryptographic security is crucial, and it will be mandatory for phases 1 and 2 of the transition described in the ANSSI document. In addition, the implementation security of the hybridization technique should also be taken into consideration. Given that most post-quantum algorithms involve message sizes much larger than those of current pre-quantum schemes, the extra performance cost of a hybrid scheme remains low in comparison with the cost of the underlying post-quantum scheme. ANSSI believes that this is a reasonable price to pay for guaranteeing additional pre-quantum security, at least equivalent to that provided by currently standardized pre-quantum algorithms.
Source: https://cyber.gouv.fr/en/publications/anssi-views-post-quantum-cryptography-transition
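As one possible illustration of the key-establishment combiner described above (a minimal sketch, not a prescribed construction), the following Python snippet derives a session key from the concatenation of a classical X25519 shared secret and a post-quantum shared secret. It assumes the pyca/cryptography library; the post-quantum secret is shown as a placeholder, since it would come from whichever post-quantum KEM the protocol actually uses.

```python
# Minimal sketch of a hybrid key-establishment combiner (illustrative only).
# Assumes the pyca/cryptography library is installed.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Pre-quantum part: a classical X25519 Diffie-Hellman exchange.
alice_priv = X25519PrivateKey.generate()
bob_priv = X25519PrivateKey.generate()
classical_secret = alice_priv.exchange(bob_priv.public_key())

# Post-quantum part: placeholder for the shared secret output by a
# post-quantum KEM (e.g. an ML-KEM encapsulation); replace in practice.
pq_secret = b"\x00" * 32

# Combine both secrets through a KDF so that the session key stays secure
# as long as at least one of the two key establishments remains unbroken.
session_key = HKDF(
    algorithm=hashes.SHA384(),
    length=32,
    salt=None,
    info=b"illustrative hybrid key derivation",
).derive(classical_secret + pq_secret)
```

Feeding the concatenated secrets into a single KDF call is only one simple combiner; standardized hybrid protocols such as the TLS 1.3 proposal cited above define their own precise combination rules.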
What is cryptoagility?
As detailed in the ANSSI document, the deployment of hybrid PQC is not a mandatory feature as of today. However, ANSSI encourages progress towards cryptoagility as much as possible for future products. More precisely, a product is said to be crypto-agile if it includes the possibility of updating its cryptographic algorithms without recalling the product or replacing it with a new one. The quantum threat makes cryptoagility particularly relevant, and beyond this threat, classical attacks may also evolve and make cryptographic mechanisms or key lengths obsolete. In practice, cryptoagility also means that, in addition to the possibility of patching, products should include extra capacity to accommodate potential updates in order to react to upcoming cryptographic recommendations and standard updates. Even though updates of cryptographic algorithms should be much less frequent than patches, the cryptoagility feature is non-trivial to implement because of the need for backward compatibility and the potential requirement for additional security visas if the product is certified. However, as the motivation for cryptoagility is highly relevant nowadays, ANSSI believes that cryptoagility features should be taken into account during the benefit/risk analysis of future products.
Source: https://cyber.gouv.fr/en/publications/anssi-views-post-quantum-cryptography-transition
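At the software level, one common way to work towards cryptoagility is to keep application code decoupled from any specific algorithm, so that the algorithm can be swapped by configuration or a software update. The sketch below is a hypothetical illustration in Python; all names are invented for the example.

```python
# Hypothetical sketch of an algorithm-agnostic signing interface: application
# code depends only on the abstract Signer and a registry keyed by algorithm
# identifiers, so the concrete algorithm can change without touching callers.
from abc import ABC, abstractmethod


class Signer(ABC):
    @abstractmethod
    def sign(self, message: bytes) -> bytes: ...

    @abstractmethod
    def verify(self, message: bytes, signature: bytes) -> bool: ...


_REGISTRY: dict[str, type[Signer]] = {}


def register(name: str):
    """Class decorator that registers a Signer implementation under a name."""
    def wrapper(cls: type[Signer]) -> type[Signer]:
        _REGISTRY[name] = cls
        return cls
    return wrapper


def get_signer(algorithm: str) -> Signer:
    """Instantiate the signer selected by configuration, e.g. 'ecdsa-p256'."""
    return _REGISTRY[algorithm]()
```

With such an indirection in place, migrating to a post-quantum or hybrid signature scheme later becomes mostly a matter of registering a new implementation and changing the configured algorithm identifier.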
FAQs on adopting the NIST algorithms
Why is NIST’s finalization of PQC standards significant for organizations today?
NIST’s finalization of PQC standards is crucial because it provides a clear, trusted path for organizations to start transitioning to quantum-resistant cryptography, ensuring long-term security as quantum computing becomes more capable.
What exactly are the new NIST post-quantum cryptography standards?
The new NIST post-quantum cryptography standards are FIPS 203 (ML-KEM, a key-encapsulation mechanism), FIPS 204 (ML-DSA, a digital signature scheme), and FIPS 205 (SLH-DSA, a stateless hash-based signature scheme). These standards are designed to protect against the vulnerabilities that quantum computing could introduce to traditional cryptographic methods.
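To give a feel for what a FIPS 203-style key encapsulation looks like in code, the sketch below performs an ML-KEM-768 round trip. It assumes the open-source liboqs-python bindings (the oqs module) and a liboqs build that exposes the ML-KEM-768 identifier; it is an illustration, not a statement about any particular product's API.

```python
# ML-KEM (FIPS 203-style) key-encapsulation round trip, assuming liboqs-python.
import oqs

with oqs.KeyEncapsulation("ML-KEM-768") as receiver, \
     oqs.KeyEncapsulation("ML-KEM-768") as sender:
    public_key = receiver.generate_keypair()               # receiver publishes this
    ciphertext, secret_at_sender = sender.encap_secret(public_key)
    secret_at_receiver = receiver.decap_secret(ciphertext)
    assert secret_at_sender == secret_at_receiver          # both sides share a key
```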
How soon should organizations implement these new PQC standards?
It’s advisable to begin implementation as soon as possible, given the unpredictability of quantum advancements and the potential risks associated with delaying the transition.
What are the risks of relying solely on traditional cryptographic algorithms in the face of quantum threats?
Since the mid-1990s, we have known of Shor’s Algorithm, which is a theoretical threat to Public Key Cryptography algorithms. Currently, quantum computers do not have the capability to bring this threat to life. However, with the significant increase in government and industry investment in quantum computing, this gap is rapidly closing, turning this potential threat from science fiction into a looming reality. The window for safeguarding our digital future against such quantum capabilities is narrowing, urging immediate action and preparedness.
Equally urgent is the fact that today’s predominant Public Key Cryptography algorithms rest on similar mathematical foundations. This similarity poses a significant risk: should a new method of exploitation emerge in classical computing, it could compromise the security fabric of digital communication and our interconnected lives. Establishing a strong, new cryptographic standard and ensuring its widespread adoption is critical for protecting against both quantum and conventional threats on the horizon.
How do these new PQC standards impact existing PKI-based security systems?
The new PQC standards are designed to integrate with existing PKI systems, ensuring a smooth transition to quantum-resistant encryption without sacrificing interoperability or security.
Will the new PQC algorithms require significant changes to current IT infrastructure?
Yes, a considerable number of Information Systems and IT components must be updated to support the new PQC algorithms.
Some examples include:
Operating Systems and Browsers used on Workstations
Trusted Platform Module (TPM) on Laptops and Desktops
Application Servers and Server Components
Network components such as load balancers, IDS/IPS, and others
Cryptographic libraries
PKI and HSM modules
The PQC transition must be viewed as a global IT migration project rather than a piecemeal ad hoc approach. Organizations should start with an inventory of their cryptographic assets and conduct a risk assessment to define the priorities and a migration calendar.
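As a concrete (and deliberately simplified) illustration of what such an inventory can start from, the sketch below walks a directory of PEM-encoded certificates and records the key type, key size, and signature algorithm of each one. It assumes the pyca/cryptography library; the directory layout and chosen fields are illustrative, and a real inventory would also cover protocols, libraries, HSMs, and hard-coded keys.

```python
# Simplified starting point for a cryptographic inventory: list the public-key
# type and signature algorithm of every PEM certificate under a directory.
# Assumes the pyca/cryptography library; paths and fields are illustrative.
from pathlib import Path

from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import ec, rsa


def inventory(cert_dir: str) -> list[dict]:
    rows = []
    for pem in Path(cert_dir).glob("**/*.pem"):
        cert = x509.load_pem_x509_certificate(pem.read_bytes())
        key = cert.public_key()
        if isinstance(key, rsa.RSAPublicKey):
            key_info = f"RSA-{key.key_size}"
        elif isinstance(key, ec.EllipticCurvePublicKey):
            key_info = f"EC-{key.curve.name}"
        else:
            key_info = type(key).__name__
        rows.append({
            "file": str(pem),
            "subject": cert.subject.rfc4514_string(),
            "public_key": key_info,
            "signature_oid": cert.signature_algorithm_oid.dotted_string,
            "not_after": cert.not_valid_after.isoformat(),
        })
    return rows
```

The output of such a scan feeds directly into the risk assessment: certificates with long validity periods or protecting long-lived data are natural candidates for early migration.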
Can PQC algorithms be used alongside traditional cryptographic methods during the transition?
Yes, hybrid approaches can be implemented where PQC algorithms are used in conjunction with existing cryptographic methods, allowing a phased transition while maintaining current security levels.
How should organizations prioritize the transition to quantum-resistant cryptography?
Agencies advise starting by creating an inventory of the cryptographic assets owned by the organization and conducting a risk assessment to define priorities and a migration calendar. The three new FIPS algorithms, in combination with other upcoming standards, will then help improve the security of these assets and protect them against attacks based on quantum computers.
What are the expected challenges in transitioning to post-quantum cryptography?
Challenges include ensuring compatibility with existing infrastructure, managing the complexity of hybrid cryptographic environments, and staying ahead of evolving quantum threats while maintaining security and compliance.
What industries should prioritize the adoption of PQC standards?
Industries handling sensitive data, such as finance, healthcare, government, and critical infrastructure, should prioritize adopting PQC standards to protect against future quantum-based attacks.
Why is it crucial for organizations to adopt post-quantum cryptography now, even if quantum computers aren’t fully developed yet?
The “harvest now, decrypt later” strategy is a real threat. Rogue actors can intercept and store encrypted data today in order to decrypt it once quantum computing becomes available. This risk applies to data that must remain confidential over a long period: stakeholders should assess how long their data stays sensitive and whether it will still be relevant beyond 2030.
While code-signing signatures on software used in firmware updates are nearly impossible to forge today, attackers might be able to build new firmware/drivers with forged signatures in the future, leaving the device vulnerable to data theft, lateral intrusion into the network, espionage-related attacks, or use as part of a botnet.
Document Signing carries a longer-term risk; it is more of a race against time. If you don’t countersign or re-sign your documents before Quantum Computers arrive, an attacker could inject documents with forged signatures and falsify records.
How do FIPS 203, 204, and 205 differ in terms of their intended applications?
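FIPS 203 specifies ML-KEM (Module-Lattice-Based Key-Encapsulation Mechanism, derived from CRYSTALS-Kyber) and is intended for key establishment, i.e., agreeing on shared secret keys over untrusted networks, as in TLS. FIPS 204 specifies ML-DSA (derived from CRYSTALS-Dilithium) and is intended as the primary general-purpose digital signature standard, covering use cases such as certificates, code signing, and document signing. FIPS 205 specifies SLH-DSA (a stateless hash-based signature scheme derived from SPHINCS+), intended as a more conservative alternative to ML-DSA: its security relies only on hash functions, at the cost of larger signatures and slower signing.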
What is the significance of FIPS certification for these PQC algorithms?
It is a significant milestone that kickstarts the process of introducing the new algorithms into the global IT ecosystem, paving the way for all organizations to seamlessly migrate in the future. It establishes common standards for seamless interoperability between services and providers.
How does post-quantum cryptography affect compliance with current security regulations?
Adopting PQC standards ensures that organizations stay ahead of future regulatory requirements, as compliance standards are likely to evolve to include quantum-safe encryption in response to emerging threats.
When will Quantum Computers be available to break today’s Cryptography?
Estimates for when a cryptographically relevant quantum computer might arrive vary widely within the community, ranging from 2027 to 2040 (or even never); the current general estimate is around 2030. Significant progress is still needed to build a quantum computer of sufficient size and quality to effectively run Shor’s Algorithm, which has the potential to break today’s Public Key Cryptography, such as RSA and ECC.
What are the risks of Symmetric Cryptography?
The Grover algorithm leverages quantum computation to speed up attacks on symmetric cryptography, but the gains are relatively modest: at most a quadratic speed-up on exhaustive key search. Experts suggest using slightly longer keys as a safeguard against this potential threat; there is no need to develop new algorithms for this purpose. It is recommended to use AES-256 for encryption operations and SHA-384 (from the SHA-2 family) for hashing operations.
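As a small illustration of these parameter choices (a sketch only, assuming the pyca/cryptography library for AES-GCM and Python's standard hashlib for SHA-384):

```python
# Symmetric-side recommendations in practice: AES-256 (here in GCM mode) for
# encryption and SHA-384 for hashing. Key handling and nonce management are
# simplified for brevity; assumes the pyca/cryptography library.
import hashlib
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)      # 256-bit key for extra margin
nonce = os.urandom(12)                         # must be unique per message
ciphertext = AESGCM(key).encrypt(nonce, b"sensitive payload", None)

digest = hashlib.sha384(b"document to hash").hexdigest()  # 384-bit output
```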
Want to know more?
Write to us to schedule a personalized session with one of our PQC experts!