

Post-Quantum Cryptography Standardization

Frequently Asked Questions

  1. The call for proposals briefly mentions hybrid modes that combine quantum-resistant cryptographic algorithms with existing cryptographic algorithms (which may not be quantum-resistant). Can these hybrid modes be FIPS-validated?
  2. What are NIST’s plans regarding stateful hash-based signatures?
  3. What is the rationale for the NIST decision to limit both the required reference implementation and the required optimized implementation to ANSI C source code? Are there any exceptions that allow for the use of other versions of C, C++, or assembly optimizations?
  4. Will NIST consider platforms other than the “NIST PQC Reference Platform” when evaluating submissions?
  5. In Sections 4.A.2 and 4.A.4, NIST’s CFP sets the number of decryption (resp. signature) queries that an attacker against a proposed encryption (resp. signature) scheme can make to at most 2^64. What is the rationale for not letting the adversary make essentially as many queries as the target security?
  6. What does NIST consider to be an acceptable rate of decryption/decapsulation failure?
  7. Is the NIST PQC Standardization Process a competition?
  8. Why does NIST’s CFP ask submitters to provide a classical security analysis, when the intent is to plan for a world with quantum computers?
  9. In section 4.A.5, it is stated that NIST will assume that its 5 security categories are correctly ordered (i.e., that a brute force collision attack on SHA256 (resp. SHA384) will be harder to perform than a brute force key search attack on AES192 (resp. AES256)). How realistic is this assumption?
  10. How can submitters who aren’t experts in quantum cryptanalysis set their parameters?
  11. What will happen to a submitted algorithm if some or all of the provided parameters fail to meet their claimed security strength categories?
  12. Which security strength categories will NIST consider for standardization?
  13. What are the “standard conversion techniques” NIST will use to convert between public-key encryption schemes and KEMs?
  14. NIST provided APIs and security definitions for Public Key encryption, KEM, and digital signature. Why are other functionalities not included?
  15. How does a submission obtain secure randomness?
  16. Can third party open-source code be used in submissions?
  17. How should submitters choose symmetric algorithms for their submissions?
  18. What are the definitions of the terms “inventor” and “owner” as used in the CFP?

A1: Assuming one of the components of the hybrid mode in question is a NIST-approved cryptographic primitive, such hybrid modes can be approved for use for key establishment or digital signatures. In particular, a hybrid mode for signatures consists of two signatures. The mode is valid if and only if both signatures are valid. FIPS 140 validation can only validate the part of the hybrid signature which is currently approved by NIST. Similarly, a hybrid key establishment scheme derives keying material from two or more secret values established by different key establishment primitives. Only the NIST-approved key establishment primitive can be validated according to FIPS 140. In any case, such validation only certifies that the NIST-approved portion is correctly implemented and used; it says nothing about the security of the quantum-resistant portion of the hybrid mode. Hybrid modes may be an initial step in the migration to post-quantum primitives. However, NIST continues to believe that the long-term solution to the threat of quantum computers is to provide standards for post-quantum public key cryptography, through the process outlined in our call for algorithms.
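As a structural sketch only, the hybrid signature rule above (valid if and only if both component signatures are valid) can be expressed as a conjunction of two verifiers. The types and function names here are illustrative and do not come from the CFP; the stand-in verifiers exist only to demonstrate the rule.

```c
#include <stdbool.h>
#include <stddef.h>

/* Illustrative verifier type: returns true iff the signature verifies.
 * Neither this type nor the names below are specified by the CFP. */
typedef bool (*verify_fn)(const unsigned char *msg, size_t msglen,
                          const unsigned char *sig, size_t siglen);

/* A hybrid signature is valid if and only if BOTH component
 * signatures (classical and post-quantum) are valid. */
bool hybrid_verify(verify_fn classical, verify_fn post_quantum,
                   const unsigned char *msg, size_t msglen,
                   const unsigned char *sig1, size_t sig1len,
                   const unsigned char *sig2, size_t sig2len)
{
    return classical(msg, msglen, sig1, sig1len)
        && post_quantum(msg, msglen, sig2, sig2len);
}

/* Toy stand-in verifiers, for demonstration only. */
static bool accept_all(const unsigned char *m, size_t ml,
                       const unsigned char *s, size_t sl)
{ (void)m; (void)ml; (void)s; (void)sl; return true; }

static bool reject_all(const unsigned char *m, size_t ml,
                       const unsigned char *s, size_t sl)
{ (void)m; (void)ml; (void)s; (void)sl; return false; }

/* Self-check: the hybrid is valid only when both components are. */
int hybrid_rule_holds(void)
{
    const unsigned char msg[] = "m", sig[] = "s";
    return  hybrid_verify(accept_all, accept_all, msg, 1, sig, 1, sig, 1)
        && !hybrid_verify(accept_all, reject_all, msg, 1, sig, 1, sig, 1)
        && !hybrid_verify(reject_all, accept_all, msg, 1, sig, 1, sig, 1);
}
```

Note that, per the answer above, a FIPS 140 validation would cover only the NIST-approved verifier in such a construction, not the combined mode.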

Back to Top

A2: NIST plans to coordinate with other standards organizations, such as the IETF, to develop standards for stateful hash-based signatures. As stateful hash-based signatures do not meet the API requested for signatures, this standardization effort will be a separate process from the one outlined in the call for proposals. It is expected that NIST will only approve a stateful hash-based signature standard for use in a limited range of signature applications, such as code signing, where most implementations will be able to securely deal with the requirement to keep state.

A3: NIST understands that real-world cryptographic algorithm implementations will necessarily contain platform-specific optimizations. The two required implementations in the submission package are primarily intended to facilitate future analysis and development throughout the evaluation period, and as such, we require that both be written in a cross-platform manner. Additionally, the two required implementations need not be distinct.  If a submitter does not see value in a separate cross-platform optimized implementation, they may simply note in their submission that the reference implementation is also the cross-platform optimized implementation.

Regarding the ANSI C requirement, submitters should note that the key requirements are that the submission code be written in a cross-platform manner and that the submission contain build scripts or instructions for version 6.4.0 of the GNU Compiler Collection (GCC). In particular, mandatory implementations written in C99 or C11 are perfectly fine, as long as any necessary compiler directives are included as part of the build script(s).

Additionally, implementations that use NTL (see Question and Answer 16 for details on the use of third-party open-source libraries) may be written in C++, since NTL is a C++ library. However, to ease porting to a pure C implementation by swapping NTL for C-based libraries, we ask that the original and new code in such a submission be as ANSI C-like as possible, using C++ functionality only where required in order to interact with NTL.

Submitters may not write their own new assembly code (including inline assembly) or compiler intrinsics for either the mandatory reference implementation or the mandatory optimized implementation, but they may use third-party open-source libraries that themselves rely on assembly optimizations, subject to the constraints described in Question and Answer 16.

During the course of the evaluation process, NIST will be looking at performance data for the best available implementations on a variety of platforms. As such, we strongly encourage submitters to include optimized versions for major platforms, particularly x64 and 32-bit and 64-bit ARM architectures. However, we have made such submissions optional so as not to discourage submissions from teams that may have very strong algorithmic candidates but little experience in the area of platform optimization. For further questions on platform-specific optimizations and the role they will play in NIST’s evaluation process, see Question and Answer 4.

A4: The reference platform was defined in order to provide a common and ubiquitous platform to verify the execution of the code provided in the submissions.  

The reference platform should be treated as a single core machine, but if an algorithm can make particular use of multiple cores or vector instructions, submitters are encouraged to provide additional implementations for these platforms.

In our evaluation process, NIST plans to include performance metrics from a variety of platforms, including: 64-bit “desktop/server class,” 32-bit “mobile class,” microcontrollers (32-, 16-, and where possible, 8-bit), as well as hardware platforms (e.g., FPGA). Submitters are strongly encouraged to provide additional implementations for these platforms, but to avoid discouraging submissions from teams with strong candidate algorithms but little experience in the area of platform-specific optimizations, NIST is making them optional as part of the submission itself. 

NIST expects that as the evaluation process moves beyond the first round, we will see the wider cryptographic community (in particular, those skilled in platform-specific optimizations) provide optimized implementations of most submissions for a wide variety of platforms, as was the case in the SHA-3 competition. NIST plans to use such third-party optimized implementations and third-party benchmarking tools such as eBACS/SUPERCOP and Open Quantum Safe as part of its evaluation process.

A5: Our reason for primarily considering attacks involving fewer than 2^64 decryption/signature queries is that the number of queries is controlled by the amount of work the honest party is willing to do, which one would expect to be significantly less than the amount of work an attacker is willing to do. Any attack involving more queries than this looks more like a denial-of-service attack than an impersonation or key recovery attack. Furthermore, effectively protecting against online attacks requiring more than 2^64 queries using NIST standards would require additional protections which are outside the scope of the present post-quantum standardization effort, most notably the development of a block cipher with a block size larger than 128 bits. This may be something NIST pursues in the future, but we do not feel it is necessary for addressing the imminent threat of quantum computers. That said, as noted in the proposed call for algorithms, NIST is open to considering attacks involving more queries, and would certainly prefer algorithms that do not fail catastrophically if the attacker exceeds 2^64 queries.

Back to Top

A6: NIST did not provide an explicit limit on the rate of decryption/decapsulation failure. In cases where a scheme is targeting chosen ciphertext security, decryption/decapsulation failures may pose a security threat. A failure rate sufficiently high to violate the claimed security of a scheme is, of course, unacceptable. If, on the other hand, there is a strong argument that decryption/decapsulation failures do not pose a security threat, then the decryption/decapsulation failure rate becomes simply one among many performance considerations. NIST does not wish, at this time, to prejudge what performance considerations are important, and will therefore leave it up to submitters to provide performance characteristics that they feel will be most useful for the applications they think best fit their schemes.

A7: This process shares many features with NIST competitions and is modeled after the successes we have had with competitions in the past. There are, however, some important requirements, driven by the current research climate, that constitute significant distinctions between this process and a competition.

First, our handling of the applicants does not coincide with a competition as specified in NISTIR 7977, nor does this process correspond to multiple parallel competitions. There will not be an appropriate or directly analogous concept of “winners” and “losers.” Our intention is to select a small number of options for more immediate standardization, as well as to eliminate some submissions as unsuitable. There will likely be some submissions that we do not select for standardization but also do not eliminate; these may be excellent options for a specific application that we are not ready, or do not currently have the resources, to standardize. In such a circumstance, we would communicate with the submitters to allow these to remain under a public license for study and practice and to remain under consideration for future standardization. There is no specification for the handling of such an applicant in a competition.
           
Second, the state of the science in the competitions of the past, i.e., the AES and SHA-3 competitions, was far more developed than it is for post-quantum cryptography. Though differences of opinion are inevitable, the selection of the past winners should not have been too surprising. The situation in post-quantum cryptography is less clear, and opinions on the required properties are less unanimous. In addition, some of NIST’s selection criteria, particularly regarding post-quantum security, may need further refinement in response to ongoing research.

In many respects, the PQC standardization process is less like a competition and more like an “analysis of alternatives.” The goal of the process is not primarily to pick a winner, but to document the strengths and weaknesses of the different options and to analyze the possible tradeoffs among them. In the end, even if there is not a final consensus on what constitutes the best option, NIST expects that it will be able to make some selections that most experts will agree are satisfactory.

Back to Top

A8: Classical cryptanalysis is still valuable for a number of reasons. First, classical computers are not going away. For algorithms not subject to dramatic quantum attacks, such as those involving Shor’s algorithm, NIST believes that classical measures of security will continue to be highly relevant. Currently envisioned quantum computing technologies would be orders of magnitude slower and more energy intensive than today’s classical computing technology when performing the same sorts of operations. In addition, practical attacks typically must be run in parallel on large clusters of machines, which diminishes the speedup that can be achieved using Grover’s algorithm. When all of these considerations are taken into account, it becomes quite likely that variants of Grover’s algorithm will provide no advantage to an adversary wishing to perform a cryptanalytic attack that can be completed in a matter of years, or even decades. As most quantum attacks on proposed post-quantum cryptosystems have involved some variant of Grover’s algorithm, it may be the case that the best attack in practice will simply be the classical attack.

Also, the science involved in assessing classical security is better developed than that for assessing post-quantum security, and there is a larger community of researchers who can contribute to these investigations, increasing our confidence in the security of the proposed cryptosystems. Finally, classical cryptanalysis can improve our understanding of the mathematical structures underlying these cryptosystems, which is also the basis for quantum cryptanalysis.

A9: Even assuming no disparity in the cost of quantum and classical gates, NIST estimates that the assumption holds as long as the adversary is depth-limited to fewer than about 2^87 logical quantum gates. This is quite near the limit of what NIST considers to be a plausible technology for the foreseeable future.

A10: Security strengths 1, 3, and 5 are defined in such a way that they are likely to be met by any scheme that:

      • Provides classical security strength of 128, 192, and 256 bits, respectively, AND
      • Is not subject to quantum attacks, other than classical attacks sped up by generic techniques (Grover’s algorithm, quantum walks, amplitude amplification, etc.)

Security strengths 1, 3, and 5 are unlikely to be met by any scheme with fewer than 128, 192, or 256 bits of classical security, respectively. This is not, however, an explicit requirement: at least for categories 3 and 5, NIST is open to classifying parameters with less classical security in these categories, given a sufficiently compelling argument demonstrating that:

      • In any plausible state of future technology where AES192 (resp. AES256) is near the limit of what can be broken by brute force, the most practical brute force attack against AES192 (resp. AES256) will not be the classical attack.
      • Given any such plausible state of future technology, the most practical attack against the parameters provided will be less practical than the most practical brute force attack against AES192 (resp. AES256).

Security strengths 2 and 4 are defined in such a way that they offer the maximum possible quantum security strength that can be offered by a scheme that only has a classical security strength of 128 or 192 bits, respectively. They will generally be easier to meet with parameter sets offering more classical security. A detailed quantum security analysis will be required to determine whether a parameter set meets these security strengths (unless the parameter set also meets the criteria for the next higher security strength). 

Back to Top

A11: NIST will not remove a scheme from consideration just because it was submitted with incorrectly analyzed parameters. Depending on how far off the estimate was, and how unanticipated the attack, NIST may take it as a sign that the algorithm is not mature enough, which could lead NIST to remove the scheme from consideration. However, assessments of an algorithm’s maturity will not be based primarily on security strength categories. Rather, the point of the categories is to compare like with like when doing performance comparisons and to make it easier to plan crypto transitions in the future. NIST will respond to attacks that contradict the claimed security strength category, but do not bring the maturity of the scheme into question, by moving the parameter set down to a lower category and potentially encouraging the submitter to provide a higher-security parameter set.

A12: For any scheme selected for standardization, NIST hopes to select parameters sets from those offered by the submitter. If the submitted parameter sets fail to meet NIST’s needs, for whatever reason, NIST hopes to work with the submitter to provide parameter sets that do meet NIST’s needs. NIST may also choose not to standardize some of the submitted parameter sets. NIST’s reasons for doing this could include insufficient security, unacceptable performance, and too many parameter sets.

NIST has numerous reasons for specifying a categorical post-quantum security hierarchy in the Call for Proposals. The primary purpose is to facilitate honest comparison of submissions achieving specific benchmark security levels. Because the science in this area is not yet fully developed, it is possible and appropriate for these benchmarks to be refined in response to future advances in theory. It is not NIST’s intent to review submissions unfairly based on an analysis of parameter sets that later turns out to have little impact.

It is, however, NIST’s present belief that all five of the security strength categories provide sufficient security to allow for standardization. More precisely, NIST would describe security strengths 4 and 5 as “likely excessive,” 2 and 3 as “probably secure for the foreseeable future,” and security strength 1 as “likely secure for the foreseeable future, unless quantum computers improve faster than is anticipated.” The only security considerations that are likely to lead NIST to decline to standardize a parameter set for a scheme NIST has selected are:

    1. NIST may assess the parameters as having insufficient security strength for any of the five categories;
    2. NIST may assess the parameters as having too little security margin to compensate for the expected uncertainty in attack complexity; and
    3. NIST may decide, based on technological developments during the evaluation process, that one or more of the security strength categories provides insufficient security. As each security category is defined to be at least as secure as an already standardized reference primitive, NIST would signal its uncertainty regarding the security of the category by announcing plans to deprecate or withdraw the reference primitive. For example, if NIST were to signal that parameters in category 1 may provide insufficient security, it would do so by announcing plans to deprecate or withdraw AES128. NIST has not done this and does not expect to do so during the evaluation process.

NIST may also decline to standardize parameters which have unacceptable performance. If NIST feels the higher security strength categories cannot be met with acceptable performance, NIST may encourage the submitter to provide parameters with intermediate security between security strengths 2 and 3, or between 3 and 4.

Finally, NIST may pare down the range of options offered by the submitter, regarding how to select parameters. Flexibility is generally a good thing, but it may be weighed against the complexity of implementing and testing for all available options.

Back to Top

A13: To convert a public-key encryption scheme to a KEM, NIST will construct the encapsulate function by generating a random symmetric key and encrypting it under the public key. The key generation and decapsulation functions of the KEM will be the same as the key generation and decryption functions of the original public-key encryption scheme. To convert a KEM to a public-key encryption scheme, NIST will construct the encryption function by appending to the KEM ciphertext an AES-GCM ciphertext of the plaintext message, computed with a randomly generated IV; the AES key will be the symmetric key output by the encapsulate function. (The key generation function will be identical to that of the original KEM, and the decryption function will be constructed by decapsulation followed by AES decryption.)
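The PKE-to-KEM direction can be sketched as follows. This is a wiring diagram only: the XOR "encryption" is an insecure toy stand-in for a submission's real public-key encryption functions, rand() stands in for the randombytes() function discussed in A15, and all names and lengths are illustrative.

```c
#include <stdlib.h>
#include <string.h>

#define SS_BYTES 16  /* illustrative shared-secret length */

/* Toy, insecure stand-in PKE (XOR with the key), used only to show the
 * structure of the conversion; a real submission would substitute its
 * own public-key encryption and decryption here. */
static void toy_pke_encrypt(unsigned char *ct, const unsigned char *pt,
                            const unsigned char *key)
{
    for (int i = 0; i < SS_BYTES; i++) ct[i] = pt[i] ^ key[i];
}

static void toy_pke_decrypt(unsigned char *pt, const unsigned char *ct,
                            const unsigned char *key)
{
    for (int i = 0; i < SS_BYTES; i++) pt[i] = ct[i] ^ key[i];
}

/* Encapsulate: generate a random session key and encrypt it under pk. */
void kem_encapsulate(unsigned char *ct, unsigned char *ss,
                     const unsigned char *pk)
{
    for (int i = 0; i < SS_BYTES; i++)
        ss[i] = (unsigned char)(rand() & 0xff);  /* stand-in for randombytes() */
    toy_pke_encrypt(ct, ss, pk);
}

/* Decapsulate: decrypt the ciphertext to recover the session key. */
void kem_decapsulate(unsigned char *ss, const unsigned char *ct,
                     const unsigned char *sk)
{
    toy_pke_decrypt(ss, ct, sk);
}

/* Round trip: both sides must derive the same shared secret. */
int kem_roundtrip_ok(void)
{
    unsigned char pk[SS_BYTES], sk[SS_BYTES], ct[SS_BYTES];
    unsigned char ss_enc[SS_BYTES], ss_dec[SS_BYTES];

    memset(pk, 0x5a, SS_BYTES);   /* toy key pair: pk == sk */
    memcpy(sk, pk, SS_BYTES);
    kem_encapsulate(ct, ss_enc, pk);
    kem_decapsulate(ss_dec, ct, sk);
    return memcmp(ss_enc, ss_dec, SS_BYTES) == 0;
}
```

The reverse (KEM-to-PKE) direction follows the same pattern but additionally requires an AEAD (AES-GCM) keyed by the encapsulated secret, so it is omitted from this sketch.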

Back to Top

A14: NIST is looking primarily to replace quantum-vulnerable schemes with functionalities that are widely used, have widely agreed upon security and correctness definitions in academic literature, and for which there appear to be a range of promising approaches for designing a post-quantum replacement. NIST considered a number of other functionalities, but did not provide explicit support for them, since it did not feel they met the above criteria as well as encryption, KEM, and signature. In many cases, NIST expects that schemes providing some of these functionalities may be submitted as a special case or an extension of one of the functionalities we explicitly asked for. In such a case, any additional functionality would be considered an advantage as noted in section 4.C.1 of our Call for Proposals. Two particular functionalities NIST considered were authenticated key exchange (AKE) and a drop-in replacement for Diffie-Hellman.

Diffie-Hellman is an extremely widely used primitive, and has a number of potentially useful special features, such as asynchronous key exchange, and secure key use profiles ranging from static-static to ephemeral-ephemeral. However, NIST believes that in its most widely used applications, such as those requiring forward secrecy, Diffie-Hellman can be replaced by any secure KEM with an efficient key generation algorithm. The additional features of Diffie-Hellman may be useful in some applications, but there is no widely accepted security definition of which NIST is aware that captures everything one might want from a Diffie-Hellman replacement. Additionally, some plausibly important security properties of Diffie-Hellman, such as a secure, static-static key exchange, appear difficult to meet in the post-quantum setting. NIST therefore recommends that schemes sharing some or all of the desirable features of Diffie-Hellman be submitted as KEMs, while documenting any additional functionality.

AKE is also a widely used functionality. However, NIST would consider it a protocol rather than a scheme. This is an important distinction, because most widely used AKE protocols are constructed by combining simpler primitives, like digital signature, public key encryption, and KEM schemes. NIST wants to leave open the possibility that standards for these schemes may come from different submitters. Additionally, the security definitions for AKE are significantly more complicated and contentious than those for the functionalities NIST is explicitly asking for in its call for proposals. NIST recognizes that there are some AKE functionalities, in particular implicitly authenticated key exchange (IAKE), that cannot easily be constructed from simpler components. While it is less natural to treat IAKE schemes as an extension of the KEM framework than it is for Diffie-Hellman-like primitives, NIST does believe that it can be done in most cases. For example, a significant part of the functionality of a 2-message IAKE protocol could be demonstrated by treating the initiator’s public authentication key as part of a KEM public key, and the responder’s public authentication key as part of the KEM ciphertext.

Back to Top

A15: The function randombytes() will be available to submitters. This function comes from the SUPERCOP test environment and should be used to generate seed values for an algorithm.

For functional and timing tests, a deterministic generator is used inside randombytes() to produce the seed values. For security testing, simply substitute calls to a true hardware RBG inside randombytes().

The function prototype for randombytes() is:

// The xlen parameter is in bytes
void randombytes(unsigned char *x,unsigned long long xlen)
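To illustrate the deterministic mode described above, a minimal stand-in for randombytes() might look like the following. This counter-based filler is purely illustrative and is not the generator NIST's test harness actually uses; it exists only to show how a deterministic body can sit behind the prototype.

```c
/* Illustrative deterministic stand-in for randombytes(): fills the
 * buffer from a monotonically increasing byte counter. Suitable only
 * for functional tests; for security testing, the body would instead
 * call a true hardware RBG, as noted above. */
static unsigned long long drbg_counter = 0;

void randombytes(unsigned char *x, unsigned long long xlen)
{
    for (unsigned long long i = 0; i < xlen; i++)
        x[i] = (unsigned char)(drbg_counter++ & 0xff);
}

/* Self-check: successive calls continue the deterministic stream. */
int randombytes_stub_ok(void)
{
    unsigned char a[4], b[2];
    drbg_counter = 0;
    randombytes(a, 4);
    randombytes(b, 2);
    return a[0] == 0 && a[3] == 3 && b[0] == 4 && b[1] == 5;
}
```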

The following demonstrates the use of the KAT and non-KAT versions of the functions to generate a key pair for encryption:

int crypto_encrypt_keypair_KAT(
              unsigned char *pk,
              unsigned char *sk,
              const unsigned char *randomness
         )

int crypto_encrypt_keypair(unsigned char *pk, unsigned char *sk)
{
      unsigned char seed[CRYPTO_RANDOMBYTES];

      /* Generate a fresh seed, then derive the key pair from it
         deterministically via the KAT version. */
      randombytes(seed, CRYPTO_RANDOMBYTES);
      return crypto_encrypt_keypair_KAT(pk, sk, seed);
}

A16: In both the mandatory reference implementation and the mandatory optimized implementation, submissions may use NTL Version 10.5.0 (http://www.shoup.net/ntl/download.html), GMP Version 6.1.2 (https://gmplib.org), the Keccak code package (https://github.com/gvanas/KeccakCodePackage), and OpenSSL Version 1.1.0f (https://www.openssl.org/source). Submitters may assume that these libraries are installed on the reference platform and do not need to provide them along with their submissions.

If a submitter wishes to use a third-party open source library other than the ones specified above, they must send a request to NIST at pqc-comments@nist.gov by September 1, 2017, with the name of the library and a link to the primary website hosting it from which it may be downloaded. NIST will either approve or deny this request within 2 weeks of receiving it. Should a request be approved, the library will be added to the above list of acceptable third-party open source libraries provided in this FAQ.

All submission packages using third-party open source code should contain build scripts which will allow for seamless “one-stop” building of the submissions.

For example, on a Linux platform, it should require no more work to build the submission than running the standard

> ./configure [--options]
> make
> make install

succession of commands. In particular, the build process should be able to find the versions of these libraries specified above that will be pre-installed on the reference platform.

Separate build scripts should be included for the reference Windows platform and reference Linux platform that work using the GNU Compiler Collection version 6.4.0 and related tools as well as any platform-specific commands required. 

In addition, as part of the written submission, the submitter shall describe in their own words the functionalities provided by any algorithms from third-party open-source libraries that are used in the implementations. 

Back to Top

A17: While NIST will permit submitters to choose any NIST-approved cryptographic algorithm for their submission if they feel it is necessary to achieve the desired security and performance, a number of potential submitters have asked us to offer default options for common symmetric cryptographic primitives. As such, here are our suggestions:

  1. Hash functions: SHA512 is likely sufficient to meet the requirements of any of our five security strength categories and gives good performance in software, especially on 64-bit architectures. Submitters seeking a variable-length output, good performance in hardware, or support for multiple input strings may instead prefer TupleHash256 (specified in SP 800-185).
  2. XOFs: We would recommend SHAKE256.
  3. Authenticated encryption: We would suggest AES256-GCM with a random IV.
  4. PRFs: Where security proofs can accommodate something that is not indifferentiable from a random oracle, an AES-based seed expander will offer excellent performance. Otherwise, KMAC256 (specified in SP 800-185) will be a good choice.

Also recall, from the CFP: "If the scheme uses a cryptographic primitive that has not been approved by NIST, the submitter shall provide an explanation for why a NIST-approved primitive would not be suitable."

A18: An inventor is whoever conceived of the algorithm described and implemented in the submission.  If more than one person conceived of the algorithm, the algorithm will have been invented by co-inventors. 

An owner of the algorithm is the inventor unless and until the inventor assigns (i.e., transfers ownership) the algorithm to another.  In the case of co-inventors, the co-inventors jointly own the algorithm, and each co-inventor may assign their individual ownership interest in the algorithm.  The algorithm may be claimed in a patent or a patent application, and the patent or patent application can be assigned to transfer ownership of the patent or patent application to an assignee.

The implementation of the algorithm may be subject to copyright, and the owner of the copyright in the implementation of the algorithm is initially the author or authors of the implementation.  Authors of a joint work (i.e., an implementation made by more than one author) are co-owners of copyright in the implementation.  The employer or other person for whom the implementation was prepared may own all rights comprised in the copyright of the implementation.

Each submission must include signed statements by the submitter(s), patent (and patent application) owner(s), as well as the reference/optimized implementations’ owner(s).  If an algorithm or implementation is put into the public domain, we still require the signed statements from the submitter(s) and owner(s) exactly as specified in Section 2.D of the Call for Proposals.


Back to Top