
Classical vs Quantum-Safe Algorithms and Protocols / Application Approach


Various proposals are being explored for X509 V3 certificates in a Post Quantum Cryptography (PQC) world.

Currently, these include having separate certificates for classical and PQ algorithms, having a hybrid certificate that carries both classical and PQ material via X509 extensions, or composite certificates that concatenate as many signatures as needed into one blob. If I understand correctly, the first two approaches are an OR function, that is, you use either the classical or the post-quantum signature, but not both. The composite approach can use all the signatures, so users would have to validate every signature in the blob.

Is this understanding correct?

The question I have is what happens when you get to the protocol level. Crypto agility is defined as the ability of a security system to rapidly switch between algorithms, cryptographic primitives, and other encryption mechanisms without the rest of the system's infrastructure being significantly affected by these changes.

How is industry approaching TLS, for example? Is there a classical TLS and a PQC TLS? Are there simply two variants of the protocol, which would seem to violate crypto agility? If there are two, what is needed to switch between them in a real deployment?

Another approach is to have a hybrid TLS that allows selection of classical or PQC algorithms during negotiation. However, how many years would it take the standards body to create such a variant, given that it greatly increases complexity? But perhaps that is the point. Note that I am aware of the Open Quantum Safe (https://openquantumsafe.org/) projects, but they all appear to create a PQ-only variant. Also, at some point you would sunset the classical algorithms and support only the PQ ones. How would such a transition happen?

The same sorts of questions can be asked about SSH, OpenSSL, the GNU utilities, and any other protocol or application that must bridge classical and quantum-safe algorithms.

Are these things that simply have not been worked out yet, so that it will take years of working-group meetings among protocol and application developers to decide how this will be done technically, logistically, and so on?


Various proposals are being explored for X509 V3 certificates in a Post Quantum Cryptography (PQC) world.

Is this understanding correct?

In my view, there are essentially two ways to have both classical and postquantum with certificates:

  • Have a 'hybrid' (or composite, or whatever terminology you feel comfortable with) certificate that somehow contains both classical and postquantum signatures and public keys. There are several proposals for how to do this; they have technical differences, but those don't really matter at this high level. As for the AND/OR controversy, my opinion is: you always check all the signatures you know how to check, and reject the certificate if any of them fail (that is, AND); the only open question is what to do if you run into a public key whose type you don't understand (see the sketch after this list).

  • Alternatively, we can have two separate certificates, one purely classical and one purely postquantum; we'd modify the protocol using the certificates to ask for both, and use both public keys. The advantage of this is that postquantum certificates are likely to be considerably longer, so we would not be imposing that expense on someone who doesn't yet support postquantum.
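To make the "AND" policy from the first bullet concrete, here is a minimal sketch of validating a composite certificate. The SignatureEntry structure, the VERIFIERS registry, and the reject_unknown flag are hypothetical placeholders, not any real X.509 library API; only the policy logic is the point.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class SignatureEntry:
    algorithm: str      # e.g. "ecdsa-p256" or a PQ scheme identifier
    public_key: bytes
    signature: bytes

# Registry of verifiers this relying party actually implements.
# Each callable returns True iff the signature is valid over tbs_bytes.
VERIFIERS: dict[str, Callable[[bytes, bytes, bytes], bool]] = {}

def validate_composite(tbs_bytes: bytes, entries: list[SignatureEntry],
                       reject_unknown: bool = False) -> bool:
    """Check every signature we know how to check; fail if any known one fails.

    `reject_unknown` encodes the open policy question: what to do when we hit
    a public-key type we do not understand.
    """
    for entry in entries:
        verifier = VERIFIERS.get(entry.algorithm)
        if verifier is None:
            if reject_unknown:
                return False
            continue  # tolerate algorithms we cannot evaluate
        if not verifier(tbs_bytes, entry.public_key, entry.signature):
            return False  # AND semantics: one bad known signature rejects the cert
    return True
```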

I haven't seen any consensus as to which of these two options makes more sense; I would not be surprised if different usages of certificates end up taking different approaches.

The question I have is what happens when you get to the protocol level. Crypto agility is defined as the ability of a security system to rapidly switch between algorithms, cryptographic primitives, and other encryption mechanisms without the rest of the system's infrastructure being significantly affected by these changes.

How is industry considering TLS, for example?

I will assume that this part is talking about the privacy aspect, rather than the authentication.

As for TLS 1.3 (and there currently appears to be little reason to revise the earlier versions of TLS), it's mostly worked out: we'll add additional 'named groups' [1], which identify the key-exchange algorithm used. In addition to the existing ones (such as X25519), we would add one called 'NTRU' (which would specify the NTRU parameter set), as well as 'X25519+NTRU'. The latter would run both X25519 and NTRU in parallel; the key shares would be the X25519 and NTRU key shares concatenated, and the shared secret would be the X25519 and NTRU shared secrets concatenated (TLS 1.3 has a strong KDF, so concatenation works well). This is detailed in an IETF draft.
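A rough sketch of the concatenation idea, assuming `x25519_shared` and `ntru_shared` are the secrets produced by the two key exchanges. The HKDF-Extract step below merely stands in for the real TLS 1.3 key schedule; it is not the actual protocol code.

```python
import hashlib
import hmac

def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
    # HKDF-Extract (RFC 5869): HMAC over the input keying material.
    return hmac.new(salt, ikm, hashlib.sha256).digest()

def hybrid_secret(x25519_shared: bytes, ntru_shared: bytes,
                  salt: bytes = b"\x00" * 32) -> bytes:
    # Concatenate the classical and post-quantum shared secrets before
    # extraction; the derived key stays unpredictable as long as at least
    # one of the two components remains secret.
    return hkdf_extract(salt, x25519_shared + ntru_shared)
```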

This makes the upgrade path easy: clients would propose the named groups X25519+NTRU and X25519; servers that understand postquantum would accept the first, and servers that don't would fall back to the second. Then, when it comes time to discard classical, clients would start proposing NTRU only; the same upgrade path works there too.
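An illustrative sketch of that negotiation and fallback logic, assuming (as one plausible policy) that the server picks the first offered group it supports. The group names are placeholders; real TLS uses numeric NamedGroup code points.

```python
def select_group(client_offers: list[str], server_supported: set[str]) -> str | None:
    # Server picks the first offered group it supports, so a client that
    # lists the hybrid group first gets post-quantum protection whenever
    # the server understands it, and falls back to classical otherwise.
    for group in client_offers:
        if group in server_supported:
            return group
    return None  # no common group: the handshake fails

# During the transition:
print(select_group(["X25519+NTRU", "X25519"], {"X25519"}))                 # X25519
print(select_group(["X25519+NTRU", "X25519"], {"X25519+NTRU", "X25519"}))  # X25519+NTRU
# After sunsetting classical, clients offer only the PQ or hybrid groups:
print(select_group(["NTRU"], {"X25519"}))                                  # None
```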

I believe OpenSSH is going down a similar path (only with NTRU Prime); however, I haven't looked into the details to be certain.


[1]: 'named groups' is now a misnomer, as postquantum algorithms are not based on groups...
